
Tag Archive for: ml, activation-function, relu

Have questions about ReLU?

Why do we use ReLU? As I understand it, it's used because it requires less computational power: it simply sets negative inputs to zero and passes positive inputs through unchanged. Other activation functions like Sigmoid squash values into the range between 0 and 1, which makes them suitable for binary classification. So what is ReLU suitable for?
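To make the comparison concrete, here is a minimal sketch (assuming NumPy is available) showing how the two functions behave on the same sample inputs; the values are just illustrative:

```python
import numpy as np

def relu(x):
    # ReLU: negative inputs become 0, positive inputs pass through unchanged
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes any input into the open interval (0, 1)
    return 1 / (1 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))     # [0.   0.   0.   0.5  2.  ]
print(sigmoid(x))  # roughly [0.12 0.38 0.5  0.62 0.88]
```

In this sketch, ReLU is typically used in the hidden layers of deep networks (it is cheap to compute and keeps gradients from vanishing for positive inputs), while Sigmoid is more common at the output layer when a probability-like value is needed.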