Why do we use ReLU? As I understand it, it's used because it requires less computational power: it simply maps negative inputs to zero and keeps positive inputs unchanged. Other activation functions, like Sigmoid, squash values into the range between 0 and 1, which makes them suitable for binary classification. So what is ReLU suitable for?
I just want a basic understanding of activation functions.
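For concreteness, here is a minimal NumPy sketch (my own illustration, not taken from any particular library) showing numerically what the two functions do to the same inputs:

```python
import numpy as np

def relu(x):
    # ReLU: negative inputs become 0, positive inputs pass through unchanged
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))     # [0.  0.  0.  0.5 2. ]
print(sigmoid(x))  # roughly [0.12 0.38 0.5  0.62 0.88], all strictly between 0 and 1
```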