Leaky ReLU is an activation function that keeps a small non-zero output and gradient for negative inputs, which reduces the amount of "zero flow" in both the forward and backward pass. Still, it seems fairly uncommon to see Leaky ReLU in published models. In which models is it actually used?
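To be clear about what I mean, here is a minimal sketch of the function (using PyTorch only as an illustration; the slope of 0.01 is just that library's default):

```python
import torch
import torch.nn as nn

# Leaky ReLU: f(x) = x for x > 0, f(x) = negative_slope * x otherwise.
# Unlike plain ReLU, negative inputs get a small non-zero output and a
# non-zero gradient (negative_slope) instead of being clamped to 0.
leaky = nn.LeakyReLU(negative_slope=0.01)

x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)
y = leaky(x)
y.sum().backward()

print(y)       # negative inputs are scaled by 0.01 rather than zeroed out
print(x.grad)  # gradient is 0.01 for negative inputs, 1.0 for positive ones
```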
I tried searching the web and asking ChatGPT but didn't get a well-sourced answer. I am looking for a list of models that use it.