I would like to implement forward and backward propagation for Leaky ReLU in NumPy, using a “mask” and an “alpha” attribute, and would appreciate advice on the “out” and “dout” lines below.
For a plain ReLU, the corresponding lines were “out[self.mask] = 0” in the forward pass and “dout[self.mask] = 0” in the backward pass, and that worked (minimal sketch below).
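For reference, here is a minimal sketch of the plain ReLU version I mean; the class name and structure just mirror my Leaky ReLU attempt further down:

    import numpy as np

    class ReLU():
        def __init__(self):
            self.mask = None

        def __call__(self, x):
            self.mask = x <= 0   # remember where the input was non-positive
            out = x.copy()
            out[self.mask] = 0   # forward: zero those entries
            return out

        def backward(self, dout):
            dout[self.mask] = 0  # backward: gradient is zero there too
            dx = dout
            return dx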
I assumed the same logic would carry over, but the code below is clearly not correct. Any advice would be appreciated. Thank you.
class Leaky_ReLU():
    def __init__(self, alpha=0.01):
        self.mask = None
        self.alpha = alpha

    def __call__(self, x):
        self.mask = x <= 0
        out = x.copy()
        out[self.mask] = self.alpha   # the “out” line in question
        return out

    def backward(self, dout):
        dout[self.mask] = self.alpha  # the “dout” line in question
        dx = dout
        return dx
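For concreteness, here is a small check of what the code above does, assuming the usual definition f(x) = x for x > 0 and f(x) = alpha*x otherwise (so the local gradient is alpha for x <= 0); the sample values are just illustrative:

    import numpy as np

    x = np.array([-2.0, 3.0])
    layer = Leaky_ReLU(alpha=0.01)

    out = layer(x)
    # out is [0.01, 3.0], but from f(x) = alpha*x I would expect [-0.02, 3.0]
    print(out)

    dx = layer.backward(np.array([2.0, 2.0]))
    # dx is [0.01, 2.0], but the chain rule (alpha * upstream) would give [0.02, 2.0]
    print(dx)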