PyTorch gives non-reproducible results after adding an activation function (i.e., ReLU)
I am unable to reproduce my results in PyTorch after adding an nn.ReLU(). I am sure the problem is here rather than anywhere else, since I have run hundreds of ablation tests. It is strange that an activation function affects reproducibility at all. Even when I replace the nn.ReLU() with torch.max(), the results are still not reproducible.
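
For context, here is a minimal sketch of the kind of setup I am describing (the seeding recipe and the toy model are illustrative, not my actual code). In this toy version the two seeded runs match exactly; in my real training loop, the analogous check fails as soon as the ReLU is added.

```python
import random

import numpy as np
import torch
import torch.nn as nn


def set_seed(seed: int = 0) -> None:
    """Seed every RNG that a typical PyTorch training run touches."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


def run_once(use_relu: bool) -> torch.Tensor:
    """Build a tiny model from a fixed seed and do one forward/backward pass."""
    set_seed(0)
    layers = [nn.Linear(8, 8)]
    if use_relu:
        layers.append(nn.ReLU())  # the only difference between the two configurations
    layers.append(nn.Linear(8, 1))
    model = nn.Sequential(*layers)

    x = torch.randn(4, 8)
    loss = model(x).sum()
    loss.backward()
    return loss.detach()


# Compare two seeded runs of the same configuration; this toy example
# prints True, whereas the equivalent check in my real setup returns
# False once the ReLU is present.
print(torch.equal(run_once(use_relu=True), run_once(use_relu=True)))
```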