Why does my GPU memory keep increasing when I sweep over model parameters?
I am sweeping over dropout rates for a fixed architecture and measuring the model's classification error rate at each setting. As the sweep runs, GPU memory usage steadily increases, and I have not been able to stop this from happening (see code below for details):
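The framework is not named in the question, but a common cause of this symptom in PyTorch is keeping references to tensors that still carry autograd history across loop iterations, so the computation graphs (and their GPU buffers) are never freed. A minimal sketch of a leak-free sweep, assuming PyTorch with an illustrative toy architecture (all names here are hypothetical, not from the question):

```python
import torch
import torch.nn as nn

def make_model(p):
    # Hypothetical toy architecture; the dropout rate p is the swept parameter.
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Dropout(p), nn.Linear(32, 2))

# Dummy evaluation data (stand-in for a real validation set).
x = torch.randn(64, 10)
y = torch.randint(0, 2, (64,))

errors = []
for p in [0.0, 0.25, 0.5]:
    model = make_model(p)
    model.eval()                     # note: nn.Dropout is a no-op in eval mode,
                                     # so the rate only matters during training
    with torch.no_grad():            # evaluation: build no autograd graph at all
        preds = model(x).argmax(dim=1)
        err = (preds != y).float().mean().item()  # .item() -> plain float,
                                                  # no tensor reference retained
    errors.append(err)
    del model                        # drop the last reference so memory can be freed
    # torch.cuda.empty_cache()       # on GPU: return cached blocks to the driver
                                     # so other processes (and nvidia-smi) see them

print(errors)
```

The usual culprits are appending the raw loss or prediction tensors to a list (each one pins its entire graph) instead of `.item()` / `.detach()`, and evaluating without `torch.no_grad()`. Note also that `torch.cuda.empty_cache()` does not fix a true leak; it only releases memory PyTorch has already cached.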