Contrastive loss doesn't work and total loss increases
I am trying to use a Mean-Teacher architecture with two U-Nets in a semi-supervised setup, but I see strange behaviour in the loss during training. I have a supervised loss and a contrastive loss computed on unlabeled data; I sum the two to obtain a total loss for backpropagation.
(attached plot: example_of_increasing_loss)
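For reference, here is a minimal sketch of the training step described above. This is not the actual code: the models, shapes, and the contrastive term (a simple cosine consistency between student and teacher outputs on unlabeled data) are my own assumptions, but the overall structure (supervised loss + unsupervised loss summed, backprop through the student only, EMA update of the teacher) follows the standard Mean-Teacher recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for the two U-Nets (same architecture for both).
student = nn.Conv2d(1, 2, 3, padding=1)
teacher = nn.Conv2d(1, 2, 3, padding=1)
teacher.load_state_dict(student.state_dict())
for p in teacher.parameters():
    p.requires_grad_(False)  # teacher is updated by EMA, not by gradients

opt = torch.optim.Adam(student.parameters(), lr=1e-3)

def ema_update(student, teacher, alpha=0.99):
    # teacher <- alpha * teacher + (1 - alpha) * student
    with torch.no_grad():
        for ps, pt in zip(student.parameters(), teacher.parameters()):
            pt.mul_(alpha).add_(ps, alpha=1 - alpha)

# One training step with dummy data.
x_lab = torch.randn(4, 1, 8, 8)         # labeled batch
y_lab = torch.randint(0, 2, (4, 8, 8))  # segmentation targets
x_unl = torch.randn(4, 1, 8, 8)         # unlabeled batch

# Supervised loss on labeled data (student predictions vs. targets).
sup_loss = F.cross_entropy(student(x_lab), y_lab)

# Simplified unsupervised term on unlabeled data: pull student and
# teacher outputs together via cosine distance (a stand-in for the
# actual contrastive loss).
zs = student(x_unl).flatten(1)
zt = teacher(x_unl).flatten(1)
con_loss = 1 - F.cosine_similarity(zs, zt, dim=1).mean()

# Sum the two losses and backpropagate through the student only.
total_loss = sup_loss + con_loss
opt.zero_grad()
total_loss.backward()
opt.step()
ema_update(student, teacher)
```

If your setup matches this shape, two things worth checking when the total loss increases are whether gradients are accidentally flowing into the teacher, and whether the contrastive term is weighted too heavily relative to the supervised one (many implementations ramp its weight up over the first epochs).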