I am struggling with an uncommon result (for me) that I cannot understand.
I have trained a Conditional GAN. The metrics I monitor during training are the losses of D and G (binary cross-entropy) and the Fidelity (the degree of similarity between two matrices), which is good when close to 1 and bad when close to 0.
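To make the Fidelity metric concrete, here is a minimal sketch. This is an assumption on my part: I am treating Fidelity as a normalized similarity (cosine similarity of the flattened matrices), since any such measure is 1 for identical matrices and near 0 for unrelated ones; the actual metric in my code may differ.

```python
import numpy as np

def fidelity(a, b):
    # Hypothetical stand-in for the Fidelity metric: cosine similarity of
    # the flattened matrices. 1 = identical up to scale, ~0 = unrelated.
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

target = np.array([[1.0, 0.5], [0.5, 1.0]])
# A generated sample that is a near-copy of the target plus small noise:
generated = target + 0.01 * np.random.default_rng(0).standard_normal(target.shape)
print(fidelity(generated, target))  # very close to 1 for a near-copy
```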
Fidelity is continuously rising, which means that the matrices generated by the network are getting closer and closer to my target:
But the losses:
Why is that? Is something wrong or is this acceptable?
My Generator produced a matrix that is 99.99% close to the target matrix, yet the Discriminator is perfectly discriminating between the generated one and the real one?
Look at the generated sample; it is almost a Ctrl+C, Ctrl+V copy of the original:
Doesn't the generator loss going up and up while the discriminator loss approaches zero mean convergence failure? From my understanding, this is exactly the signature of convergence failure, but that should also mean the Generator is producing garbage. Yet that is not the case here; I have:
- a big generator loss
- a very low discriminator loss
- a very good generated sample
Am I not getting something?
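To show what confuses me with concrete numbers, here is a small sketch (assuming the standard BCE formulation, where G is trained to push D's score on fakes toward 1). It shows that once D is very confident, the D loss is near zero and the G loss is large, no matter how visually close the sample is to the target:

```python
import math

def bce(p, label):
    # Binary cross-entropy for a single prediction p against label (1=real, 0=fake).
    eps = 1e-12  # numerical guard against log(0)
    return -(label * math.log(p + eps) + (1 - label) * math.log(1 - p + eps))

# Suppose the discriminator is very confident:
d_real = 0.999  # D's score on a real matrix
d_fake = 0.001  # D's score on a generated matrix

d_loss = 0.5 * (bce(d_real, 1) + bce(d_fake, 0))  # near zero
g_loss = bce(d_fake, 1)  # G wants D(fake) -> 1, so this is large

print(f"D loss: {d_loss:.4f}")  # ~0.0010
print(f"G loss: {g_loss:.4f}")  # ~6.9078
```

So the loss curves only reflect D's confidence, not the visual quality of the sample, which is exactly the combination I am seeing.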