Tags: python, tensorflow, colors, neural-network

Why am I getting massive losses in this color transformation model I’m building?

I have a dataset of 200k+ color patches captured on two different mediums that I’m building a color transformation for. Initially I did a direct RGB-to-RGB input-output mapping in the neural network. This works decently well, but I wanted to perform the match in a luminance-chrominance space to potentially better preserve luminance and color contrast relationships. I first tried CIELAB and YCbCr, but transforming the dataset into either space is ultimately inaccurate: the data represents HDR scene data in a logarithmic container, and neither space is built for HDR scene representation.

So I’m attempting to use Dolby’s ICtCp space, which is built from unbounded scene-linear information. I performed the transformation into the space and confirmed the output values and array structure to be correct. However, upon feeding the variables into the network, it immediately starts giving astronomical losses before flipping over to inf or NaN loss. I can’t figure out what the issue is.
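A common first debugging step in this situation (a sketch, not the code from the question — the array names are hypothetical) is to audit the converted dataset for non-finite values and to compare per-channel ranges. ICtCp's Ct and Cp channels sit on very different scales than the I channel or than raw RGB, and a stray NaN or a wildly scaled channel in the inputs will push a network's loss to inf/NaN within the first few steps:

```python
import numpy as np

# Hypothetical stand-ins for the (N, 3) ICtCp patch arrays
# for the two mediums; replace with the real converted data.
rng = np.random.default_rng(0)
src = rng.random((1000, 3)).astype(np.float32)
tgt = rng.random((1000, 3)).astype(np.float32)

def audit(name, x):
    """Report non-finite entries and per-channel min/max."""
    n_bad = x.size - np.isfinite(x).sum()
    print(f"{name}: non-finite={n_bad}, "
          f"min={np.nanmin(x, axis=0)}, max={np.nanmax(x, axis=0)}")

audit("src", src)
audit("tgt", tgt)

# Standardize each channel so I, Ct, and Cp contribute on
# comparable scales before training; keep mean/std to invert later.
mean, std = src.mean(axis=0), src.std(axis=0) + 1e-8
src_n = (src - mean) / std
```

If the audit reports non-finite values, the conversion itself is the culprit (e.g. a log or power function applied to negative scene-linear values); if the ranges are merely mismatched, per-channel standardization like the last lines often tames the exploding loss.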