
Degradation after scaling of coded images

I’m working on predicting the next image in a sequence from past images. My raw data has shape (50, 101, 128, 128, 3) — 50 sequences of 101 frames of 128×128 RGB images. To prepare them for training an autoencoder with CNN layers, I reshape them from (50, 101, 128, 128, 3) to (5050, 128, 128, 3), normalize with X/255 − 0.5, then split into training and validation sets: X_train has shape (4545, 128, 128, 3) and X_test has shape (505, 128, 128, 3). The latent vector has 250 dimensions.
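For reference, the reshape/normalize/split steps described above can be sketched like this (the array `X` is an illustrative stand-in for my data; random values are used only to make the snippet self-contained):

```python
import numpy as np

# Illustrative stand-in for the data: 50 sequences of 101 frames,
# each a 128x128 RGB image with uint8 pixel values.
X = np.random.randint(0, 256, size=(50, 101, 128, 128, 3), dtype=np.uint8)

# Collapse the sequence dimensions:
# (50, 101, 128, 128, 3) -> (5050, 128, 128, 3)
X = X.reshape(-1, 128, 128, 3)

# Normalize pixel values to roughly [-0.5, 0.5]
X = X.astype(np.float32) / 255.0 - 0.5

# 90/10 train/validation split: 4545 / 505 samples
X_train, X_test = X[:4545], X[4545:]
print(X_train.shape, X_test.shape)  # (4545, 128, 128, 3) (505, 128, 128, 3)
```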
Training the autoencoder goes smoothly, and the reconstructed images are very faithful to the originals. After training, I encode my images in preparation for training the LSTM model, but after scaling the encoded images and applying `inverse_transform`, I re-display the decoded images and see that they are degraded. I’ve tried varying the normalization method, without success. I’ve also used the NDStandardScaler from this link: How to standard scale a 3D matrix?
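A minimal sketch of the latent-vector scaling step, assuming a scikit-learn `StandardScaler` fitted on (n_samples, 250) encodings (`Z` is an illustrative stand-in for the encoded images). The round trip `scaler.inverse_transform(scaler.transform(Z))` should recover the encodings up to floating-point precision, so this isolates whether the degradation comes from the scaler itself or from some other part of the pipeline:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Illustrative stand-in for the encoded images:
# 5050 latent vectors of dimension 250.
Z = np.random.randn(5050, 250).astype(np.float32)

# Fit a single scaler on the encodings and reuse the same
# fitted instance for both transform and inverse_transform.
scaler = StandardScaler()
Z_scaled = scaler.fit_transform(Z)

# Round trip: inverse_transform should recover the encodings
# almost exactly (differences only at float precision).
Z_back = scaler.inverse_transform(Z_scaled)
print(np.abs(Z - Z_back).max())
```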