Cross-entropy loss is over 100, which seems crazy
I am trying to train sequential (LSTM and GRU) models to predict the chemical structures of drugs as part of a drug discovery process, but I am getting a very large cross-entropy loss (>100). The input to the model is the SMILES string of a chemical structure, encoded as a one-hot array.
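For context, here is a minimal sketch of the kind of pipeline I mean. This is not my exact code: it assumes a PyTorch character-level model, and the vocabulary size, layer sizes, and variable names are placeholders just to show the shapes involved.

```python
import torch
import torch.nn as nn

# Illustrative sizes -- placeholders, not my real vocabulary or model dimensions.
vocab_size = 40      # number of distinct SMILES characters (plus special tokens)
hidden_size = 128
seq_len = 60
batch_size = 32

class SmilesLSTM(nn.Module):
    """Character-level LSTM that predicts the next SMILES character."""
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(vocab_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x):          # x: (batch, seq_len, vocab_size) one-hot
        out, _ = self.lstm(x)      # (batch, seq_len, hidden_size)
        return self.fc(out)        # (batch, seq_len, vocab_size) logits

model = SmilesLSTM(vocab_size, hidden_size)
criterion = nn.CrossEntropyLoss()  # averages the loss over all characters by default

# Random integer targets and their one-hot encodings, only to demonstrate shapes.
targets = torch.randint(vocab_size, (batch_size, seq_len))
inputs = nn.functional.one_hot(targets, vocab_size).float()

logits = model(inputs)
loss = criterion(logits.reshape(-1, vocab_size), targets.reshape(-1))
print(loss.item())
```

With a SMILES alphabet of roughly 40 characters, I would expect the mean per-character cross-entropy of an untrained model to sit near ln(40) ≈ 3.7 and then decrease, which is why a value over 100 looks so suspicious to me.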