I'm trying to calculate the negative log-likelihood for evaluation. 'pred' is already a probability vector, and for some reason I'm trying to get log(pred) with:
log_probs = F.log_softmax(pred)
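(For context, the NLL I eventually compute from log_probs looks roughly like the hypothetical sketch below; the probabilities and 'target' are made up, only the shape of the computation matters.)

import torch

# hypothetical stand-ins for my real tensors
log_probs = torch.log(torch.tensor([[0.1, 0.7, 0.2]]))  # log of a probability vector
target = torch.tensor([1])                              # true class index
# NLL: negative log-probability of the true class, averaged over the batch
nll = -log_probs[torch.arange(log_probs.size(0)), target].mean()
print(nll)  # tensor(0.3567), i.e. -log(0.7)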
But the numerical result is very weird. When I print both tensors with
print(f'what does pred look like: {pred}')
print(f'what does log_probs look like: {log_probs}')
the output is:
what does pred look like: tensor([[3.5614e-05, 1.0131e-08, 5.1123e-10, 5.1686e-10, 2.9131e-09, 8.3620e-11,
         3.1788e-09, 4.6923e-10, 9.9996e-01, 3.9455e-08]], device='cuda:0')
what does log_probs look like: tensor([[-2.4611, -2.4611, -2.4611, -2.4611, -2.4611, -2.4611, -2.4611, -2.4611,
         -1.4612, -2.4611]], device='cuda:0')
This is obviously wrong: the entry corresponding to 9.9996e-01 should be close to 0 (it's the log of a probability near 1), yet it comes out as -1.4612. Where did I go wrong?
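For reference, the same numbers reproduce on CPU with the values from the printout above, so it isn't a CUDA issue (dim is passed explicitly here just to silence the deprecation warning):

import torch
import torch.nn.functional as F

# pred copied from the printout above; it already sums to ~1
pred = torch.tensor([[3.5614e-05, 1.0131e-08, 5.1123e-10, 5.1686e-10,
                      2.9131e-09, 8.3620e-11, 3.1788e-09, 4.6923e-10,
                      9.9996e-01, 3.9455e-08]])

log_probs = F.log_softmax(pred, dim=-1)
print(log_probs)
# tensor([[-2.4611, -2.4611, -2.4611, -2.4611, -2.4611, -2.4611, -2.4611,
#          -2.4611, -1.4612, -2.4611]])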