import torch

# Note: requires_grad=True only works for floating-point tensors,
# so xtrain[0] and t[0] must be floats.
x = torch.tensor(xtrain[0], requires_grad=True)
y = torch.tensor(t[0], requires_grad=True)
print('x =', x)
print('y =', y)
z = make_prediction(x, y)
print('z =', z)
z.backward()
print(x.grad, y.grad)
In the code above, z is a scalar, and x and y (taken from xtrain[0] and t[0]) happen to hold the same number. All of them are scalar tensors.
How does backward() calculate the derivatives?
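For context on what backward() does under the hood: PyTorch records every operation applied to a tensor with requires_grad=True in a computation graph, and backward() walks that graph in reverse, multiplying local derivatives by the chain rule (reverse-mode automatic differentiation). Here is a minimal, self-contained sketch of that idea, not PyTorch's actual implementation; the Var class and its fields are illustrative names only:

```python
class Var:
    """A toy scalar that records its inputs so gradients can flow back."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # parents: pairs of (input Var, local derivative of this node w.r.t. it)
        self.parents = parents

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

    def backward(self, upstream=1.0):
        # Accumulate the incoming gradient, then push it to the inputs,
        # scaled by each local derivative (chain rule).
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)


x = Var(3.0)
y = Var(3.0)
z = x * y + x      # z = x*y + x
z.backward()       # dz/dx = y + 1 = 4.0, dz/dy = x = 3.0
print(x.grad, y.grad)
```

When you call z.backward() on a tensor in PyTorch, the same kind of traversal happens over the recorded graph, and the results accumulate into x.grad and y.grad.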