How does backward() work in PyTorch?
```python
x = torch.tensor(xtrain[0], requires_grad=True)
y = torch.tensor(t[0], requires_grad=True)
print('x= ', x)
print('y= ', y)
z = make_prediction(x, y)
print('z= ', z)
z.backward()
print(x.grad, y.grad)
```

In the above code, z is a number. Also, x and t are the same number. (All elements are tensors of numbers.) How is the derivative calculated with backward()?
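For reference, here is a minimal, self-contained sketch of what happens in this kind of call. The `make_prediction` in the question is not shown, so the one below is a made-up stand-in (`z = x**2 + 3*y`); the point is only that autograd records the operations applied to tensors with `requires_grad=True` and that `backward()` on a scalar output walks that recorded graph with the chain rule, filling in `x.grad` and `y.grad`:

```python
import torch

# Hypothetical stand-in for the question's make_prediction (not the real one).
def make_prediction(x, y):
    return x**2 + 3 * y

x = torch.tensor(2.0, requires_grad=True)
y = torch.tensor(5.0, requires_grad=True)

z = make_prediction(x, y)   # z = 2**2 + 3*5 = 19.0; a scalar with a grad_fn
z.backward()                # apply the chain rule through the recorded graph

print(x.grad)  # dz/dx = 2*x = 4.0
print(y.grad)  # dz/dy = 3.0
```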