I have two PyTorch tensors, `x` and `y`, with the same shape. I would like to swap the data behind the two tensors, ideally without copying. The goal is that code elsewhere which holds onto the tensor `x` will afterwards read and write `y`'s data, and vice versa. A sketch of what I mean is below.
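For concreteness, here is a minimal example of the behavior I'm after. It uses `Tensor.set_()` as one candidate approach — that choice is my own guess, not something I've seen recommended — and `alias_of_x` is just an illustrative stand-in for the external reference:

```python
import torch

x = torch.zeros(4)
y = torch.arange(4, dtype=torch.float32)

alias_of_x = x  # stands in for code elsewhere that holds onto x

# Candidate swap via Tensor.set_: rotate the storages through a
# scratch tensor, so only metadata moves and no element data is copied.
tmp = torch.empty(0)
tmp.set_(x)   # tmp now views x's original storage
x.set_(y)    # x now views y's storage
y.set_(tmp)  # y now views x's original storage

print(alias_of_x)  # tensor([0., 1., 2., 3.]) -- the old reference now sees y's data
print(y)           # tensor([0., 0., 0., 0.]) -- and y now sees x's original data
alias_of_x[0] = 42.0
print(x[0])        # tensor(42.) -- writes through the old reference hit the swapped storage
```

I've also come across `torch.utils.swap_tensors` in newer PyTorch releases, which sounds related, but I'm not sure whether it's the intended tool for this.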
What’s the correct way to do this? Will this swap potentially break any code that holds onto references to these tensors? Will reference counting/GC for the tensors still work correctly after the swap?