I have a concatenated tensor that represents a batch of point clouds with pointwise features.
Shape: (batch_size * num_points, feature_dim)
I also have a 1-D tensor that contains, for each point, the index of the point cloud it belongs to (from 0 to batch_size - 1).
Shape: (batch_size * num_points,)
From those I want to compute a batched tensor of shape
(batch_size, max_num_points, feature_dim)
Since the number of points per point cloud varies, each cloud has to be padded to max_num_points.
I have a solution by looping over the indices, but is it possible to do this more efficiently with tensor operations alone?
import torch

def reshape_tensor(concatenated_tensor, point_cloud_indices, batch_size):
    feature_dim = concatenated_tensor.size(1)
    # The longest point cloud determines the padded length.
    max_points = max((point_cloud_indices == i).sum().item() for i in range(batch_size))
    reshaped_tensor = torch.zeros((batch_size, max_points, feature_dim))
    for i in range(batch_size):
        # Copy the points of cloud i into the first num_points slots; the rest stays zero.
        mask = (point_cloud_indices == i)
        num_points = mask.sum().item()
        reshaped_tensor[i, :num_points, :] = concatenated_tensor[mask]
    return reshaped_tensor
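
I suspect something along the lines of the following sketch could work (the name reshape_tensor_vectorized is just mine): it uses torch.bincount plus a stable torch.sort to compute each point's slot within its cloud, and assumes the indices are non-negative integers below batch_size and live on the same device as the features. I'm not sure whether it is correct or idiomatic, though:

import torch

def reshape_tensor_vectorized(concatenated_tensor, point_cloud_indices, batch_size):
    feature_dim = concatenated_tensor.size(1)
    # Points per cloud; the longest cloud determines the padded length.
    counts = torch.bincount(point_cloud_indices, minlength=batch_size)
    max_points = int(counts.max())

    # Stable sort groups points by cloud while keeping their original order.
    sorted_idx, perm = torch.sort(point_cloud_indices, stable=True)

    # Slot of each point inside its own cloud: running position minus the
    # start offset of that cloud in the sorted order.
    offsets = torch.cumsum(counts, dim=0) - counts
    within_idx = torch.arange(sorted_idx.numel(), device=sorted_idx.device) - offsets[sorted_idx]

    # Scatter everything into the zero-padded output in one indexing assignment.
    reshaped = concatenated_tensor.new_zeros((batch_size, max_points, feature_dim))
    reshaped[sorted_idx, within_idx] = concatenated_tensor[perm]
    return reshaped

Alternatively, if the points of each cloud are stored contiguously, would torch.split on the per-cloud counts followed by torch.nn.utils.rnn.pad_sequence(..., batch_first=True) be the cleaner option?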