I am working with a torch Dataset of 1d signals and would like to standardize the vectors to mean 0, std 1 before further processing. If I were dealing with an image, I could use torchvision.transforms:
import torch
import torchvision.transforms as transforms

# a fake 3-channel image of shape (C, H, W)
data_2d = torch.rand(3, 100, 100)
normalized_data_2d = transforms.Normalize(mean=(data_2d.mean(),), std=(data_2d.std(),))(data_2d)
print(f'mean: {normalized_data_2d.mean()} ~ 0 , std: {normalized_data_2d.std()} ~ 1, ok')
I get:
mean: -4.1373571235681084e-08 ~ 0 , std: 0.9999999403953552 ~ 1, ok
When I use 1d data in the same manner:
data_1d = torch.rand(100)
normalized_data_1d = transforms.Normalize(mean=(data_1d.mean(),), std=(data_1d.std(),))(data_1d)
I get an error instead:
TypeError: Tensor is not a torch image
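A workaround that seems to do the job (just a sketch, not verified against every torchvision version) is to view the vector as a fake (1, 1, N) image so that Normalize accepts it and then flatten the result back:

import torch
import torchvision.transforms as transforms

data_1d = torch.rand(100)
# Reshape the 1d vector into a (C=1, H=1, W=100) "image" so Normalize's
# image check passes; clone() guards against in-place normalization
# mutating the original vector through the shared view.
normalize = transforms.Normalize(mean=(data_1d.mean(),), std=(data_1d.std(),))
normalized_data_1d = normalize(data_1d.clone().view(1, 1, -1)).view(-1)
print(f'mean: {normalized_data_1d.mean()} ~ 0, std: {normalized_data_1d.std()} ~ 1')

But this feels hacky.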
Is there an elegant way to standardize 1d vectors using torchvision transforms?