I’m using a preprocessed, z-score-normalized list of images as the source for my dataset. Here’s a collage of images augmented by Albumentations:
[collage of augmented images]
Here’s my Compose:
import albumentations as A
from albumentations.pytorch import ToTensorV2

augmentation = A.Compose([
    A.HorizontalFlip(),
    A.RandomBrightnessContrast(brightness_limit=(-0.0001, 0.0001), contrast_limit=(-0.01, 0.01)),
    A.CoarseDropout(max_holes=8, max_height=0.1, max_width=0.1),
    A.Rotate(limit=15),
    A.Affine(shear=(-2, 2), scale=(0.95, 1.05)),
    ToTensorV2()
])
On the ~50% of images where RandomBrightnessContrast is applied, even with these very small parameters, the whole distribution of the image gets squashed to [0, 1] (instead of roughly [-2, 2], as expected for z-score-normalized images).
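If I understand correctly, this happens because Albumentations assumes float images lie in [0, 1] and clips the output of intensity transforms to that range. Here’s a minimal sketch that reproduces what I’m seeing, using a synthetic z-scored image in place of my real data:

import albumentations as A
import numpy as np

# Synthetic stand-in for one of my z-scored images: float32, roughly N(0, 1)
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, size=(64, 64, 3)).astype(np.float32)
print(img.min(), img.max())  # tails reach roughly -4 .. 4

aug = A.RandomBrightnessContrast(
    brightness_limit=(-0.0001, 0.0001),
    contrast_limit=(-0.01, 0.01),
    p=1.0,  # force the transform so the effect always shows
)
out = aug(image=img)["image"]
print(out.min(), out.max())  # squashed to 0.0 .. 1.0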
Any way around this?
Maybe I should perform z-score normalization after the augmentations instead, but my original intent was to separate all deterministic steps (resize, normalize, etc.) from the augmentation steps for efficiency.
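Something like the sketch below is what I have in mind; the A.Normalize mean/std are placeholder ImageNet stats (my real per-channel dataset stats would go there), and it assumes the pipeline is fed raw uint8 images rather than pre-normalized floats:

import albumentations as A
from albumentations.pytorch import ToTensorV2

augmentation = A.Compose([
    A.HorizontalFlip(),
    A.RandomBrightnessContrast(brightness_limit=(-0.0001, 0.0001), contrast_limit=(-0.01, 0.01)),
    A.CoarseDropout(max_holes=8, max_height=0.1, max_width=0.1),
    A.Rotate(limit=15),
    A.Affine(shear=(-2, 2), scale=(0.95, 1.05)),
    # z-score at the end: (img - mean * 255) / (std * 255) with the default max_pixel_value=255
    A.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),  # placeholder stats
    ToTensorV2()
])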