How can I pass an ImageDataGenerator to segmentation_models' U-Net? This is how I set up my generators:
from tensorflow.keras.preprocessing.image import ImageDataGenerator

data_generator = ImageDataGenerator(
    rescale=1. / 255.
)

# Images and masks use the same seed so the batches stay aligned
train_dataset_images = data_generator.flow_from_directory(
    directory=image_directory,
    target_size=(256, 256),
    class_mode=None,
    batch_size=32,
    seed=custom_seed
)
train_dataset_masks = data_generator.flow_from_directory(
    directory=mask_directory,
    target_size=(256, 256),
    batch_size=32,
    class_mode=None,
    color_mode='grayscale',
    seed=custom_seed
)

train_generator = zip(train_dataset_images, train_dataset_masks)
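For context, this is roughly how I build the U-Net and call fit. The backbone, loss, and the fit arguments below are just placeholders from my setup, not part of the error itself:

import segmentation_models as sm

# Plain binary U-Net; backbone, loss and metrics are placeholders
model = sm.Unet('resnet34', input_shape=(256, 256, 3),
                encoder_weights='imagenet', classes=1, activation='sigmoid')
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# This is the call that raises the ValueError
model.fit(
    train_generator,
    steps_per_epoch=len(train_dataset_images),
    epochs=10
)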
When I run this I get a ValueError saying "expected 1 input but received 2", so I tried combining the two generators with these functions:
def combine_generator(image_generator, mask_generator):
    while True:
        image_batch = image_generator.next()
        mask_batch = mask_generator.next()
        yield (image_batch, mask_batch)
and
def combine_generator(image_gen, mask_gen):
    for img, mask in zip(image_gen, mask_gen):
        yield img, mask
Neither of these seems to work either.
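For reference, this is roughly how I wire the combined generator in (again, steps_per_epoch and epochs are placeholders):

# Swap the zip for the combined generator and fit as before
train_generator = combine_generator(train_dataset_images, train_dataset_masks)

model.fit(
    train_generator,
    steps_per_epoch=len(train_dataset_images),
    epochs=10
)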
- How can I pass an ImageDataGenerator() to the U-Net?
- Do the images and their corresponding masks have to have the same filenames?
- Do the masks have to be one-hot encoded according to the classes? (See the small example below for what I mean by that.)
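To make the last question concrete, this is the kind of encoding I mean; num_classes and the mask shape here are just an assumed example, not my actual data:

import numpy as np
from tensorflow.keras.utils import to_categorical

num_classes = 3  # assumed number of mask classes
mask = np.random.randint(0, num_classes, size=(256, 256))  # dummy integer-label mask

# One-hot version has shape (256, 256, num_classes)
one_hot_mask = to_categorical(mask, num_classes=num_classes)
print(one_hot_mask.shape)  # (256, 256, 3)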