I’m building an LSTM model that takes multiple inputs (each with different features), and because of RAM constraints I have to create a custom generator for it:
class CustomTimeseriesGenerator(TimeseriesGenerator):
    def __getitem__(self, idx):
        x, y = super().__getitem__(idx)
        x_list = []
        col_index = 0
        for input_name in input_list:
            input_features = feature_dict[input_name]  # list of feature names for each input
            x_input = x[:, :, col_index:col_index + len(input_features)]
            x_list.append(x_input)
            col_index += len(input_features)
        return tuple(x_list), y
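As a quick sanity check of the slicing logic, here is a minimal pure-NumPy sketch (the `input_list` / `feature_dict` values below are made-up stand-ins) confirming that the per-input slices line up and jointly cover all feature columns:

```python
import numpy as np

# Hypothetical stand-ins: 2 inputs with 2 and 3 features respectively
feature_dict = {"a": ["f1", "f2"], "b": ["f3", "f4", "f5"]}
input_list = ["a", "b"]

# (batch, timesteps, total_features) = (2, 4, 5)
x = np.arange(2 * 4 * 5).reshape(2, 4, 5)

x_list, col_index = [], 0
for name in input_list:
    n = len(feature_dict[name])
    x_list.append(x[:, :, col_index:col_index + n])
    col_index += n

print([a.shape for a in x_list])  # [(2, 4, 2), (2, 4, 3)]
```

Concatenating the slices back along the feature axis reproduces the original array, so the split itself is lossless.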
I then use this custom generator in the fitting process:
train_gen = CustomTimeseriesGenerator(X.to_numpy(), y.to_numpy(), timesteps, batch_size=batch_size, end_index=train_size)
val_gen = CustomTimeseriesGenerator(X.to_numpy(), y.to_numpy(), timesteps, batch_size=batch_size, start_index=train_size + 1)
model.fit(train_gen, validation_data=val_gen, epochs=50)
However, it throws the error below at runtime (the model was compiled with tf.keras.metrics.F1Score):
ValueError: FBetaScore expects 2D inputs with shape (batch_size, output_dim), with output_dim fully defined (not None). Received input shapes: y_pred.shape=(None, 3) and y_true.shape=(None, None).
I have tried fitting on a small in-memory dataset (without the generator) and it works fine; both y_pred and y_true have shape (None, 3). So there seems to be no issue with the model implementation, only with the data generator.
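A pure-NumPy stand-in for how TimeseriesGenerator assembles one batch of targets (the sizes below are arbitrary) shows that the batch y should already be 2D with a fixed last dimension, which is why the (None, None) shape reported for y_true is surprising:

```python
import numpy as np

# Arbitrary example sizes, not the real dataset's
n_samples, n_classes = 100, 3
timesteps, batch_size = 4, 8

y_full = np.random.rand(n_samples, n_classes)

# TimeseriesGenerator pairs the window x[i:i + timesteps] with the target
# y[i + timesteps], so a batch of targets is a plain row slice of the
# 2D target array:
batch_rows = np.arange(timesteps, timesteps + batch_size)
y_batch = y_full[batch_rows]

print(y_batch.shape)  # (8, 3): 2D, with a fully defined last dimension
```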
Please let me know how to fix this issue.