I am trying to build a CNN model that recognizes whether an image is of my face or not, for educational purposes. However, as soon as I ran model.fit() and watched the training process, I immediately realized that the model is suffering from severe overfitting (training accuracy was close to 1 while val_accuracy stayed at 0.4849). I don't know whether I made a mistake choosing the learning rate, epochs, and batch size, whether the model is too complex, or whether I just don't have enough data.
This is the model; I will be happy to provide additional code if necessary. Thank you!
    import numpy as np
    import tensorflow
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D, Dropout,
                                         BatchNormalization, Flatten, Dense)
    from tensorflow.keras.initializers import RandomNormal
    from tensorflow.keras.optimizers import Adam

    size = 224
    optimizer = Adam()  # `optimizer` was undefined in the snippet; Adam with defaults is assumed here

    def create_model():
        model = Sequential([
            Input(shape=(size, size, 3)),
            # First hidden layer
            Conv2D(32, (3, 3), activation='relu',
                   kernel_regularizer=tensorflow.keras.regularizers.l2(0.01),
                   bias_initializer=RandomNormal()),
            Dropout(0.3),
            MaxPooling2D((2, 2)),
            # Second hidden layer (stride 1)
            Conv2D(64, (3, 3), 1, activation='relu',
                   kernel_regularizer=tensorflow.keras.regularizers.l2(0.01)),
            BatchNormalization(),
            Dropout(0.3),
            MaxPooling2D((2, 2)),
            # Third hidden layer
            Conv2D(128, (3, 3), 1, activation='relu'),
            MaxPooling2D((2, 2)),
            # Flatten the feature maps
            Flatten(),
            Dropout(0.3),  # Randomly drops 30% of activations to reduce overfitting
            BatchNormalization(),
            # Fully connected layer
            Dense(256, activation='relu',
                  kernel_regularizer=tensorflow.keras.regularizers.l2(0.01)),  # L2 regularization
            Dropout(0.3),
            Dense(1, activation='sigmoid')  # Probability output for binary classification
        ])
        # Manually rescale the first layer's kernel and re-randomize its bias
        prev_weight = model.layers[0].get_weights()
        if len(prev_weight) == 2:
            weight_mat, bias = prev_weight
            print("Current weight is {}".format(weight_mat))
            print("Current bias is {}".format(bias))
            random_bias = np.random.randn(*bias.shape)
            new_weight = [weight_mat * 0.1, random_bias]
            model.layers[0].set_weights(new_weight)
        else:
            raise ValueError("Unexpected number of weight tensors!")
        model.compile(loss="binary_crossentropy", optimizer=optimizer, metrics=["accuracy"])
        return model
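For reference, the manual re-initialization step inside create_model() can be illustrated in isolation with plain NumPy. This is a minimal sketch, assuming a kernel/bias pair shaped like what a Conv2D(32, (3, 3)) layer on RGB input returns from get_weights(); the array values here are placeholders, not the model's actual weights:

```python
import numpy as np

# Hypothetical shapes matching a Conv2D(32, (3, 3)) layer on a (size, size, 3) input
kernel = np.ones((3, 3, 3, 32), dtype=np.float32)
bias = np.zeros((32,), dtype=np.float32)

# Scale the kernel down by 0.1 and replace the bias with standard-normal noise,
# mirroring the list passed to set_weights() in create_model()
scaled_kernel = kernel * 0.1
random_bias = np.random.randn(*bias.shape)

new_weight = [scaled_kernel, random_bias]  # same structure get_weights() returned
```

Note that shrinking freshly initialized weights and re-randomizing the bias like this mostly just overrides the layer's initializers; it is unlikely to affect the overfitting gap described above.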