While running hyperparameter optimization with the Keras Tuner, I tried to add a custom metric and collect its results in a list, but I found that values computed at the end of training were not being saved in the variable defined in __init__.
The code below is a minimal reproduction: the value of self.aa does not increase even after training ends.
```python
from tensorflow import keras
from tensorflow.keras.layers import Dense, Flatten
import tensorflow as tf
import kerastuner as kt

(X_train, y_train), (X_test, y_test) = keras.datasets.fashion_mnist.load_data()
X_train = X_train.astype('float32') / 255.0
X_test = X_test.astype('float32') / 255.0

def model_builder(hp):
    model = keras.Sequential()
    model.add(Flatten(input_shape=(28, 28)))
    hp_units = hp.Int('units', min_value=32, max_value=96, step=32)
    model.add(Dense(units=hp_units, activation='relu'))
    model.add(Dense(10, activation='softmax'))
    hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=hp_learning_rate),
                  # the output layer already applies softmax, so from_logits must be False
                  loss=keras.losses.SparseCategoricalCrossentropy(from_logits=False),
                  metrics=['accuracy'])
    return model

tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt')

class custom_Metric(tf.keras.callbacks.Callback):
    def __init__(self):
        super().__init__()
        self.aa = 1

    def on_train_end(self, logs=None):  # Keras passes only `logs` to this hook
        print(f"aa is : {self.aa}")
        self.aa += 1

tuner.search(X_train,
             y_train,
             epochs=10,
             validation_data=(X_test, y_test),
             callbacks=[custom_Metric()])

best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
```
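My guess is that the tuner does not reuse the callback instance I pass in. If it copies the callback for each trial (an assumption about keras-tuner internals, not something the docs confirm), any increment would land on the copy and the original instance would stay untouched. A pure-Python sketch of that effect, using a stand-in class instead of a real Keras callback:

```python
import copy

class CounterCallback:
    """Stand-in for the custom callback; counts how often training ends."""
    def __init__(self):
        self.aa = 1

    def on_train_end(self, logs=None):
        self.aa += 1

original = CounterCallback()
# hypothetical per-trial behavior: the tuner works on a deep copy
trial_copy = copy.deepcopy(original)
trial_copy.on_train_end()

print(original.aa)    # 1 -- the instance I hold never changes
print(trial_copy.aa)  # 2 -- only the copy advanced
```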
I’m not sure whether this is a mistake on my part or a problem with the keras_tuner API. I want self.aa to increase after training ends. Please help me.
P.S. The same behavior occurs with both GridSearch and Hyperband.
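One workaround I am experimenting with: keep the accumulated results in a class-level attribute rather than an instance attribute. A class attribute lives on the class object, so it is shared even if each trial runs on a deep copy of the callback instance. A minimal sketch (again with a stand-in class, simulating three trials):

```python
import copy

class StatefulCallback:
    # class-level list: shared by every deep copy, so results accumulate
    results = []

    def on_train_end(self, logs=None):
        StatefulCallback.results.append(len(StatefulCallback.results) + 1)

cb = StatefulCallback()
for _ in range(3):  # simulate three tuner trials, each on a fresh deep copy
    copy.deepcopy(cb).on_train_end()

print(StatefulCallback.results)  # [1, 2, 3]
```

I have not verified that this matches what the tuner actually does internally, so treat it as a hypothesis.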