I’m working through a Keras/TensorFlow course that uses Keras 2 to build a variational autoencoder, and I’m trying to get it working in Keras 3. I’ve managed to overcome a lot of issues, but I’m stuck on this part and hoping someone can help me move forward.
To compute the loss, the original code does the following (which I’ve already switched over to ops.):
reconstruction_loss = ops.binary_crossentropy(inputs, outputs)  # per-pixel BCE between input and reconstruction
reconstruction_loss *= 784  # scale the per-pixel loss up to a per-image sum (784 input dimensions)
kl_loss = 0.5 * (ops.exp(z_log_var) - (1 + z_log_var) + ops.square(z_mean))  # KL divergence term
kl_loss = ops.sum(kl_loss, axis=-1)
total_vae_loss = ops.mean(reconstruction_loss + kl_loss)
It then adds the loss to the model, compiles, and fits:
vae_model.add_loss(total_vae_loss)
vae_model.compile(optimizer='rmsprop', metrics=['accuracy'])
vae_model.fit(x_train_flat, epochs=epochs, batch_size=batch_size)
I can see in the Keras docs that the add_loss feature is removed:
Symbolic Layer.add_loss(): Symbolic add_loss() is removed (you can still use add_loss() inside the call() method of a layer/model).
but I don’t really understand what adding it in a call() method looks like. I get the feeling that’s for building your own layers?
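My rough reading of that note is something like the sketch below, where the loss is registered from inside a custom layer’s call() (the KLDivergenceLayer name and the idea of wiring (z_mean, z_log_var) through it are just my guesses, not from the course):

from keras import layers, ops

class KLDivergenceLayer(layers.Layer):
    # registers the KL term via add_loss() inside call(), as the docs describe
    def call(self, inputs):
        z_mean, z_log_var = inputs
        kl_loss = 0.5 * (ops.exp(z_log_var) - (1 + z_log_var) + ops.square(z_mean))
        self.add_loss(ops.mean(ops.sum(kl_loss, axis=-1)))
        return inputs

(Presumably it would be called as z_mean, z_log_var = KLDivergenceLayer()([z_mean, z_log_var]) when building the encoder, but I’m not sure.)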
If I remove the add_loss line I get this error:
ValueError: No loss to compute. Provide a `loss` argument in `compile()`.
I’ve tried various ways of passing total_vae_loss as the loss argument to compile(), but I get errors, mostly saying it must be callable. From here I’m just firing in the dark: I’ve put the loss calculation into a function and passed that to compile(), roughly like this (the vae_loss name is mine, and z_mean/z_log_var are the encoder outputs from the snippet above):
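def vae_loss(y_true, y_pred):
    # same calculation as before, wrapped so compile() gets a callable
    reconstruction_loss = ops.binary_crossentropy(y_true, y_pred) * 784
    kl_loss = ops.sum(
        0.5 * (ops.exp(z_log_var) - (1 + z_log_var) + ops.square(z_mean)),
        axis=-1,
    )
    return ops.mean(reconstruction_loss + kl_loss)

vae_model.compile(optimizer='rmsprop', loss=vae_loss, metrics=['accuracy'])

That gets past the "must be callable" error, but when I call fit I get a ValueError: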
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[52], line 1
----> 1 vae_model.fit(x_train_flat,
2 epochs=epochs,
3 batch_size=batch_size)
File ~/code/python/mls/mlsenv/lib/python3.10/site-packages/keras/src/utils/traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
File ~/code/python/mls/mlsenv/lib/python3.10/site-packages/optree/ops.py:594, in tree_map(func, tree, is_leaf, none_is_leaf, namespace, *rests)
592 leaves, treespec = _C.flatten(tree, is_leaf, none_is_leaf, namespace)
593 flat_args = [leaves] + [treespec.flatten_up_to(r) for r in rests]
--> 594 return treespec.unflatten(map(func, *flat_args))
ValueError: None values not supported.
Is there a way to get this working? I can include the full stack trace, but I wasn’t sure of the best way to include that much text.