I have the following optimizer created using optax:
import optax

def create_optimizer(learning_rate=6.25e-2, beta1=0.4, beta2=0.999,
                     eps=2e-4, centered=False):
  """Creates an optimizer.

  Returns:
    An optax optimizer.
  """
  return optax.adam(learning_rate, b1=beta1, b2=beta2, eps=eps)
How can I update this learning rate manually during training?
I couldn’t find any documentation about that.
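The closest thing I came across is optax.inject_hyperparams, which (if I understand it correctly) stores numeric hyperparameters such as learning_rate inside the optimizer state, where they can be overwritten between update steps. Below is a minimal sketch of what I have in mind; the params and loss_fn are just dummies for illustration, and I'm not sure this is the intended approach:

import jax
import jax.numpy as jnp
import optax

# Wrap the optimizer factory so its numeric hyperparameters
# (including learning_rate) become part of the optimizer state.
optimizer = optax.inject_hyperparams(optax.adam)(
    learning_rate=6.25e-2, b1=0.4, b2=0.999, eps=2e-4)

# Dummy parameters and loss, purely for illustration.
params = {'w': jnp.ones(3)}

def loss_fn(p):
    return jnp.sum(p['w'] ** 2)

opt_state = optimizer.init(params)

for step in range(10):
    grads = jax.grad(loss_fn)(params)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)
    # The hyperparams dict lives in the state, so the learning rate
    # can seemingly be overwritten manually between steps:
    opt_state.hyperparams['learning_rate'] = 6.25e-2 * (0.99 ** step)

Is this the right pattern, or is there a more direct way to change the learning rate of a plain optax.adam during training?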