I’m trying to set the weights on one of the layers within my model, to no avail.
I’ve followed the solutions to similar problems online, but none of them seem to work for me. The variable `w` (as used in the code below) is a list of two numpy arrays: the first has shape (3, 3, 3, 64) and the second has shape (64,). I want to achieve functionality similar to that of the `weights` kwarg in tf 2.X, but can’t seem to get it working. Here’s my code and the error:
encoder = Sequential()
encoder.add(layers.Conv2D(64, (3, 3), activation='relu', padding='same', use_bias=False, input_shape=(SIZE, SIZE, 3)))
w = model.layers[0].get_weights()
encoder.layers[0].set_weights([w])
encoder.add(layers.MaxPooling2D((2, 2), padding='same'))
encoder.add(layers.Conv2D(32, (3, 3), activation='relu', padding='same', weights=model.layers[2].get_weights()))
encoder.add(layers.MaxPooling2D((2, 2), padding='same'))
encoder.add(layers.Conv2D(16, (3, 3), activation='relu', padding='same', weights=model.layers[4].get_weights()))
encoder.add(layers.MaxPooling2D((2, 2), padding='same'))
encoder.summary()
ERROR: ValueError: You called `set_weights(weights)` on layer 'conv2d_7' with a weight list of length 2, but the layer was expecting 1 weights.
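For context, the error can be reproduced with a minimal sketch (assuming `tensorflow.keras`; the layer sizes and `SIZE` here are placeholders, not the actual model): `get_weights()` on a biased Conv2D layer returns a list of two arrays, `[kernel, bias]`, while a layer built with `use_bias=False` stores only the kernel and therefore expects a weight list of length 1.

```python
from tensorflow.keras import layers, Sequential

SIZE = 32  # hypothetical input size for this sketch

# Source layer built with a bias: get_weights() returns [kernel, bias].
src = Sequential([layers.Conv2D(64, (3, 3), padding='same',
                                input_shape=(SIZE, SIZE, 3))])
w = src.layers[0].get_weights()
# len(w) == 2; w[0].shape == (3, 3, 3, 64); w[1].shape == (64,)

# Target layer with use_bias=False holds only the kernel variable,
# so it expects a weight list of length 1.
dst = Sequential([layers.Conv2D(64, (3, 3), padding='same', use_bias=False,
                                input_shape=(SIZE, SIZE, 3))])
try:
    dst.layers[0].set_weights(w)   # list of length 2 -> ValueError
    raised = False
except ValueError:
    raised = True

dst.layers[0].set_weights([w[0]])  # passing only the kernel succeeds
```

The same mismatch would also occur via the `weights=` kwarg if the bias arrays are included for a bias-free layer.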