I am working with a simple TensorFlow model given by
model = models.Sequential()
model.add(InputLayer(input_shape = d.shape))
model.add(Ident(NlastDim = d.shape[-1]))
model.add(Projection(Nproj = Nproj))
model.add(Dense(Nout, activation = 'linear'))
model.compile(optimizer = 'adam', loss = 'mse', metrics=[tf.keras.metrics.RootMeanSquaredError()])
prior = MyPrior(simModel['prior'])
trainingGenerator = DataGenerator(simModel, N, NperBatch, Nbatches, prior)
validationGenerator = DataGenerator(simModel, N, Nho, 1, prior)
model.summary()
model.fit(trainingGenerator, validation_data = validationGenerator)
The Ident and Projection layers are custom layers. Ident is defined as an identity layer for debugging purposes. model.summary prints the model as follows:
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 ident (Ident)               (None, 2000, 3)           0
 projection (Projection)     (None, 4)                 0
 dense (Dense)               (None, 2)                 10
=================================================================
Total params: 10 (40.00 Byte)
Trainable params: 10 (40.00 Byte)
Non-trainable params: 0 (0.00 Byte)
However, after this the Ident layer is called with
[Ident] call in-shape: (None, None, None)
After poking around, it turns out that a Dense layer, placed first, also gets called with this all-None shape. However, it can still produce an output with a defined innermost dimension, based on the number of units specified. In my custom layers, I could not find a way to deal with this case properly. Using a placeholder results in:
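For comparison, here is a minimal standalone sketch (the layer name FixLastDim is hypothetical, not part of my code) that pins the last dimension using the dynamic shape tf.shape(inputs), which is fully defined at run time even when the static shape inputs.shape contains None entries:

```python
import tensorflow as tf

# Hypothetical sketch: pin the last dimension via the dynamic shape
# tf.shape(inputs), which is always fully defined at run time, even when
# the static shape inputs.shape contains None entries.
class FixLastDim(tf.keras.layers.Layer):
    def __init__(self, NlastDim=1, **kwargs):
        super().__init__(**kwargs)
        self.NlastDim = NlastDim

    def call(self, inputs):
        if inputs.shape[-1] is None:
            # Target shape = dynamic leading dims + the known last dim.
            newShape = tf.concat([tf.shape(inputs)[:-1], [self.NlastDim]], axis=0)
            return tf.reshape(inputs, newShape)
        return inputs
```

Tracing this with an all-None input signature succeeds where building the shape from inputs.shape raises the ValueError below.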
ValueError: Exception encountered when calling layer 'ident' (type Ident).

in user code:

    File "/home/pingu/Documents/Projekte/2024-02-AutomaticStatistician self/src/./freqFlow.py", line 205, in call  *
        return tf.reshape(inputs, shape = inputs.shape[:-1] + (self.NlastDim,))

    ValueError: Tried to convert 'shape' to a tensor and failed. Error: Cannot convert a partially known TensorShape (None, None, 3) to a Tensor.

Call arguments received by layer 'ident' (type Ident):
  • inputs=tf.Tensor(shape=(None, None, None), dtype=float32)
The Ident layer is defined as follows:
class Ident(tf.keras.layers.Layer):
    def __init__(self, NlastDim = 1, **kwargs):
        super().__init__(**kwargs)
        self.NlastDim = NlastDim
        Log(2, f'[Ident] NlastDim={NlastDim}')

    def build(self, input_shape):
        Log(2, f'[Ident] build input_shape={input_shape}')
        super().build(input_shape)

    def call(self, inputs):
        Log(2, f'[Ident] call in-shape: {inputs.shape}')
        dimLast = inputs.shape[-1]
        if dimLast is None:
            return tf.reshape(inputs, shape = inputs.shape[:-1] + (self.NlastDim,))
        return inputs
I tried various solutions for dealing with the all-None shape, such as implementing compute_output_shape (which never gets called) or intercepting special cases in the code to impose a defined innermost dimension. I did not expect an input tensor with an all-None shape. I do use data generators for this model.
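For reference, a minimal sketch of how the input shapes could be declared explicitly via tf.data.Dataset.from_generator with an output_signature, so that layers are traced with (None, 2000, 3) rather than (None, None, None). The sizes 2000, 3, and 2 are taken from the model summary above; the generator body is a placeholder, not my actual DataGenerator:

```python
import tensorflow as tf

# Placeholder generator standing in for DataGenerator; yields one
# (input, target) pair per step with the shapes from the summary above.
def gen():
    while True:
        yield tf.zeros([2000, 3]), tf.zeros([2])

# output_signature pins the static shapes of each element, so the model
# sees fully defined feature dimensions at trace time.
dataset = tf.data.Dataset.from_generator(
    gen,
    output_signature=(
        tf.TensorSpec(shape=(2000, 3), dtype=tf.float32),
        tf.TensorSpec(shape=(2,), dtype=tf.float32),
    ),
).batch(8)
```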
Thank you in advance.