I am trying to create a simple recurrent model using Keras. My objective is to have multiple layers, and to connect only the final output of the last layer as an input to the first layer at the next timestep. My understanding is that this cannot be achieved by stacking SimpleRNN layers, because each SimpleRNN layer adds its own internal recurrence, rather than a single feedback connection from the very end back to the very beginning.
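For reference, this is the stacked-SimpleRNN version I am trying to avoid. Each SimpleRNN feeds its own previous output back into itself, so there is no path from the final 5-dim output back to the first layer:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Each SimpleRNN layer recurs over its OWN previous output;
# the final Dense output is never fed back to the first layer.
stacked = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 5)),            # (batch, time, 5)
    layers.SimpleRNN(20, return_sequences=True),
    layers.SimpleRNN(20, return_sequences=True),
    layers.Dense(5),
])
```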
Here is a non-functional example of what I would like to achieve:
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model
base_input = Input(shape=[2, 5], name='base_input')
dense_1 = Dense(20, name='dense_1')(base_input)  # input_shape is redundant here; the Input layer already defines it
dense_2 = Dense(20, name='dense_2')(dense_1)
base_output = Dense(units=5, name='base_output')(dense_2)
base_model = Model(inputs=[base_input], outputs=base_output)
outer_input = Input(shape=[1, 5], name='outer_input')
outer_model = Model(inputs=[outer_input, base_output], outputs=base_output)
As you can see, the base model takes 2×5 inputs and produces 5 outputs. I want those inputs to be the concatenation of the 5 "real" inputs and the 5 outputs from the previous timestep. Clearly there is a problem: simply specifying the same tensor base_output as both an input and an output is not what Keras expects.
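To make the recurrence I am after concrete, here is a plain-NumPy sketch of the loop, with a fixed random matrix standing in for the whole Dense stack (the names here are just placeholders, not real API):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((10, 5))  # stand-in for the dense_1 -> dense_2 -> base_output stack

def base_model_step(x_t, prev_out):
    """One timestep: concatenate the 5 'real' inputs with the 5 previous
    outputs, then run the (stand-in) base model on the 10 values."""
    concat = np.concatenate([x_t, prev_out])  # shape (10,)
    return np.tanh(concat @ W)                # shape (5,)

# Manually unrolled recurrence over a toy sequence.
sequence = rng.standard_normal((4, 5))  # 4 timesteps of 5 "real" inputs each
state = np.zeros(5)                     # previous output starts at zero
outputs = []
for x_t in sequence:
    state = base_model_step(x_t, state)  # new output becomes next timestep's state
    outputs.append(state)
```

This is exactly the loop I would like Keras to express for me.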
Is there a clean way of doing this? I would prefer to use only simple Dense layers as components, but that may not be possible.
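One direction I have considered (I am not sure it is the cleanest or intended way) is wrapping the Dense stack in a custom cell for keras.layers.RNN, so that the cell's recurrent state is simply the previous final output:

```python
import tensorflow as tf
from tensorflow.keras import layers

class FeedbackCell(layers.Layer):
    """A cell that concatenates the external 5-dim input with the previous
    5-dim output, runs the Dense stack on the result, and returns the new
    output as both the cell output and the recurrent state."""

    def __init__(self, units=5, **kwargs):
        super().__init__(**kwargs)
        self.state_size = units   # the recurrent state is just the previous output
        self.output_size = units
        self.dense_1 = layers.Dense(20, name='dense_1')
        self.dense_2 = layers.Dense(20, name='dense_2')
        self.base_output = layers.Dense(units, name='base_output')

    def call(self, inputs, states):
        prev_out = states[0]                               # (batch, 5)
        x = tf.concat([inputs, prev_out], axis=-1)         # (batch, 10)
        y = self.base_output(self.dense_2(self.dense_1(x)))  # (batch, 5)
        return y, [y]  # output, and the same tensor as next state

seq_in = tf.keras.Input(shape=(None, 5))  # (batch, time, 5) of "real" inputs
model = tf.keras.Model(seq_in, layers.RNN(FeedbackCell(5), return_sequences=True)(seq_in))
```

Is this the idiomatic approach, or is there something simpler?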