Tag Archive for: tensorflow, nlp, tensorflow2.0

Why does TensorFlow graph execution require different shapes than eager execution?

My inputs are 10–100 tokens and my outputs are ~1000 tokens, so I use separate embeddings for each to keep things computationally efficient. I’m tempted to conclude that there’s simply no way to have inputs and outputs of different lengths in graph execution, but that can’t be right, because at multiple points in this project the same code has worked fine. Maybe it depends on the TensorFlow version?
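
For context, here is a minimal sketch of the pattern I mean (the layer, vocabulary size, and embedding dimension are placeholders, not my actual model). As far as I understand, leaving the sequence dimension as None in a tf.function input_signature lets one traced graph accept variable-length batches, much like eager execution does:

import tensorflow as tf

# Placeholder sizes for illustration only.
vocab_size, embed_dim = 32000, 128
embedding = tf.keras.layers.Embedding(vocab_size, embed_dim)

@tf.function(input_signature=[tf.TensorSpec(shape=[None, None], dtype=tf.int32)])
def embed(tokens):
    # tokens: [batch, seq_len] with seq_len left unspecified, so the same
    # traced graph handles both short inputs and long outputs without retracing.
    return embedding(tokens)

short = embed(tf.zeros([2, 50], dtype=tf.int32))    # e.g. the 10-100 token input side
long_ = embed(tf.zeros([2, 1000], dtype=tf.int32))  # e.g. the ~1000 token output side
print(short.shape, long_.shape)

Without the explicit input_signature (or with fully fixed shapes), my understanding is that tf.function retraces a new graph for each distinct static shape it sees, which may be why the same code sometimes works and sometimes complains about shapes.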