We use TensorFlow Serving to serve models in production, and we have a use case where the output of the model is a ragged tensor. To check whether TensorFlow Serving supports ragged tensors as output, we put together this toy example:
```
import tensorflow as tf

# Build a minimal model that maps a ragged input to a ragged output
inputs = tf.keras.Input(shape=(1,), ragged=True, name='input_1')
output = tf.keras.layers.Lambda(lambda x: x + 1, dtype=tf.float32)(inputs)
model = tf.keras.Model(inputs=[inputs], outputs=output)
model.compile()

# Serialise/deserialise to the SavedModel format used for serving
model.save("./my_model/1", save_format="tf")
model = tf.keras.models.load_model("./my_model/1")

# Make predictions with ragged tensors
x = tf.ragged.constant([[1, 2, 3], [4, 5]], dtype=tf.float32)
out = model.predict([x])
print(out)
tf.debugging.assert_equal(out, tf.ragged.constant([[2, 3, 4], [5, 6]], dtype=tf.float32))
print("All good!")
```
We save the model to local disk and then load it via TensorFlow Serving. I used [saved_model_cli][1] to inspect the model's signatures. The model output has dtype `DT_INVALID`, so I suspect TensorFlow Serving will fail to load this model.
```
user@MR26DF61QG ~ % saved_model_cli show --dir /Users/user/ragged_tensor/my_model/1 --all

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['args_0'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: serving_default_args_0:0
    inputs['args_0_1'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: serving_default_args_0_1:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['lambda'] tensor_info:
        dtype: DT_INVALID
        shape: ()
        name:
  Method name is: tensorflow/serving/predict

The MetaGraph with tag set ['serve'] contains the following ops: {'StringJoin', 'VarHandleOp', 'ReadVariableOp', 'ShardedFilename', 'Placeholder', 'Select', 'AssignVariableOp', 'Const', 'Pack', 'MergeV2Checkpoints', 'RestoreV2', 'StatefulPartitionedCall', 'NoOp', 'DisableCopyOnRead', 'Identity', 'StaticRegexFullMatch', 'PartitionedCall', 'AddV2', 'SaveV2'}

Concrete Functions:
  Function Name: '__call__'
    Option #1
      Callable with:
        Argument #1
          DType: RaggedTensorSpec
          Value: RaggedTensorSpec(TensorShape([None, None]), tf.float32, 1, tf.int64)
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: NoneType
          Value: None
    Option #2
      Callable with:
        Argument #1
          DType: RaggedTensorSpec
          Value: RaggedTensorSpec(TensorShape([None, None]), tf.float32, 1, tf.int64)
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: NoneType
          Value: None

  Function Name: '_default_save_signature'
    Option #1
      Callable with:
        Argument #1
          DType: RaggedTensorSpec
          Value: RaggedTensorSpec(TensorShape([None, None]), tf.float32, 1, tf.int64)

  Function Name: 'call_and_return_all_conditional_losses'
    Option #1
      Callable with:
        Argument #1
          DType: RaggedTensorSpec
          Value: RaggedTensorSpec(TensorShape([None, None]), tf.float32, 1, tf.int64)
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: NoneType
          Value: None
    Option #2
      Callable with:
        Argument #1
          DType: RaggedTensorSpec
          Value: RaggedTensorSpec(TensorShape([None, None]), tf.float32, 1, tf.int64)
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: NoneType
          Value: None
```
Does TensorFlow Serving support ragged tensors as outputs?
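If not, one workaround we are considering is to export an explicit serving signature that returns the ragged output's dense components (flat values and row splits) instead of the ragged tensor itself. A rough sketch against the toy model above, not yet verified with TensorFlow Serving (`serve_ragged` is just an illustrative name):

```
@tf.function(input_signature=[
    tf.RaggedTensorSpec(shape=[None, None], dtype=tf.float32, ragged_rank=1)
])
def serve_ragged(x):
    y = model(x)  # RaggedTensor produced by the Lambda layer
    # Decompose the ragged result into two dense tensors, which a
    # SignatureDef can represent with ordinary dtypes.
    return {"flat_values": y.flat_values, "row_splits": y.row_splits}

tf.saved_model.save(model, "./my_model/1",
                    signatures={"serving_default": serve_ragged})
```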
[1]: https://www.tensorflow.org/guide/saved_model#the_savedmodel_format_on_disk