I’m currently working on a web application where I need to use custom machine learning models with TensorFlow.js. I’ve successfully converted my Keras model to TensorFlow.js format, but I’m encountering an issue when loading the model in my React application.
When I try to load the model, I get the following error:
Error loading model: Error: An InputLayer should be passed either a `batchInputShape` or an `inputShape`.
at new InputLayer (@tensorflow_tfjs.js:6426:15)
at fromConfig (chunk-XU22ZDBH.js:13275:16)
at deserializeKerasObject (@tensorflow_tfjs.js:4279:25)
at deserialize (@tensorflow_tfjs.js:7594:10)
at fromConfig (@tensorflow_tfjs.js:11603:21)
at deserializeKerasObject (@tensorflow_tfjs.js:4279:25)
at deserialize (@tensorflow_tfjs.js:7594:10)
at loadLayersModelFromIOHandler (@tensorflow_tfjs.js:11046:18)
at async loadModel (loadModels.js:9:17)
at async loadModelAsync (Camera.jsx:16:29)
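If I understand the error correctly, the deserializer only accepts one of two keys for the input shape. Here is a rough sketch of the check as I understand it (my own illustration, not the actual tfjs source):

```javascript
// My sketch of the check the InputLayer deserializer seems to perform.
function resolveInputShape(layerConfig) {
  // The loader accepts a batch-inclusive shape...
  if (layerConfig.batchInputShape != null) return layerConfig.batchInputShape;
  // ...or a per-sample shape, which it prepends a null batch dimension to...
  if (layerConfig.inputShape != null) return [null, ...layerConfig.inputShape];
  // ...but the Keras 3 key `batch_shape` is neither, so loading fails here.
  throw new Error(
    'An InputLayer should be passed either a `batchInputShape` or an `inputShape`.'
  );
}

// The config my converter produced only carries the Keras 3 key:
const keras3LayerConfig = { batch_shape: [null, 224, 224, 3], dtype: 'float32' };
```

So it looks like the batch_shape entry in my model.json is simply never picked up.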
I converted my Keras model using the TensorFlow.js converter, and I checked the JSON: the model does have batch_shape entries.
Here is the relevant part of my model.json:
{
  "format": "layers-model",
  "generatedBy": "keras v3.3.2",
  "convertedBy": "TensorFlow.js Converter v4.19.0",
  "modelTopology": {
    "keras_version": "3.3.2",
    "backend": "tensorflow",
    "model_config": {
      "class_name": "Sequential",
      "config": {
        "name": "sequential",
        "trainable": true,
        "dtype": "float32",
        "layers": [
          {
            "class_name": "InputLayer",
            "config": {
              "batch_shape": [null, 224, 224, 3],
              "dtype": "float32",
              "sparse": false,
              "name": "input_layer_1"
            }
          },
          {
            "class_name": "Functional",
            "config": {
              "name": "mobilenetv2_1.00_224",
              "trainable": false,
              "layers": [
                {
                  "class_name": "InputLayer",
                  "config": {
                    "batch_shape": [null, 224, 224, 3],
                    "dtype": "float32",
                    "sparse": false,
                    "name": "input_layer"
                  },
                  "name": "input_layer",
                  "inbound_nodes": []
                },
I even tried manually changing "batch_shape" to "batchInputShape" or "inputShape", but then I get a "config corrupted" error.
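To make sure I wasn't missing an occurrence (the Functional submodel contains its own InputLayer), I also scripted the rename so it reaches every nesting level. This is just a sketch of the edit I attempted, run with Node on the parsed JSON:

```javascript
// Recursively rename every Keras 3 `batch_shape` key to the
// `batchInputShape` key I was trying, at any nesting depth.
function renameBatchShape(node) {
  if (Array.isArray(node)) {
    node.forEach(renameBatchShape);
  } else if (node !== null && typeof node === 'object') {
    if ('batch_shape' in node) {
      node.batchInputShape = node.batch_shape; // the key I tried
      delete node.batch_shape;
    }
    Object.values(node).forEach(renameBatchShape);
  }
  return node;
}

// Example on an in-memory fragment shaped like my model.json:
const fragment = {
  class_name: 'InputLayer',
  config: { batch_shape: [null, 224, 224, 3], dtype: 'float32' }
};
renameBatchShape(fragment);
```

For the real file I read model.json with fs.readFileSync, ran this over the parsed object, and wrote it back, but that produced the same "config corrupted" error.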
Does anyone have an idea what's going on?
Thank you.