I’m attempting to integrate a custom TensorFlow Lite (TFLite) model, created using Teachable Machine, into a Flutter app. Initially, I followed the live object detection example provided in the tflite_flutter plugin repository, specifically the live_object_detection_ssd_mobilenet example.
I successfully ran the example with the provided model (ssd_mobilenet.tflite), but after replacing it with my custom TFLite model, the app stopped working as expected: it no longer displayed bounding boxes or statistics for detected objects.
Here are the steps I took:
- Downloaded and replaced the default ssd_mobilenet.tflite model with my custom model.
- Updated the model path in detection_service.dart to point to my custom model.
- Checked the model in Netron and made sure that mlModelInputSize in the code matches the input size my model was trained with (224 instead of the default 300).
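Concretely, here is a rough sketch of what those edits look like on my side (the constant names and asset paths below are approximations from memory, not the example's verbatim code):

// detection_service.dart (sketch): point the asset constants at my custom
// files instead of the bundled SSD MobileNet ones. Names are placeholders.
static const String _modelPath = 'assets/models/my_teachable_machine.tflite';
static const String _labelPath = 'assets/models/my_labels.txt';

// Netron reports my model's input as 224x224, so I changed the constant:
static const int mlModelInputSize = 224;

For reference, the relevant part of the example's _DetectorServer class looks like this: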
class _DetectorServer {
  /// Input size of image (height = width = 300)
  static const int mlModelInputSize = 300;

  /// Result confidence threshold
  static const double confidence = 0.5;

  Interpreter? _interpreter;
  List<String>? _labels;

  _DetectorServer(this._sendPort);

  final SendPort _sendPort;

  ...
}
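In case it helps with answering: below is a small debugging sketch of my own (not part of the example) that dumps a model's tensor layout with tflite_flutter, so I can compare my Teachable Machine model against ssd_mobilenet.tflite. As far as I understand, the example's post-processing expects the four SSD output tensors (locations, classes, scores, number of detections), so a mismatch there would presumably explain the missing bounding boxes. The asset path in the usage comment is a placeholder for my model file.

import 'package:tflite_flutter/tflite_flutter.dart';

/// Debugging sketch: prints every input/output tensor of a .tflite asset so
/// its layout can be compared with the model the example ships with.
Future<void> inspectModel(String assetPath) async {
  final interpreter = await Interpreter.fromAsset(assetPath);

  for (final tensor in interpreter.getInputTensors()) {
    print('input : shape=${tensor.shape} type=${tensor.type}');
  }
  for (final tensor in interpreter.getOutputTensors()) {
    print('output: shape=${tensor.shape} type=${tensor.type}');
  }

  interpreter.close();
}

// Usage (the asset path is a placeholder for my custom model):
// await inspectModel('assets/models/my_teachable_machine.tflite');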
Additional:
- Default model
- My TensorFlow model
I’m new to TFLite, so I would appreciate clear guidance on how to use my own model with the example above. If you have any other suggestions or insights related to this issue, please let me know. Thank you.