I have an ML inference service built on FastAPI and running in a container image.
The service exposes a single endpoint, `/infer`. It first validates a JWT against a separate authentication service written in another language (PHP). If the token is valid and the trained model does not yet exist locally, it downloads the model to the container's local storage. If the token is valid and the model exists, it performs preprocessing, inference, and postprocessing, then returns the response. These steps depend on Python packages such as NumPy.
The ML stack uses Keras and TensorFlow, and the trained model is stored on the container's local storage.
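For context, the current `/infer` flow looks roughly like this. This is a minimal sketch, not the real service: the function names, the model path, and the stub bodies are hypothetical placeholders, and the actual implementation uses FastAPI, NumPy, and TensorFlow/Keras where the stubs appear.

```python
import os

MODEL_PATH = "/tmp/model.keras"  # hypothetical local path inside the container

def validate_jwt(token: str) -> bool:
    # Placeholder: the real service calls the external PHP auth service here.
    return token == "valid-token"

def download_model(path: str) -> None:
    # Placeholder: the real service fetches the trained Keras/TensorFlow
    # model from remote storage and writes it to the container's local disk.
    with open(path, "w") as f:
        f.write("model-weights")

def run_inference(payload: dict) -> dict:
    # Placeholder for preprocessing -> model inference -> postprocessing,
    # which in the real service depend on NumPy/TensorFlow/Keras.
    return {"prediction": sum(payload.get("features", []))}

def infer(token: str, payload: dict) -> dict:
    if not validate_jwt(token):
        return {"error": "unauthorized"}
    if not os.path.exists(MODEL_PATH):
        download_model(MODEL_PATH)
    return run_inference(payload)
```

For example, `infer("valid-token", {"features": [1, 2, 3]})` follows the happy path: the token passes, the model file is downloaded if missing, and the stub inference returns a result. It is the Python-dependent parts (the stubs above) that I am unsure how to carry over to Rust.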
Is it possible to migrate the `/infer` endpoint to Rust?
If so, what frameworks, libraries, and dependencies would I need?
Or is this actually impossible, since TensorFlow, Keras, and other deep learning frameworks are focused on Python? In that case, not only Rust but other languages such as Go might also be a poor fit for this migration, which would imply I shouldn't migrate at all.