How to reduce the size of a Docker image containing an ML model alongside Node.js APIs?
I have an application server in which I call Python scripts via child processes in Node.js; these scripts are responsible for running the ML model. The server works fine, but the problem is the large set of dependencies required to run the ML model, which makes the Docker image very large.
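For context, here is a simplified sketch of the setup (the Express route and the `predict.py` script name are placeholders, not the actual project code):

```js
// Node.js API that delegates ML inference to a Python script via child_process.
const { spawn } = require('child_process');
const express = require('express');

const app = express();
app.use(express.json());

app.post('/predict', (req, res) => {
  // 'predict.py' is a placeholder: it loads the ML model, reads JSON from
  // stdin, and prints a JSON result to stdout.
  const py = spawn('python3', ['predict.py'], { stdio: ['pipe', 'pipe', 'inherit'] });

  let output = '';
  py.stdout.on('data', (chunk) => (output += chunk));
  py.on('close', (code) => {
    if (code !== 0) return res.status(500).json({ error: 'inference failed' });
    res.json(JSON.parse(output));
  });

  // Forward the request payload to the Python process over stdin.
  py.stdin.write(JSON.stringify(req.body));
  py.stdin.end();
});

app.listen(3000);
```

Because the same image has to ship Node.js, Python, and the ML libraries the script needs, the final image ends up very large. How can I reduce its size?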