I have a Dockerfile for a Cloud Run app. It has certain idiosyncrasies that add size to the image, like the fact that I have to "RUN pip uninstall -y opencv-python && pip install --no-cache-dir opencv-python-headless" for the ultralytics package to work. How would you guys suggest optimizing it? My imports are:
import os
from ultralytics import YOLO
import io
import vertexai
from vertexai.preview.generative_models import GenerativeModel
from vertexai.preview.generative_models import Image as ImageVertex
from PIL import Image
from google.cloud import storage
from google.cloud import bigquery
from datetime import datetime, timezone
from flask import Flask, render_template, request, redirect, url_for, flash, session, jsonify, send_from_directory
from werkzeug.utils import secure_filename
The Dockerfile looks like this:
FROM python:3.12.3-slim
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        ffmpeg \
        libsm6 \
        libxext6 && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* && \
    pip install --no-cache-dir \
        numpy \
        pillow && \
    pip install --no-cache-dir \
        ipython \
        opencv-python \
        google-cloud-vision \
        google-cloud-aiplatform \
        gunicorn \
        google-cloud-storage \
        vertexai \
        flask && \
    pip install --no-cache-dir \
        torch torchvision --index-url https://download.pytorch.org/whl/cpu && \
    pip install --no-cache-dir \
        ultralytics
RUN pip uninstall -y opencv-python && pip install --no-cache-dir opencv-python-headless
# Set the environment variable for the port
ENV PORT 8080
# Set the working directory
WORKDIR /home
# Copy the current directory contents into the container at /home
COPY . /home
# Expose the application port
EXPOSE 8080
# Command to run the application
CMD ["python3", "main.py"]
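For what it's worth, here's roughly what I was aiming for — a sketch, not a drop-in replacement: it installs opencv-python-headless up front and then ultralytics with --no-deps so the full GUI OpenCV never lands in a layer, drops ipython and google-cloud-vision (neither appears in my imports), and adds google-cloud-bigquery (which I import but never installed). The manual dependency list for ultralytics is a guess for my version and would need checking with pip check:

```dockerfile
# Sketch only: assumes the headless OpenCV build satisfies ultralytics at
# runtime, and that ffmpeg is still needed for video decoding.
FROM python:3.12.3-slim

WORKDIR /home
ENV PORT 8080

RUN apt-get update && \
    apt-get install -y --no-install-recommends ffmpeg && \
    rm -rf /var/lib/apt/lists/* && \
    # CPU-only torch wheels first, so nothing later pulls the CUDA build
    pip install --no-cache-dir torch torchvision --index-url https://download.pytorch.org/whl/cpu && \
    # headless OpenCV before ultralytics; --no-deps stops ultralytics from
    # dragging in the full opencv-python, so no uninstall step is needed
    pip install --no-cache-dir opencv-python-headless && \
    pip install --no-cache-dir --no-deps ultralytics && \
    # ultralytics' remaining deps plus this app's own requirements,
    # installed by hand (list needs verifying against `pip check`)
    pip install --no-cache-dir numpy pillow pyyaml requests tqdm psutil \
        google-cloud-aiplatform google-cloud-storage google-cloud-bigquery \
        vertexai flask gunicorn

COPY . /home
EXPOSE 8080
CMD ["python3", "main.py"]
```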
It's a 2.4 GB image and I don't know how to proceed. I tried separating build and runtime stages and it barely made any difference.