I would like to write a docker-compose.yml that starts ollama (as in ollama serve) on port 11434 and then creates mymodel from ./Modelfile.
I found a similar question about running ollama with Docker Compose (Run ollama with docker-compose and using gpu), but it does not cover how to create a model afterwards.
I tried to use the following:
version: '3'
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_volume:/root/.ollama
    command: ollama create mymodel -f ./Modelfile
volumes:
  ollama_volume:
This fails with unknown command "ollama" for "ollama", so I thought that maybe the ollama command-line tool is not installed and I could use curl and their API instead, but curl is not available either. I also saw some people using bash -c "some command", but bash is apparently not found in the image.
How can I create the model from within docker-compose, if that is possible at all?