I am trying to connect a local Ollama model (Llama 2), which listens on port 11434 on my local machine, to my Docker container running Ubuntu 22.04. I can confirm the Ollama model definitely works and is accessible at http://localhost:11434/. In the Docker container I am also running the gmailctl service, and I was able to successfully connect to the Gmail API to read and send emails from my Google account. Now I want to wait for an incoming email and let the LLM answer it back to the sender. However, I am not able to publish port 11434 in order to connect the model with the container.
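For reference, this is roughly how I expect the container-side code to reach the model (a minimal sketch; the `host.docker.internal` alias and the `OLLAMA_HOST` override are assumptions on my part — on Linux that alias only resolves if the container is started with `--add-host=host.docker.internal:host-gateway`):

```python
import os
import urllib.error
import urllib.request

# Assumption: from inside the container, the host machine is reachable as
# "host.docker.internal" (on Linux this requires starting the container with
# --add-host=host.docker.internal:host-gateway). OLLAMA_HOST is a hypothetical
# override so the base URL can be changed without editing the code.
base = os.environ.get("OLLAMA_HOST", "http://host.docker.internal:11434")
print("base URL:", base)

try:
    # Ollama answers plain GET / with a short status message when it is up.
    with urllib.request.urlopen(base, timeout=2) as resp:
        print("Ollama reachable, HTTP status:", resp.status)
except (urllib.error.URLError, OSError) as exc:
    print("Ollama not reachable from here:", exc)
```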
I tried setting up the devcontainer.json file to forward the ports:
```json
{
  "name": "therapyGary",
  "build": {
    "context": "..",
    "dockerfile": "../Dockerfile"
  },
  "forwardPorts": [80, 8000, 8080, 11434]
}
```
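As far as I understand, `forwardPorts` forwards ports *from* the container *to* the local machine, not the other way around, so I also considered this variant (a sketch, untested — using `runArgs` to add the host gateway alias is an assumption on my part):

```json
{
  "name": "therapyGary",
  "build": { "context": "..", "dockerfile": "../Dockerfile" },
  "runArgs": ["--add-host=host.docker.internal:host-gateway"]
}
```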
I tried exposing the ports in the Dockerfile:
```dockerfile
# Expose any ports if needed (e.g., if running a web server)
EXPOSE 80
EXPOSE 8000
EXPOSE 8080
EXPOSE 11434
```
These seem to add the ports to the container, and Docker is aware of them, but when I check the port status for the currently running container, I get this message:

```
Error: No public port '11434' published for 5ae41009199a
```
I also tried setting up the docker-compose.yaml file:
```yaml
services:
  my_service:
    image: 53794c7c792c  # Replace with your actual Docker image name
    ports:
      - "11434:11434"
      - "8000:8000"
      - "8080:8080"
      - "80:80"
```
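A compose variant I also considered (a sketch; `extra_hosts` with `host-gateway` is my assumption for reaching the host from inside the container, and the `sleep infinity` command is only a placeholder to keep the container alive while debugging):

```yaml
services:
  my_service:
    image: 53794c7c792c  # placeholder image id
    command: ["sleep", "infinity"]  # keep the container running while debugging
    extra_hosts:
      - "host.docker.internal:host-gateway"
```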
But there seems to be a problem with it: any container started from this file stops immediately.
I also tried stopping the Ollama model before running the container, so as not to create a port conflict, but that did not help either. Any suggestions are very welcome.
Thanks!