I want to run the phi3 model locally on my laptop using WSL, Ollama, Docker, and Open WebUI. Here are the steps I have followed (the approximate commands are included right after the list):
1. Installed WSL.
2. Installed Ollama.
3. Pulled the phi3 model using Ollama.
4. Installed Docker.
5. Installed Open WebUI.
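For reference, these are roughly the commands I used for the steps above. They are reconstructed from memory, so exact flags may differ slightly; the Open WebUI line is the standard `docker run` command from its README, and the container/volume name `open-webui` comes from there.

```bash
# 1. Install WSL (run from an elevated PowerShell prompt on Windows):
#    wsl --install

# 2. Install Ollama inside the WSL distribution
curl -fsSL https://ollama.com/install.sh | sh

# 3. Pull the phi3 model
ollama pull phi3

# 4. Docker: installed Docker Desktop on Windows with the WSL 2 backend enabled

# 5. Start Open WebUI as a container (command as given in the Open WebUI README)
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```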
I can chat with the model via the terminal. However, when I set up Open WebUI and try to select the model, it does not appear in the model list.
I have checked the following, but the model still does not show up in the list (roughly how I verified each point is shown after the list):
- The model phi3 is listed and works correctly in the terminal.
- Docker is running correctly.
- I have restarted the Docker and Open WebUI services.
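Concretely, this is roughly how I checked each of those points from the WSL terminal (the container name `open-webui` is taken from the run command above):

```bash
# phi3 is present and responds via Ollama
ollama list                       # phi3 appears in the output
ollama run phi3 "Say hello"       # the model answers normally

# Docker and the Open WebUI container are up
docker ps                         # the open-webui container is listed as running

# Restarting did not change anything
docker restart open-webui

# (I believe the Ollama API also answers locally, e.g. curl http://localhost:11434/api/tags)
```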
Checking the Open WebUI container logs, I found this:
```
2024-05-31 11:05:31 INFO:apps.ollama.main:get_all_models()
2024-05-31 11:05:31 ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
```
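For reference, I pulled those lines with something like the following (again assuming the container is named `open-webui`):

```bash
# Show the Open WebUI container logs and filter for the Ollama connection messages
docker logs open-webui 2>&1 | grep -i ollama
```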
How can I resolve this issue? Any help would be appreciated!