I've installed Ollama and can run against it locally with llama3. However, I cannot get this working in Google Colab with a local runtime. The following Python code runs just fine as a standalone Python script. But when I run a local Docker runtime and connect my Colab notebook to it, the same code fails.
```python
import ollama

MODEL = 'llama3'

def llm_response(content):
    response = ollama.chat(
        model=MODEL,
        messages=[
            {
                "role": "user",
                "content": content,
            },
        ],
    )
    txt = response["message"]["content"]
    return txt
```
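For what it's worth, I believe `ollama.chat` uses a module-level client that defaults to `http://127.0.0.1:11434`, and inside the Docker runtime `127.0.0.1` is the container itself, not my machine. Would pointing an explicit client at the host be the right approach? A minimal sketch, assuming the Ollama server is listening on the host's default port 11434 and that `host.docker.internal` resolves from inside the container (it does on Docker Desktop; on Linux it may need `--add-host=host.docker.internal:host-gateway`):

```python
import ollama

# Explicit client aimed at the host machine rather than the container's
# own loopback interface. The hostname and port are assumptions about
# this particular Docker setup.
client = ollama.Client(host="http://host.docker.internal:11434")

response = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "The cat sat on the mat"}],
)
print(response["message"]["content"])
```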
Calling `llm_response` in Colab raises:
```
ConnectError                              Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py in map_httpcore_exceptions()
     68     try:
---> 69         yield
     70     except Exception as exc:

20 frames

ConnectError: [Errno 111] Connection refused
```
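Errno 111 just means nothing answered at the address the client tried. To see what is actually reachable from inside the runtime, a quick probe helps (a sketch; Ollama answers `GET /` on its port with "Ollama is running", and `httpx` is already available since it appears in the traceback):

```python
import httpx

# Try candidate addresses from inside the Colab/Docker runtime. Both
# hostnames are assumptions about my setup and may not resolve.
for base in ("http://127.0.0.1:11434", "http://host.docker.internal:11434"):
    try:
        r = httpx.get(base, timeout=3)
        print(base, "->", r.status_code, r.text.strip())  # expect "Ollama is running"
    except Exception as exc:
        print(base, "->", type(exc).__name__, exc)
```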
For reference, the same error occurs even when just doing the following:
```python
import ollama

ollama.chat(model=MODEL, messages=[{"role": "user", "content": "The cat sat on the mat"}])
```
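If it matters, my understanding is that the module-level `ollama` functions also honor the `OLLAMA_HOST` environment variable, so setting it before the import should redirect `ollama.chat` itself. A sketch, with the host value again an assumption about the Docker setup:

```python
import os

# Assumption: ollama reads OLLAMA_HOST when its default client is created,
# so this must run before `import ollama` in a fresh kernel.
os.environ["OLLAMA_HOST"] = "http://host.docker.internal:11434"

import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "The cat sat on the mat"}],
)
print(response["message"]["content"])
```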