I am testing llama3 with the simple code below:
import ollama

message = "What is football"

# connect to the llama3 model
try:
    response_stream = ollama.chat(
        model="llama3",
        messages=[{'role': 'assistant', 'content': message}],
        stream=True
    )
    print("Connected to Llama3")
    for response in response_stream:
        print(f"Llama3: {response['content']}")
except Exception as e:
    print(f"Error connecting to Llama3: {e}")
I ran it, but I get the error below (llama3 is installed correctly):
Connected to Llama3
Error connecting to Llama3: 'content'
[Done] exited with code=0 in 1.104 seconds
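For reference, here is the access pattern I believe the streaming client expects: each streamed chunk nests the partial text under a 'message' dict rather than a top-level 'content' key, and the prompt is normally sent with role 'user'. This is a minimal sketch based on my reading of the ollama-python examples, not something I have verified:

import ollama

message = "What is football"

try:
    response_stream = ollama.chat(
        model="llama3",
        # the prompt is sent as the 'user' message (assumption: 'assistant' is reserved for model replies)
        messages=[{'role': 'user', 'content': message}],
        stream=True
    )
    print("Connected to Llama3")
    for chunk in response_stream:
        # assumption: each streamed chunk looks like {'message': {'role': 'assistant', 'content': '...'}, ...}
        print(chunk['message']['content'], end="", flush=True)
except Exception as e:
    print(f"Error connecting to Llama3: {e}")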