```python
# set up the LLM & embedding model
from llama_index.core import Settings, VectorStoreIndex
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

llm = Ollama(model="llama3")
embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-large-en-v1.5", trust_remote_code=True
)

# create an index over the loaded documents
Settings.embed_model = embed_model
index = VectorStoreIndex.from_documents(docs, show_progress=True)
```
```python
# create the query engine, where we use a Cohere reranker on the fetched nodes
Settings.llm = llm
query_engine = index.as_query_engine(streaming=True)
```
Then I run:

```shell
streamlit run app.py
```

When I send a chat message, this error appears:

```
HTTPStatusError: Client error '404 Not Found' for url 'http://localhost:11434/api/chat'
```
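For context, a 404 from `/api/chat` usually means the Ollama server is reachable but cannot serve the request, most often because the requested model has not been pulled locally. The commands below are a diagnostic sketch, assuming a default Ollama install listening on port 11434 and the model tag `llama3`:

```shell
# verify the Ollama server is up and list the models available locally
curl http://localhost:11434/api/tags

# pull the model if it is missing (the exact tag "llama3" is an assumption)
ollama pull llama3

# confirm the chat endpoint responds for that model
curl http://localhost:11434/api/chat \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hi"}], "stream": false}'
```

If `ollama pull` succeeds but the error persists, the Ollama version may be old enough that it predates the `/api/chat` endpoint, in which case upgrading Ollama is worth trying.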