OllamaFunctions: cannot import '_set_config_context' from 'langchain_core.runnables.config'
As a follow-up to the thread on using Ollama with with_structured_output() instead of OpenAI or Mistral: ollama_functions.py imports from langchain_core.tools, which in turn imports _set_config_context from langchain_core.runnables.config, and that import fails with the error above.
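A quick way to check whether this is a version mismatch is to test if the installed langchain-core actually exposes the helper that langchain_core.tools expects. The sketch below is a stdlib-only diagnostic (it assumes nothing about langchain beyond the module path in the error message; since _set_config_context is a private name, it can legitimately move or disappear between langchain-core releases):

```python
import importlib


def has_attr(module_name: str, attr: str):
    """Return True/False if the module imports and has/lacks `attr`,
    or None if the module itself is not installed."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return None
    return hasattr(module, attr)


# Check whether the installed langchain-core provides the private helper
# that langchain_core.tools (and thus ollama_functions.py) tries to import:
print(has_attr("langchain_core.runnables.config", "_set_config_context"))
```

If this prints False (module imports but the name is missing) or None (langchain-core not installed at all), the installed langchain-core version does not match what langchain-experimental's ollama_functions.py was written against, and aligning the two package versions is the likely fix.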