I want the LLM to select and actually execute its tools, then feed the output of one tool into other tools to arrive at a final answer. Is this possible? I want to use Ollama, Llama 3, and LangGraph.

I've seen a few videos about tool calling with Ollama, but in those the output is just a JSON object with the function name and parameters; the tool is never actually executed.
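To make the question concrete: the JSON I'm getting back is just the tool *call*, and what I'm missing is the loop that executes it and feeds the result back to the model. Here is a framework-free sketch of that missing loop (the tool names and the call shape are my own illustration, not from any particular library; with Ollama the `{"name": ..., "arguments": ...}` dict would come from the chat response instead of being hard-coded):

```python
import json

# Hypothetical example tools the model could call.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

def to_fahrenheit(celsius: float) -> float:
    return celsius * 9 / 5 + 32

TOOLS = {"get_weather": get_weather, "to_fahrenheit": to_fahrenheit}

def run_tool_call(call: dict) -> str:
    """Execute one tool call of the shape {"name": ..., "arguments": {...}}
    and serialize the result so it can be appended to the conversation
    as a tool message for the model's next turn."""
    fn = TOOLS[call["name"]]
    result = fn(**call["arguments"])
    return json.dumps({"tool": call["name"], "result": result})

# Pretend the model emitted this JSON instead of a final answer:
call = {"name": "get_weather", "arguments": {"city": "Paris"}}
print(run_tool_call(call))
```

Is there a built-in way in LangGraph to run this execute-and-feed-back loop for me, rather than writing it by hand like this?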