I'm running a pretty simple scenario with LangChain:
import { ChatOpenAI } from "@langchain/openai";
import { awaitAllCallbacks } from "@langchain/core/callbacks/promises";

const llm = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});
// ... tool definitions, then llmWithTools = llm.bindTools([...])
const res = await llmWithTools.invoke("What is 3 * 12");
console.log(res.tool_calls);
await awaitAllCallbacks();
But when I open this trace in LangSmith, it doesn't show the actual LLM calls.
What am I missing?