I'm using LangChain.js in my front-end application (I know I shouldn't do that, but we use Azure OAuth, so I'm fine). My application is in React, and whenever I try to use LangChain agents in the front-end, it always throws an error.
I found someone with a similar issue -> "LangChain simple llm.predict in Angular returns empty result"
but he is not using agents.
Example Code
export const GPTmodel = (model, temp) => {
  return new ChatOpenAI({
    model,
    temperature: temp,
    configuration: {
      baseURL: LLM_GATEWAY_BASE_URL,
      dangerouslyAllowBrowser: true,
      defaultHeaders: {
        blabla
      },
    },
    azureOpenAIBasePath: `ourpath`,
    azureOpenAIApiDeploymentName: model,
    azureOpenAIApiVersion: '2024-02-01',
    azureOpenAIApiKey: 'fake-key-we-use-auth',
  });
};
export const ProcessMiningTutor = async () => {
  const llm = GPTmodel(GPTModel.GPT4, 0);
  const promptTemplate = SystemMessagePromptTemplate.fromTemplate(`
    answer the user question in a funny and rap like way.
    The user question is: {input}
    also what is the value of foo?
  `);
  const prompt = ChatPromptTemplate.fromMessages([
    [ 'system', 'You are a helpful assistant.' ],
    promptTemplate,
    new MessagesPlaceholder('agent_scratchpad'),
  ]);
  const tools = [
    new DynamicTool({
      name: 'FOO',
      description: 'call this to get the value of foo.',
      // eslint-disable-next-line @typescript-eslint/require-await
      func: async () => 'baz',
    }),
  ];
  const agent = await createOpenAIFunctionsAgent({
    llm,
    prompt,
    tools,
  });
  return new AgentExecutor({
    agent,
    tools,
    verbose: true,
  }).withConfig({ runName: 'Agent' });
};
Then, when I call it:
const {
  isLoading, error,
} = useQuery(
  {
    // eslint-disable-next-line @stylistic/max-len
    queryKey: [ 'agentResponse', currentQuestion?.id, currentQuestion, currentQuestion?.agent, currentQuestion?.question, askedQuestions ],
    queryFn: async () => {
      if (!currentQuestion) {
        return;
      }
      console.log(currentQuestion);
      // I have multiple agents, so assume this returns the agent from above
      const agentName = currentQuestion.agent;
      const agentCreator = agentRegistry[agentName];
      const executor = (await agentCreator()) as AgentExecutor;
      // this thing ALWAYS fails.
      const result = executor.streamEvents({ input: currentQuestion.question }, { version: 'v1' });
      for await (const event of result) {
        const eventType = event.event;
        if (eventType === 'on_chain_start') {
          // Was assigned when creating the agent with `.withConfig({"runName": "Agent"})` above
          if (event.name === 'Agent') {
            console.log('\n-----');
            console.log(
              `Starting agent: ${event.name} with input: ${JSON.stringify(
                event.data.input,
              )}`,
            );
          }
        } else if (eventType === 'on_chain_end') {
          // Was assigned when creating the agent with `.withConfig({"runName": "Agent"})` above
          if (event.name === 'Agent') {
            console.log('\n-----');
            console.log(`Finished agent: ${event.name}\n`);
            console.log(`Agent output was: ${event.data.output}`);
            console.log('\n-----');
          }
        } else if (eventType === 'on_llm_stream') {
          const content = event.data?.chunk?.message?.content;
          // Empty content in the context of OpenAI means
          // that the model is asking for a tool to be invoked via function call.
          // So we only print non-empty content
          if (content !== undefined && content !== '') {
            console.log(`| ${content}`);
          }
        } else if (eventType === 'on_tool_start') {
          console.log('\n-----');
          console.log(
            `Starting tool: ${event.name} with inputs: ${event.data.input}`,
          );
        } else if (eventType === 'on_tool_end') {
          console.log('\n-----');
          console.log(`Finished tool: ${event.name}\n`);
          console.log(`Tool output was: ${event.data.output}`);
          console.log('\n-----');
        }
      }
    },
  },
);
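For reference, the branching on `event.event` in the loop above can be checked in isolation. This is a dependency-free sketch of that classification logic, using simplified mock event shapes (not the real LangChain types):

```typescript
// Mock of a stream event, simplified from the fields the loop above reads.
interface MockEvent {
  event: string;
  name?: string;
  data?: {
    input?: unknown;
    output?: unknown;
    chunk?: { message?: { content?: string } };
  };
}

// Mirrors the if/else chain above: returns the log line for an event,
// or undefined when the loop would print nothing.
function describeEvent(e: MockEvent): string | undefined {
  switch (e.event) {
    case 'on_chain_start':
      return e.name === 'Agent' ? `Starting agent: ${e.name}` : undefined;
    case 'on_chain_end':
      return e.name === 'Agent' ? `Finished agent: ${e.name}` : undefined;
    case 'on_llm_stream': {
      const content = e.data?.chunk?.message?.content;
      // Empty content means the model is requesting a tool call, so skip it.
      return content ? `| ${content}` : undefined;
    }
    case 'on_tool_start':
      return `Starting tool: ${e.name}`;
    case 'on_tool_end':
      return `Finished tool: ${e.name}, output: ${String(e.data?.output)}`;
    default:
      return undefined;
  }
}
```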
Description
I'm trying to use LangChain to create an agent with OpenAI functions/tools.
My application lives fully in the front-end, in React.
In my network tab I can see the response of the LLM is SUCCESSFUL, I CAN SEE THE CONTENT, but:
@tanstack_react-query.js?v=77745d11:2942 Uncaught TypeError: Cannot read properties of undefined (reading 'content')
at OpenAIFunctionsAgentOutputParser._baseMessageToString (chunk-NSISG4FD.js?v=77745d11:32:27)
at OpenAIFunctionsAgentOutputParser._callWithConfig (chunk-NSISG4FD.js?v=77745d11:54:22)
at OpenAIFunctionsAgentOutputParser._callWithConfig (chunk-2YXJZ3EI.js?v=77745d11:12447:28)
at async OpenAIFunctionsAgentOutputParser._streamIterator (chunk-2YXJZ3EI.js?v=77745d11:12392:5)
at async OpenAIFunctionsAgentOutputParser.transform (chunk-2YXJZ3EI.js?v=77745d11:12612:5)
at async _RunnableSequence._streamIterator (chunk-2YXJZ3EI.js?v=77745d11:13336:24)
Then, after enabling verbose: true on the agent:
Entering LLM run with input: {
"messages": [
[
{
"lc": 1,
"type": "constructor",
"id": [
"langchain_core",
"messages",
"SystemMessage"
],
"kwargs": {
"content": "You are a helpful assistant.",
"additional_kwargs": {},
"response_metadata": {}
}
},
{
"lc": 1,
"type": "constructor",
"id": [
"langchain_core",
"messages",
"SystemMessage"
],
"kwargs": {
"content": "n answer the user question in a funny and rap like way.
The user question is: what the dawg doing?
also what is the value of foo? ",
"additional_kwargs": {},
"response_metadata": {}
}
}
]
]
}
chunk-2YXJZ3EI.js?v=77745d11:5262 [llm/end] [1:chain:Agent > 2:chain:OpenAIFunctionsAgent > 7:llm:ChatOpenAI] [1.28s] Exiting LLM run with output: {
"generations": [
[
null
]
]
}
chunk-2YXJZ3EI.js?v=77745d11:5225 [chain/start] [1:chain:Agent > 2:chain:OpenAIFunctionsAgent > 8:parser:OpenAIFunctionsAgentOutputParser] Entering Chain run with input: {}
chunk-2YXJZ3EI.js?v=77745d11:5243 [chain/error] [1:chain:Agent > 2:chain:OpenAIFunctionsAgent > 8:parser:OpenAIFunctionsAgentOutputParser] [0ms] Chain run errored with error: "Cannot read properties of undefined (reading 'content')\n\nTypeError: Cannot read properties of undefined (reading 'content')\n    at OpenAIFunctionsAgentOutputParser._baseMessageToString (http://localhost:9000/node_modules/.vite/deps/chunk-NSISG4FD.js?v=77745d11:32:27)\n    at OpenAIFunctionsAgentOutputParser._callWithConfig (http://localhost:9000/node_modules/.vite/deps/chunk-NSISG4FD.js?v=77745d11:54:22)\n    at OpenAIFunctionsAgentOutputParser._callWithConfig (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:12447:28)\n    at async OpenAIFunctionsAgentOutputParser._streamIterator (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:12392:5)\n    at async OpenAIFunctionsAgentOutputParser.transform (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:12612:5)\n    at async _RunnableSequence._streamIterator (http://localhost:9000/node_modules/.vite/deps/chunk-2YXJZ3EI.js?v=77745d11:13336:24)"
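The `[[null]]` in the llm/end output lines up with the error message: the parser apparently gets a generation whose message is missing and then reads `.content` from it. A dependency-free sketch of that failure mode (the shapes here are my assumptions, not LangChain internals):

```typescript
// Simplified generation shape: the parser expects a message to be present.
interface MockGeneration {
  message?: { content: string };
}

// Reading .content when message is undefined throws the same
// "Cannot read properties of undefined (reading 'content')" TypeError
// that appears in the stack trace above.
function contentOf(gen: MockGeneration): string {
  return (gen.message as { content: string }).content;
}
```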
But I can see in the network tab that the response is there -_-
System Info
"@langchain/core": "^0.2.31",
"@langchain/openai": "^0.2.10",
"langchain": "^0.3.0",
platform: macOS
node version: Node.js v20.12.2
npm version: 10.8.2
I tried making a custom output parser; it didn't work.