I am using the LangChain batch call in TypeScript, where I define a human message and a system message, following the batch example on the official LangChain site (https://js.langchain.com/v0.1/docs/expression_language/interface/#batch):
const humanMessage: HumanMessage = new HumanMessage(`You are acting as a {item}`);
const systemMessage: SystemMessage = new SystemMessage(systemPrompt);
const promptTemplate = ChatPromptTemplate.fromMessages([
  humanMessage, systemMessage
]);
const chain = promptTemplate.pipe(model);
const response = await chain.batch(
  [{ item: "cat" }, { item: "dog" }],
  { maxConcurrency: 4 },
  { returnExceptions: true }
);
However, the item value is not being substituted before the prompt reaches the model. Any suggestions on how this should be done? I am new to LangChain and not sure whether this is the correct way to send a system message with the batch call either.
I have tried using ChatPromptTemplate to generate human and system message templates, but in the official docs the batch API uses only a single prompt. How can I include a system message in the batch call?
My use case is to pass a parameter with each batch input that should be substituted into the human message, while also sending a system message to the model.