I have managed to ask one query using LangChain like this:
template = """Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
Answer questions using the information in this document and be precise.
Provide only one number if asked for it."""
{context}
Question: {question}
Helpful Answer:"""
custom_rag_prompt = PromptTemplate.from_template(template)
# Build the RAG chain: retrieve context, fill the prompt, call the LLM, parse to string.
rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | custom_rag_prompt
    | llm
    | StrOutputParser()
)
rag_chain.invoke(query)
But now I would like to have something like
queries = ["tell me this?", "tell me that?"]
and ask those questions in parallel. Any idea how I could do this, please?
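For example, I imagine something along these lines, though I am not sure whether batch() is the right call here or whether it actually runs the requests in parallel rather than one after another:

# Guess on my part: batch() takes a list of inputs and (as I understand it)
# runs them concurrently; max_concurrency can apparently be set via config.
answers = rag_chain.batch(queries, config={"max_concurrency": 5})

for q, a in zip(queries, answers):
    print(q, "->", a)

If there is a better or more idiomatic way to fan out multiple questions over the same chain, I would appreciate a pointer.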