ValueError: Missing some input keys: {'chat_history'}
Additional Info:
I also received an error about the key `question` earlier. I added `output_key="result"` because of a "multiple output keys" error telling me to specify `output_key`; the chain printed a dictionary with two keys, from which I picked `result`.
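For context on that `output_key` error: when a chain returns more than one key, the memory cannot tell which value to save, so it asks you to name one. A minimal sketch (plain Python, dictionary contents hypothetical) of what that selection amounts to:

```python
# A chain with return_source_documents=True returns two keys;
# memory can only store one of them, chosen via output_key.
llm_response = {
    "result": "answer text",
    "source_documents": ["doc_a", "doc_b"],
}

output_key = "result"
saved_answer = llm_response[output_key]  # this is what memory would record
print(saved_answer)
```

This is why `result` is the value passed as `output_key` in the code below.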
```python
prompt_template = """
Use the following pieces of information to answer the user's question.
If you don't know the answer, just say that you don't know; don't try to make up an answer. Your answer should match the language of the question.
Chat History: {chat_history}
Context: {context}
Question: {question}
Answer the question and provide additional helpful information,
based on the pieces of information, if applicable. Be succinct.
Responses should be properly formatted to be easily read.
"""

prompt = PromptTemplate(
    template=prompt_template,
    input_variables=["context", "question", "chat_history"],
)

from langchain.chains.conversation.memory import ConversationBufferWindowMemory

window_memory = ConversationBufferWindowMemory(
    k=10,
    memory_key="chat_history",
    return_messages=True,
    output_key="result",
    input_key="question",
)

qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=compression_retriever,
    return_source_documents=True,
    chain_type_kwargs={"prompt": prompt, "verbose": False},
    memory=window_memory,
)

## Cite sources
def process_llm_response(llm_response):
    print(llm_response['result'])
    print('\n\nSources:')
    for source in llm_response["source_documents"]:
        print(source.metadata['source'])

%%time
user_q = "USER'S QUERY?"  # INPUT PROMPT HERE
query = f"""{user_q}"""
llm_resp = qa(query)
process_llm_response(llm_resp)
```