I’m working with LlamaIndex and want to implement a multishot (few-shot) prompting mechanism. My goal is to use multiple example prompts to refine and diversify the output for a given query. For example:
You are going to translate technical text from English into Chinese using technical Chinese in the translation.
I will provide three examples of a technical paragraph in English and its corresponding translation in Chinese. Follow a similar structure when translating a new paragraph.
Paragraph 1: English Text
Translation: Chinese Translation
Paragraph 2: English Text
Translation: Chinese Translation
Paragraph 3: English Text
Translation: Chinese Translation
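The structure above can be sketched in plain Python (independent of LlamaIndex) to show how the example pairs and the new paragraph are assembled into one few-shot prompt; the function name and argument names are illustrative, not part of any library:

```python
def build_fewshot_prompt(examples, new_paragraph):
    """Assemble a few-shot translation prompt from (english, chinese) pairs.

    `examples` is a list of (english_text, chinese_translation) tuples;
    `new_paragraph` is the English text to be translated.
    """
    lines = [
        "You are going to translate technical text from English into Chinese "
        "using technical Chinese in the translation."
    ]
    # Number each example pair, mirroring "Paragraph N: ... / Translation: ..."
    for i, (english, chinese) in enumerate(examples, start=1):
        lines.append(f"Paragraph {i}: {english}")
        lines.append(f"Translation: {chinese}")
    # End with the new paragraph and an open "Translation:" for the model to complete
    lines.append(f"New paragraph: {new_paragraph}")
    lines.append("Translation:")
    return "\n".join(lines)
```

The same pattern (numbered example pairs followed by an open-ended final turn) is what the chat-message loop below reproduces with `ChatMessage` objects.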
I tried it this way:
```python
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.prompts import ChatPromptTemplate

# Placeholder few-shot question/answer pairs
examples = [
    {"question": "question", "answer": "answer"},
    {"question": "question", "answer": "answer"},
    {"question": "question", "answer": "answer"},
    {"question": "question", "answer": "answer"},
    {"question": "question", "answer": "answer"},
    {"question": "question", "answer": "answer"},
]

text_qa_msgs = [
    ChatMessage(role=MessageRole.SYSTEM, content=system_message),
]
for example in examples:
    text_qa_msgs.append(
        ChatMessage(
            role=MessageRole.USER,
            # note the escaped "\n" (the original f-string had a bare "n")
            content=f"Q: {example['question']}\nA: {example['answer']}",
        )
    )
text_qa_msgs.append(
    ChatMessage(role=MessageRole.USER, content=f"Q: {user_question}\nA: ")
)
text_qa_template = ChatPromptTemplate(message_templates=text_qa_msgs)
```
Is it even possible to build this into LlamaIndex?