The code below works well when I connect to OpenAI, but I am trying to do the same with llama2 and am having trouble choosing the right embedding technique. In the code below I use OpenAIEmbeddings; I would like some help figuring out how this can be tweaked to run with a llama2 model.
from langchain_community.vectorstores import Chroma
from langchain_core.example_selectors import SemanticSimilarityExampleSelector
from langchain_core.prompts import FewShotChatMessagePromptTemplate
from langchain_openai import OpenAIEmbeddings
vectorstore = Chroma()
vectorstore.delete_collection()
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples,
    OpenAIEmbeddings(),
    vectorstore,
    k=2,
    input_keys=["input"],
)
example_selector.select_examples({"input": "what is the customer name tagged to order 3455"})
few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    example_selector=example_selector,
    input_variables=["input", "top_k"],
)
print(few_shot_prompt.format(input="what is the customer name tagged to order 3455"))