I have a chain in LangChain with two input parameters:
`input_text: str` → for the prompt template
`max_tokens: int` → for the LLM
How can I route these variables/parameters to the right part of the chain?
I tried this:

```python
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

prompt = PromptTemplate.from_template("write a joke about: {input_text}")
llm = ChatOpenAI(model_name="gpt-4o", max_tokens=1)
chain = prompt | llm

result = chain.invoke({"max_tokens": 100, "input_text": "programming!"})
print(result.content)
```
If you run this example, you get only one token as the answer, so the `max_tokens` passed to `invoke` never reaches the LLM. But I want it to respond with 100 tokens as the maximum.
Expectation:
*Why do programmers prefer dark mode?
Because light attracts bugs!*
My understanding was that there is a shared pool of variables for all parts of the chain, supplied by the user at invocation time, but maybe this assumption is wrong.
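To make my current mental model concrete, here is a toy sketch in plain Python of how I now think the `|` pipeline actually routes data. This is not LangChain's real internals; the `Toy*` class names are made up for illustration. The point is that the `invoke` dict only feeds the first step (the prompt), and the LLM's parameters are frozen at construction time:

```python
# Toy model of a prompt | llm pipeline -- NOT LangChain's real internals;
# all class and method names here are invented for illustration.

class ToyPrompt:
    def __init__(self, template: str):
        self.template = template

    def invoke(self, inputs: dict) -> str:
        # The prompt receives the whole input dict but only uses the keys
        # that appear in its template; extra keys are silently ignored.
        used = {k: v for k, v in inputs.items()
                if "{" + k + "}" in self.template}
        return self.template.format(**used)

    def __or__(self, other):
        return ToyChain(self, other)


class ToyLLM:
    def __init__(self, max_tokens: int):
        # Model kwargs are fixed when the LLM object is constructed...
        self.max_tokens = max_tokens

    def invoke(self, prompt_text: str) -> str:
        # ...so by the time the LLM runs it only sees the rendered prompt
        # string -- the original dict (and its "max_tokens" key) is gone.
        return f"[reply to '{prompt_text}' with max_tokens={self.max_tokens}]"


class ToyChain:
    def __init__(self, first, second):
        self.first, self.second = first, second

    def invoke(self, inputs: dict) -> str:
        # Each step's output becomes the next step's input.
        return self.second.invoke(self.first.invoke(inputs))


chain = ToyPrompt("write a joke about: {input_text}") | ToyLLM(max_tokens=1)
print(chain.invoke({"max_tokens": 100, "input_text": "programming!"}))
# The "max_tokens": 100 entry never reaches ToyLLM -- it still uses 1.
```

If this model is right, it would explain why my `invoke` call has no effect on the LLM: there is no shared pool, only a value flowing left to right through the pipe.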
I saw an answer, but I wanted to find an easier/other way of doing this.