I’m building a personalized chatbot with an AWS Bedrock Agent, and I was following this documentation to make it aware of previous information when starting a new conversation through the InvokeAgent API:
https://docs.aws.amazon.com/pt_br/bedrock/latest/userguide/agents-session-state.html
So when the user is about to chat with the LLM (Claude 3.5 Sonnet here), in their first request I pass sessionState with sessionAttributes in the call. In the sessionAttributes I’m sending the user’s name, age, and previous conversations they had, so the LLM can ‘know’ the user before generating the first answer, based on past interactions and database information.
Then I wrote this script:
import uuid

import boto3

# Runtime client for invoking Bedrock agents (this was missing from the original script)
agents_runtime_client = boto3.client("bedrock-agent-runtime")


def invoke_agent(agent_id, agent_alias_id, session_id, prompt):
    """
    Sends a prompt for the agent to process and respond to,
    passing user-specific context through sessionAttributes.
    """
    try:
        # Define session attributes to maintain user-specific context across interactions
        response = agents_runtime_client.invoke_agent(
            agentId=agent_id,
            agentAliasId=agent_alias_id,
            sessionId=session_id,
            sessionState={
                'sessionAttributes': {
                    'userName': 'Yoran',  # {user.name}
                    'userAge': '21',  # {year(today - user.birthdate)}
                    'summary': 'The user has talked a lot about Programming Languages like Python and Javascript'  # {user.summary_conversation}
                },
            },
            inputText=prompt,  # user_message
            enableTrace=False,
        )
        completion = ""
        for event in response.get("completion"):
            chunk = event["chunk"]
            completion += chunk["bytes"].decode()
    except Exception as e:
        print(f"Couldn't invoke agent. {str(e)}")
        raise
    return completion


# Define the chat parameters
agent_alias_id = "XXXXXXXX"
agent_id = "XXXXXXXX"
session_id = str(uuid.uuid4())
input_text = "What is my name??"

# Invoke the agent with sessionAttributes
response = invoke_agent(agent_id, agent_alias_id, session_id, input_text)
print(response)
So I’m passing userName as a parameter of sessionAttributes, but when I ask for my name in the first API call, the LLM returns:
I'm sorry, but I don't have your name in my records. Would you like to tell me your name?
Am I using it wrong? I read the entire documentation and thought this was the best way of giving the LLM a memory, but I just can’t get what I expect.
I tried using the parameters provided in the AWS Bedrock documentation for giving the LLM a memory of context / a long-term context window.
I was expecting that the LLM would be capable of reading its sessionAttributes.
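For reference, the fallback I’m considering is to inject the same user context directly into inputText on the first turn, since the model always sees the prompt text itself. This is just a sketch; the build_first_prompt helper and the exact context layout are my own invention, not anything from the Bedrock docs:

```python
def build_first_prompt(session_attributes, user_message):
    """Prepend the user context to the first message so the model actually sees it."""
    # Render each attribute as a "key: value" line
    context = "\n".join(f"{key}: {value}" for key, value in session_attributes.items())
    return f"Known user context:\n{context}\n\nUser message: {user_message}"


attrs = {
    "userName": "Yoran",
    "userAge": "21",
    "summary": "The user has talked a lot about Programming Languages like Python and Javascript",
}
# This string would then be passed as inputText in invoke_agent
print(build_first_prompt(attrs, "What is my name??"))
```

But that feels like working around sessionAttributes rather than using them as intended, which is why I’m asking.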