I call user_proxy.initiate_chat() in a Python script to test my first AutoGen app locally, but I'm getting the error below. I've set up the model server in LM Studio and started it locally as well.
Does anyone have an idea why this is happening? My code is as follows:
import autogen

config_list = [
    {
        "model": "TheBloke/Llama-2-7B-Chat-GGUF",
        "base_url": "http://127.0.0.1:1234/v1",
        # http://localhost:1234/v1
        "api_key": "NULL",
        # "api_type": "open_ai",
    }
]

llm_config = {
    # "request_timeout": 600,
    "seed": 42,
    "config_list": config_list,
    "temperature": 0,
}

assistant = autogen.AssistantAgent(
    name="assistant",
    system_message="You are a coder specializing in Python.",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="TERMINATE",
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config={"work_dir": "web", "use_docker": False},
    llm_config=llm_config,
    system_message="""Reply TERMINATE if the task has been solved at full satisfaction.
Otherwise, reply CONTINUE, or the reason why the task is not solved yet.""",
)

task = "Write a python method to output numbers from 1 to 100."

user_proxy.initiate_chat(
    assistant,
    message=task,
)
And the error message:
Write a python method to output numbers from 1 to 100.
--------------------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/fengluepu/Desktop/autogen-main/AutoGenApp.py", line 39, in <module>
user_proxy.initiate_chat(
File "/Users/fengluepu/Desktop/autogen-main/autogen/agentchat/conversable_agent.py", line 1018, in initiate_chat
self.send(msg2send, recipient, silent=silent)
File "/Users/fengluepu/Desktop/autogen-main/autogen/agentchat/conversable_agent.py",
...
File "/Users/fengluepu/Desktop/autogen-main/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1079, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/Users/fengluepu/Desktop/autogen-main/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1046, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502