I am currently trying to stream a response with OpenAI's new Assistants API update. I followed the official streaming documentation, but I can't get it to work. I keep getting this error:
File "/home/vscode/.local/lib/python3.12/site-packages/openai/lib/streaming/_assistants.py", line 404, in __stream__
for event in stream:
TypeError: 'AsyncStream' object is not iterable
Here is my API endpoint (client is an AsyncOpenAI instance, created elsewhere):
@router.post("/new")
async def post_new(request: Request):
    """Start a new chat."""
    ...
    thread = await client.beta.threads.create()
    await client.beta.threads.messages.create(
        thread_id=thread.id,
        content=content_start,
        role="user",
        metadata={"type": "hidden"},
    )
    # create the streamed thread run here
    async with client.beta.threads.runs.stream(
        thread_id=thread.id,
        assistant_id=request.state.assistant_id,
        event_handler=EventHandler(),
    ) as stream:
        print("\n\nStream started")  # this is printed
        stream.until_done()  # calling this fails
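In case the handler is relevant: my EventHandler is essentially the synchronous one from the documentation (a simplified sketch; the real handler does more than print):

from typing_extensions import override
from openai import AssistantEventHandler

class EventHandler(AssistantEventHandler):
    # Print assistant output as it streams in.
    @override
    def on_text_created(self, text) -> None:
        print("\nassistant > ", end="", flush=True)

    @override
    def on_text_delta(self, delta, snapshot) -> None:
        print(delta.value, end="", flush=True)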
The only difference from the documentation is that I use async with on client.beta.threads.runs.stream(); with a plain with I run into this error instead:
File "/workspace/backend/routers/chat.py", line 50, in post_new
with client.beta.threads.runs.stream(
TypeError: 'AsyncAssistantStreamManager' object does not support the context manager protocol
I'm using Python 3.12 with openai version 1.25.1 (I've also tried older versions of both) and FastAPI for my backend.
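For completeness, this is what I would expect the fully async variant to look like, based on my reading of the SDK (using AsyncAssistantEventHandler and awaiting until_done() are my guesses, not something I found in the docs):

from openai import AsyncOpenAI, AsyncAssistantEventHandler

client = AsyncOpenAI()

class AsyncEventHandler(AsyncAssistantEventHandler):
    # Async counterpart of my handler above; note the async method.
    async def on_text_delta(self, delta, snapshot) -> None:
        print(delta.value, end="", flush=True)

async def run_stream(thread_id: str, assistant_id: str) -> None:
    # The async stream manager supports "async with", and until_done()
    # is a coroutine on the async handler, so it has to be awaited.
    async with client.beta.threads.runs.stream(
        thread_id=thread_id,
        assistant_id=assistant_id,
        event_handler=AsyncEventHandler(),
    ) as stream:
        await stream.until_done()

Is that the intended pattern here, or is something else going wrong?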