How to stream an LLM response from FastAPI to React?
I want to stream an LLM (Ollama) response using FastAPI and React. I can successfully get an answer from the LLM without streaming, but when I try to stream it, I get an error in React. The LLM answer streams successfully on the server side: when I print each chunk in FastAPI, everything arrives as expected.
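
To make the setup concrete, here is a stripped-down sketch of the server side. It is not my exact code: the `/chat` route, the `llama3` model name, and the `{"token": ...}` payload shape are all placeholders.

```python
import json

import ollama  # official `ollama` Python client, assumed here
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()

# fetch-event-source runs in the browser, so the React dev origin
# must be allowed through CORS (placeholder origin below).
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],
    allow_methods=["*"],
    allow_headers=["*"],
)


class ChatRequest(BaseModel):
    prompt: str


@app.post("/chat")  # placeholder route
def chat(req: ChatRequest):
    def event_stream():
        # ollama.chat(..., stream=True) yields chunks as they are generated.
        for chunk in ollama.chat(
            model="llama3",  # placeholder model name
            messages=[{"role": "user", "content": req.prompt}],
            stream=True,
        ):
            # SSE framing: each event is "data: <payload>\n\n".
            yield f"data: {json.dumps({'token': chunk['message']['content']})}\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```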
On the React side, I'm using @microsoft/fetch-event-source to consume the response stream.
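
And a simplified sketch of the client call, using the same placeholder URL and payload shape as the server above:

```typescript
import { fetchEventSource } from "@microsoft/fetch-event-source";

// Placeholder URL; must match the FastAPI route above.
const CHAT_URL = "http://localhost:8000/chat";

export async function streamChat(
  prompt: string,
  onToken: (token: string) => void,
): Promise<void> {
  await fetchEventSource(CHAT_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
    // By default the library aborts the request when the tab is hidden;
    // opting out avoids one source of surprise errors while testing.
    openWhenHidden: true,
    onmessage(ev) {
      const { token } = JSON.parse(ev.data);
      onToken(token); // e.g. append to a useState<string> value
    },
    onerror(err) {
      // Rethrowing stops the library's automatic retry loop.
      throw err;
    },
  });
}
```

One detail from the library's docs that seems relevant: its default `onopen` handler rejects any response whose `Content-Type` is not `text/event-stream`, so the FastAPI endpoint has to send both that media type and the `data: ...\n\n` framing for `onmessage` to fire at all.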