langchain4j and Ollama – chat does not work because of uppercased role value
I am using Ollama v0.2.3 on Windows with tinyllama, locally installed, and langchain4j v0.32.0.
I followed a very simple example of sending a chat query to Ollama.
To my surprise I got back a random response that had nothing to do with my query. After some debugging it turned out that the Java code was sending the JSON request to /api/chat with "role":"USER", but the endpoint only works with "role":"user". In other words, the request only succeeds when the role value is lowercase.
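To make the difference concrete, here is a minimal sketch of the payload I believe should be sent (the `Role` enum, `toWireRole` helper, and the hand-built JSON string are my own illustration, not langchain4j's actual serializer; model name and prompt are placeholders):

```java
import java.util.Locale;

public class OllamaRoleDemo {
    // Roles as uppercase enum constants, as a Java client library would define them
    enum Role { SYSTEM, USER, ASSISTANT }

    // Ollama's /api/chat expects lowercase role names ("user", not "USER"),
    // so the enum name must be lowercased when building the JSON payload.
    static String toWireRole(Role role) {
        return role.name().toLowerCase(Locale.ROOT);
    }

    // Minimal request body for POST http://localhost:11434/api/chat
    static String chatPayload(Role role, String content) {
        return "{\"model\":\"tinyllama\",\"stream\":false,"
             + "\"messages\":[{\"role\":\"" + toWireRole(role)
             + "\",\"content\":\"" + content + "\"}]}";
    }

    public static void main(String[] args) {
        // Prints a request body containing "role":"user", which Ollama accepts;
        // serializing the raw enum name would yield "role":"USER", which it does not.
        System.out.println(chatPayload(Role.USER, "Why is the sky blue?"));
    }
}
```

Sending a body built this way (with the lowercase role) produces a sensible answer, whereas the uppercase variant yields the unrelated response described above.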