I am using Ollama v0.2.3, installed locally on Windows, with tinyllama, and langchain4j v0.32.0.
I followed a very simple example of sending a chat query to Ollama.
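For context, this is roughly the kind of call I am making, a minimal sketch assuming the standard builder from the langchain4j Ollama module (the base URL and model name reflect my local setup):

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class OllamaChatDemo {

    public static void main(String[] args) {
        // Points at the locally running Ollama instance and the tinyllama model
        ChatLanguageModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("tinyllama")
                .build();

        // Sends a single user message and prints the text response
        String answer = model.generate("Provide three short bullet points explaining why Java is awesome");
        System.out.println(answer);
    }
}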
To my surprise I got back a random response that had nothing to do with my query. After some debugging it turned out that the Java code was sending the JSON request to /api/chat with "role":"USER", whereas the endpoint only behaves correctly with "role":"user". In other words:
curl http://localhost:11434/api/chat -d "{\"model\":\"tinyllama\",\"messages\":[{\"role\":\"user\",\"content\":\"Provide three short bullet points explaining why Java is awesome\"}],\"options\":{},\"stream\":false}"
This works fine.
curl http://localhost:11434/api/chat -d "{\"model\":\"tinyllama\",\"messages\":[{\"role\":\"USER\",\"content\":\"Provide three short bullet points explaining why Java is awesome\"}],\"options\":{},\"stream\":false}"
This gives a random response.
The issue appears to be in the JSON serialization, which uses the names of the Role enum constants, and those are all uppercase.
Since I am only toying around, I replaced that enum class with a patched version that carries explicit serialization instructions to force the role values to lowercase:
package dev.langchain4j.model.ollama;

import com.google.gson.annotations.SerializedName;

enum Role {

    @SerializedName("system") // added to make it work
    SYSTEM,

    @SerializedName("user") // added to make it work
    USER,

    @SerializedName("assistant") // added to make it work
    ASSISTANT
}
Overall, however, serializing the role as uppercase seems wrong. Should Ollama accept e.g. USER (instead of user), or should langchain4j send the role value in lowercase? Is this a bug in Ollama or langchain4j, or am I missing some configuration?