Goal
I’m trying to use Gemini via the API to help me write a story for a game. I’m asking it to write one small piece at a time: first some journal pages showing a character’s backstory, then the character’s intro, then a variety of quests the player can send that character on to flesh them out further. This should let me create many characters quickly, which I could never accomplish writing everything by hand.
Obviously this will take some proofreading and fixing to make it work, and I’ll need to add a human touch to the story elements, but it would be very helpful to have Gemini do the heavy lifting in setting the scene, so to speak.
Process
From reading the docs, it looks like the appropriate way to have it use previous responses as context for the next request is to pass all previous turns back in the contents array of the JSON request body, as described at https://ai.google.dev/gemini-api/docs/api-overview#chat . My code does this, and the payload quickly begins to look like this:
{
  "contents": [
    {
      "role": "user",
      "parts": [
        {
          "text": "\n\nI am writing a...To start, write a 5 paragraph short story detailing how the player meets Zied using a second person point of view addressing the player as \"You\".\n\n"
        }
      ]
    },
    {
      "role": "model",
      "parts": [
        {
          "text": "%Gemini response%"
        }
      ]
    },
    {
      "role": "user",
      "parts": [
        {
          "text": "\n\n\nZied wrote a journal before they were found by the player. This journal will provide backstory and clues to Zied's location where the player eventually meets them. The journal has ten entries, which all work together to tell a story.\n\nGenerate page 1 of the journal.\n\n"
        }
      ]
    },
    {
      "role": "model",
      "parts": [
        {
          "text": "%Gemini response%"
        }
      ]
    },
    {
      "role": "user",
      "parts": [
        {
          "text": "\n\n\nGenerate page 2 of the journal.\n\n"
        }
      ]
    },
    {
      "role": "model",
      "parts": [
        {
          "text": "%Gemini response%"
        }
      ]
    },
    {
      "role": "user",
      "parts": [
        {
          "text": "\n\n\nGenerate page 3 of the journal.\n\n"
        }
      ]
    },
    {
      "role": "model",
      "parts": [
        {
          "text": "%Gemini response%"
        }
      ]
    },
    {
      "role": "user",
      "parts": [
        {
          "text": "\n\n\nGenerate page 4 of the journal.\n\n"
        }
      ]
    }
  ]
}
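For reference, the loop that builds and sends this payload looks roughly like the sketch below. This is simplified for the question and is not my exact code; the endpoint and model name are the ones from the REST docs, and the key handling is just a placeholder.

import os
import requests

# Sketch only: the entire conversation history is re-sent with every request.
API_URL = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    "gemini-pro:generateContent?key=" + os.environ["GEMINI_API_KEY"]
)

history = []  # alternating user/model turns, exactly as in the payload above

def ask(prompt):
    # Append the new user turn, send the whole history, store and return the reply.
    history.append({"role": "user", "parts": [{"text": prompt}]})
    response = requests.post(API_URL, json={"contents": history})
    response.raise_for_status()
    reply = response.json()["candidates"][0]["content"]["parts"][0]["text"]
    history.append({"role": "model", "parts": [{"text": reply}]})
    return reply

ask("I am writing a ... short story detailing how the player meets Zied ...")
ask("Zied wrote a journal ... Generate page 1 of the journal.")
for page in range(2, 11):
    ask(f"Generate page {page} of the journal.")

Because every call re-sends all earlier turns, each request body is strictly larger than the last.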
This actually works reasonably well (aside from some plot loops that never resolve), but the larger issue is that the payload grows very large, very quickly.
The Problem
Eventually, my code fails with Gemini responding:
The maximum length of cell contents (text) is 32767 characters
This isn’t surprising, but it makes me feel like there must be a better way to do this. When using the ChatGPT web interface, I can ask about responses it gave 50 questions ago and it answers coherently, which suggests it takes the entire conversation into account. It seems like Gemini should be able to do the same.
Is there a different way I’m supposed to be structuring these calls to get the effect I want? Or is Gemini simply unable to do what I’m asking?