I’m designing a system that stores customer telemetry data on a daily basis. My item id format is “TenantId:date”, e.g. “tenant1:2024-09-01”, and each item is a large JSON blob containing all of that day’s telemetry.
However, Cosmos DB enforces a 2 MB size limit per item, and for some tenants a day’s record can be as large as 5 MB. Is there an efficient way to work around this limit (e.g. splitting the data across multiple items, pagination, or some other technique)?
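To make the “split across multiple items” idea concrete, here is a minimal sketch of client-side chunking. It assumes a hypothetical id convention of `TenantId:date:chunkN` and a `partitionKey` field; the exact field names and the 1.9 MB headroom are my assumptions, not anything Cosmos DB mandates:

```python
import json

MAX_CHUNK_BYTES = 1_900_000  # assumed headroom under Cosmos DB's 2 MB item limit


def split_payload(tenant_id: str, date: str, payload: dict) -> list[dict]:
    """Split one day's telemetry blob into multiple sub-2MB items.

    Ids follow a hypothetical "TenantId:date:chunkN" convention so all
    chunks for a day share the same partition key value.
    """
    raw = json.dumps(payload)
    chunks = [raw[i:i + MAX_CHUNK_BYTES]
              for i in range(0, len(raw), MAX_CHUNK_BYTES)]
    return [
        {
            "id": f"{tenant_id}:{date}:chunk{n}",
            "partitionKey": f"{tenant_id}:{date}",  # hypothetical field name
            "chunkIndex": n,
            "chunkCount": len(chunks),
            "data": chunk,
        }
        for n, chunk in enumerate(chunks)
    ]


def reassemble(items: list[dict]) -> dict:
    """Re-join the chunk items (e.g. from one single-partition query)."""
    ordered = sorted(items, key=lambda it: it["chunkIndex"])
    return json.loads("".join(it["data"] for it in ordered))
```

Reads then become a single-partition query for all items whose partition key is `"tenant1:2024-09-01"`, followed by client-side reassembly; writes are multiple upserts (not atomic across items unless you use a transactional batch within the partition).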
I’m looking for an in-place solution for better performance. I have seen other posts where people storing hundreds of MB per record were advised to use Blob Storage or similar, and I completely understand that advice. But my objects only barely exceed the limit (5 MB at most), so I was hoping it could be done in place.
Thanks!
When I tried to store these records in my test Cosmos DB instance, I got a “Request size too large” error.