I am working on a multiplayer game in which I have to store a large number of coordinate and color records: {type, x1, y1, x2, y2, r1, g1, b1, r2, g2, b2, theta}. These are inserted continuously at very high frequency (ideally about 1000 inserts per second, but 400 to 600 per second would be fine to start). All the values are unsigned integers: one 4-bit field, five 16-bit fields, and six 8-bit fields.
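Roughly what I have in mind for the binary layout (this is only a sketch; the field widths are assigned on my assumption that type is the 4-bit field, the coordinates and theta are the 16-bit fields, and the color components are the 8-bit fields; storing the 4-bit type in a full byte for simplicity gives 17 bytes per record):

```javascript
// Sketch: pack one record into a fixed-size binary layout.
// Assumed layout (17 bytes): 1 byte for type (4-bit value stored in a
// full byte), five uint16 fields (x1, y1, x2, y2, theta), and six uint8
// fields (r1, g1, b1, r2, g2, b2).
const RECORD_SIZE = 17;

function packRecord(rec, buf, offset = 0) {
  buf.writeUInt8(rec.type, offset);
  buf.writeUInt16LE(rec.x1, offset + 1);
  buf.writeUInt16LE(rec.y1, offset + 3);
  buf.writeUInt16LE(rec.x2, offset + 5);
  buf.writeUInt16LE(rec.y2, offset + 7);
  buf.writeUInt16LE(rec.theta, offset + 9);
  buf.writeUInt8(rec.r1, offset + 11);
  buf.writeUInt8(rec.g1, offset + 12);
  buf.writeUInt8(rec.b1, offset + 13);
  buf.writeUInt8(rec.r2, offset + 14);
  buf.writeUInt8(rec.g2, offset + 15);
  buf.writeUInt8(rec.b2, offset + 16);
  return buf;
}

function unpackRecord(buf, offset = 0) {
  return {
    type: buf.readUInt8(offset),
    x1: buf.readUInt16LE(offset + 1),
    y1: buf.readUInt16LE(offset + 3),
    x2: buf.readUInt16LE(offset + 5),
    y2: buf.readUInt16LE(offset + 7),
    theta: buf.readUInt16LE(offset + 9),
    r1: buf.readUInt8(offset + 11),
    g1: buf.readUInt8(offset + 12),
    b1: buf.readUInt8(offset + 13),
    r2: buf.readUInt8(offset + 14),
    g2: buf.readUInt8(offset + 15),
    b2: buf.readUInt8(offset + 16),
  };
}
```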
Ideally, at 1000 inserts per second, 5 million records should be more than enough, so once that threshold is reached I want the oldest records to start getting dropped to make room for new ones. This effectively forms a queue: records are added at the top and dropped from the bottom, in order of insertion.
These are the stats for each global room, and for now 3 to 5 global rooms should suffice.
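Each room's queue could be sketched as a ring buffer over one preallocated Buffer, where new records overwrite the oldest once capacity is reached (RecordRing is my own name, and the 17-byte record size is an assumption about the packed layout):

```javascript
// Sketch: fixed-capacity drop-oldest queue as a ring buffer over one
// preallocated Buffer. One instance per room; capacity would be 5 million
// in practice.
const RECORD_SIZE = 17; // assumed packed record size in bytes

class RecordRing {
  constructor(capacity) {
    this.capacity = capacity;
    this.buf = Buffer.alloc(capacity * RECORD_SIZE);
    this.head = 0;  // index of the next slot to write
    this.count = 0; // number of live records (<= capacity)
  }

  // recordBytes: a Buffer of RECORD_SIZE bytes. Overwrites the oldest
  // record once the ring is full.
  push(recordBytes) {
    recordBytes.copy(this.buf, this.head * RECORD_SIZE);
    this.head = (this.head + 1) % this.capacity;
    if (this.count < this.capacity) this.count++;
  }

  // Yields live records in insertion order, oldest first.
  *records() {
    const start = (this.head - this.count + this.capacity) % this.capacity;
    for (let i = 0; i < this.count; i++) {
      const off = ((start + i) % this.capacity) * RECORD_SIZE;
      yield this.buf.subarray(off, off + RECORD_SIZE);
    }
  }
}
```

The nice part is that inserting is just a memory copy plus an index bump, so there is no per-record allocation at 1000 inserts per second, and the whole room snapshot is already one contiguous Buffer.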
So for 5 million records the size works out to roughly 80 MB (about 16.5 bytes per record); rounding up, call it 100 MB. Whenever a new player joins a room, they will also have to download this ~100 MB of data. But the download should happen behind the scenes, asynchronously, without pausing the main thread.
Please suggest the best approach to tackle this. I have a Node.js server.
Currently I am writing all the coordinates into .txt files on the server itself, separated by " " and "\n", and reading from them incrementally using streams. Obviously that is not a good approach, doesn't scale well, and is probably my main bottleneck right now.
Thank you for reading.