My project needs to handle large amounts of streaming data server-side, received from a software-defined radio (SDR) source. The data rate depends on the sample rate of the underlying radio system; I expect it to regularly sit around 1 GB/s, but it could exceed 10 GB/s in extreme cases. Obviously, I cannot practically process this data in memory, so I have been searching for an external database system to keep track of it.
For context on my application: the most important requirements for any data system I use are continuity and data integrity. I will regularly discard data from the database, because a historical archive is not necessary and keeping a few seconds or minutes of data (as a short-term cache) is enough for my purposes. However, it is imperative that the database I pick can absorb this volume of data without any loss, because discontinuities in radio data (especially undetected ones) make analysis much harder.
Postgres seemed like a promising option to me because of its Large Object datatype and the ability to stream those objects back out, which is exactly what I need for recall. On the server, I have implemented a double-buffer intermediate cache that flushes data to the database whenever a buffer fills to a certain amount of local memory (a sketch of the flush path is below). Please advise on whether Postgres is a sensible choice for this volume of data, and if not, suggest a more suitable alternative.
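To make the setup concrete, here is a minimal sketch of the flush side of the double buffer as I currently picture it, using psycopg2 large objects. The DSN, BUFFER_SIZE, and the ingest callback on_buffer_full are placeholders, not my real code; the real ingest thread fills one buffer while the other is being flushed.

```python
# Sketch only: flush filled buffers into Postgres large objects via psycopg2.
# DSN, BUFFER_SIZE, and on_buffer_full() are placeholders for illustration.
import queue
import threading
import psycopg2

BUFFER_SIZE = 256 * 1024 * 1024            # placeholder: 256 MiB per buffer
flush_queue: "queue.Queue[bytes]" = queue.Queue(maxsize=2)
flushed_oids: list = []                    # OIDs needed later to stream data back

def flusher() -> None:
    """Drain filled buffers into new Postgres large objects, one per buffer."""
    conn = psycopg2.connect("dbname=sdr")  # placeholder DSN
    while True:
        buf = flush_queue.get()
        lobj = conn.lobject(0, "wb")       # oid=0 lets the server assign one
        lobj.write(buf)
        oid = lobj.oid
        lobj.close()
        conn.commit()                      # large objects live inside transactions
        flushed_oids.append(oid)

threading.Thread(target=flusher, daemon=True).start()

def on_buffer_full(filled: bytes) -> None:
    # Ingest side hands off the full buffer and keeps sampling into the other
    # one. put() blocks if the flusher falls behind, which is exactly where
    # data loss would otherwise start at these rates.
    flush_queue.put(filled)
```

My worry is whether the write path in that sketch can keep up at 1 GB/s or more, which leads to the question below.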
TL;DR: Can Postgres handle bulk writes of 1 GB (or more) every second without data loss?