Pythonic parallelized streaming of JSON batches (lists of dicts) into a single file
I have a multiprocessing thread pool where each job returns a requested batch of JSON records (a list of dicts). I want to write all results to a single file, but without holding the full result in memory as one list, due to RAM constraints: the full result is about 1.5 million records totaling roughly 1.5 GB.