Go – how to read a large file in chunks, process them concurrently, and aggregate the results in order
I have a very large CSV file that won't fit entirely into memory. I want to read the file in chunks, run each chunk through a series of processing steps concurrently, and finally aggregate the results while preserving the original order of the data.