We are trying to create a zip of all the files related to a particular environment (hundreds of thousands of files, averaging about 100 KB each). The resulting zip would normally be around 40-50 GB, and larger in exceptional cases. These files are stored in Azure Blob Storage.
We are creating an endpoint for this so that the task can run in the background, since it will take a long time to complete.
The hosted service will have a maximum of 2 GB of RAM, and that same 2 GB is shared with the file system (temp storage) as well.
I tried creating an endpoint that performs the following flow (a rough sketch follows the list):
- Get the number of related files in blob storage.
- Iterate through the files one by one.
- Download each file from the blob container.
- Add (append) the downloaded file to a zip saved in a temp folder.
- Add its metadata to a CSV file.
- Upload the zip and the CSV file to a public blob container.
- Delete the files from the temp folder.
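Roughly, the current attempt looks like this. It is a minimal sketch; the environment variable, container names, prefix, and output file names are placeholders I made up:

```python
import csv
import os
import tempfile
import zipfile

from azure.storage.blob import ContainerClient

# Placeholder connection string and container names.
conn = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
source = ContainerClient.from_connection_string(conn, "environment-files")
public = ContainerClient.from_connection_string(conn, "public-exports")

def export_environment(prefix: str) -> None:
    tmp = tempfile.mkdtemp()
    zip_path = os.path.join(tmp, "export.zip")
    csv_path = os.path.join(tmp, "metadata.csv")

    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf, \
         open(csv_path, "w", newline="") as meta:
        writer = csv.writer(meta)
        writer.writerow(["name", "size", "last_modified"])
        # Iterate over every blob under the environment's prefix, one by one.
        for blob in source.list_blobs(name_starts_with=prefix):
            data = source.download_blob(blob.name).readall()  # ~100 KB each
            zf.writestr(blob.name, data)  # append into the on-disk zip
            writer.writerow([blob.name, blob.size, blob.last_modified])

    # Upload the finished zip and CSV to the public container, then clean up.
    with open(zip_path, "rb") as f:
        public.upload_blob("export.zip", f, overwrite=True)
    with open(csv_path, "rb") as f:
        public.upload_blob("metadata.csv", f, overwrite=True)
    os.remove(zip_path)
    os.remove(csv_path)
```

This works for small environments, but the finished zip (40-50 GB) can never fit in the 2 GB of temp storage, which is why I am looking at streaming.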
How can we achieve this task? Can we do it with streaming?
My doubt about the streaming process: if we create a zip file in an Azure blob and append files to it dynamically, will it be able to grow bigger than 2 GB?
I think that while appending, the service will download the zip file into memory and then upload it back to the cloud, so creating a file bigger than 2 GB would be impossible. Let me know if I am assuming wrong.
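To make the question concrete, below is roughly what I am imagining for streaming, based on my (possibly wrong) understanding that an append blob accepts blocks of up to 4 MiB without re-downloading what has already been uploaded. The `AppendBlobStream` wrapper, container names, and prefix are all made up for illustration:

```python
import io
import os
import zipfile

from azure.storage.blob import BlobClient, ContainerClient

BLOCK = 4 * 1024 * 1024  # append blob blocks max out at 4 MiB

class AppendBlobStream(io.RawIOBase):
    """File-like object that forwards written bytes to an Azure append blob."""
    def __init__(self, blob: BlobClient):
        self._blob = blob
        self._blob.create_append_blob()
        self._buf = bytearray()

    def writable(self):
        return True

    def write(self, data):
        self._buf += data
        while len(self._buf) >= BLOCK:  # flush full 4 MiB blocks
            self._blob.append_block(bytes(self._buf[:BLOCK]))
            del self._buf[:BLOCK]
        return len(data)

    def close(self):
        if self._buf:  # flush the final partial block
            self._blob.append_block(bytes(self._buf))
            self._buf.clear()
        super().close()

conn = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
source = ContainerClient.from_connection_string(conn, "environment-files")
public = ContainerClient.from_connection_string(conn, "public-exports")

out = AppendBlobStream(public.get_blob_client("export.zip"))
# zipfile can write to a non-seekable stream, so the zip is produced
# incrementally and never exists in full on the 2 GB host.
with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
    for blob in source.list_blobs(name_starts_with="environment-1/"):
        zf.writestr(blob.name, source.download_blob(blob.name).readall())
out.close()
```

Is this how streaming would work, or will the SDK still pull the whole blob back into memory at some point?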