Is it possible to preload cache slices in parallel? I think this might be possible with a custom Lua module, but I don't have any experience with that.
Context: an nginx reverse proxy is set up close to users but very far from the origin, and I want to use it as a cache for that origin. The origin serves large files (multiple GBs per file). The slice module works, but it downloads one chunk at a time. Because of the high latency (>200 ms), throughput is limited when only a single chunk is in flight at any given time. I would like to configure nginx to preload a configurable number of byte ranges ahead of the client, cache them, and keep going until the entire file is cached. The origin is on a dedicated gigabit connection and has enough leeway to handle parallel streams.
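For reference, this is roughly the slice setup I have now (hostnames, paths, and zone names are placeholders); each slice is fetched sequentially by its own subrequest:

```nginx
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=origin_cache:10m
                 max_size=50g inactive=30d;

server {
    listen 80;

    location / {
        slice              1m;                      # chunk size per subrequest
        proxy_cache        origin_cache;
        proxy_cache_key    $uri$is_args$args$slice_range;
        proxy_set_header   Range $slice_range;      # forward the slice's byte range
        proxy_cache_valid  200 206 30d;             # cache partial (206) responses too
        proxy_pass         https://origin.example.com;
    }
}
```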
I'm willing to use something other than nginx as long as it meets the requirement of parallel chunk downloads.
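Whatever tool ends up doing the prefetching, the byte-range arithmetic is simple. A sketch in Python of how a prefetcher could compute the next few slice-aligned `Range` headers ahead of the client's current position (all names here are hypothetical, and the slice size must match the one configured on the proxy):

```python
def slice_ranges(file_size, slice_size=1 << 20, start=0, count=None):
    """Yield 'bytes=a-b' Range header values aligned to slice boundaries.

    start  - any byte offset; it is rounded down to the enclosing slice,
             mirroring how nginx's slice module aligns subrequests.
    count  - how many slices to prefetch ahead (None = until end of file).
    """
    offset = (start // slice_size) * slice_size  # align to slice boundary
    emitted = 0
    while offset < file_size:
        if count is not None and emitted >= count:
            break
        end = min(offset + slice_size, file_size) - 1  # inclusive last byte
        yield f"bytes={offset}-{end}"
        offset += slice_size
        emitted += 1
```

A prefetcher would issue these ranges as concurrent requests against the proxy itself, so each response lands in the cache under its own slice key.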
Any ideas?