This seems to be a very common problem when working with remote-sensing data,
so I am fairly sure there is a best practice out there that I am simply not aware of.
My input is a time series, i.e. a stack of images.
I want to run an interpolation along the time axis of the entire stack and
output more images (dates) than I put in.
I am at the stage where I need to allocate a lot of dummy layers for all the output dates I want to create by interpolation.
Already at this step I am running into a MemoryError.
This is purely because I am being inefficient and working on the entire tile rather than in chunks.
I know that for the computation itself I can avoid these errors with dask.
What I am wondering is: what is the best way to save each chunk of data right after I manipulate it, so I can clear it from memory immediately?
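Concretely, the pattern I am after looks roughly like the sketch below, with a plain numpy memmap standing in for whatever GeoTIFF writer ends up being the right choice (`interpolate_chunk` is just a dummy placeholder for my real temporal interpolation):

```python
import numpy as np

n_in, n_out = 4, 10          # input dates and desired output dates
height, width, tile = 512, 512, 128

def interpolate_chunk(block):
    # dummy stand-in for the real temporal interpolation:
    # repeat the input dates until n_out layers are filled
    reps = -(-n_out // block.shape[0])  # ceil division
    return np.tile(block, (reps, 1, 1))[:n_out]

# input stack (in practice this would also be read lazily per chunk)
stack = np.random.rand(n_in, height, width).astype("float32")

# output lives on disk, not in RAM
out = np.memmap("interp.dat", dtype="float32", mode="w+",
                shape=(n_out, height, width))

for y in range(0, height, tile):
    for x in range(0, width, tile):
        block = stack[:, y:y + tile, x:x + tile]
        # write the processed chunk straight to disk, then move on;
        # only one block's worth of output is ever held in memory
        out[:, y:y + tile, x:x + tile] = interpolate_chunk(block)
out.flush()
```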
I have some ideas:
- use tifffile.TiffWriter with the tile and contiguous keywords
- use GDAL's Band.WriteArray with the xoff and yoff keywords
- use rioxarray's to_raster, specifying the lock keyword
Are there other solutions apart from these? And could anyone tell me which of these options is considered best practice?