I have an HDF5 file which contains three 1D arrays in different datasets. This file is created using h5py in Python, and the 1D arrays are continually being appended to (i.e. growing). For simplicity, let's call these 1D arrays "A", "B" and "C", and say each array initially contains 100 values but grows by one value every second (e.g. 101, 102, etc.).
What I'm looking to do is create a single virtual dataset which is the concatenation of all three 1D arrays. This is relatively easy for the static case (3 × 100 values), but I want the virtual dataset to grow as more values are added (e.g. 303 values at 1 second, 306 at 2 seconds, etc.).
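For context, here is a minimal sketch of the static case along the lines of what I have working. The file and dataset names (`source.h5`, `vds.h5`, `concat`) are just placeholders:

```python
import h5py
import numpy as np

# Source file with three 1D datasets A, B, C, 100 values each
# (names and contents are illustrative).
with h5py.File("source.h5", "w") as f:
    for name, offset in [("A", 0), ("B", 100), ("C", 200)]:
        f.create_dataset(name, data=np.arange(offset, offset + 100, dtype="f8"))

# Static virtual dataset: 300 values, the concatenation of A, B and C.
layout = h5py.VirtualLayout(shape=(300,), dtype="f8")
for i, name in enumerate(["A", "B", "C"]):
    vsource = h5py.VirtualSource("source.h5", name, shape=(100,))
    layout[i * 100 : (i + 1) * 100] = vsource

with h5py.File("vds.h5", "w") as f:
    f.create_virtual_dataset("concat", layout, fillvalue=-1)

# Reading back through the virtual dataset gives A, then B, then C.
with h5py.File("vds.h5", "r") as f:
    data = f["concat"][:]
```

This works fine while the shapes are fixed at 100 values each, but the layout is baked in at creation time.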
Is there a pythonic / efficient way to do this which doesn't just delete and recreate the virtual dataset every second?