I have some one-dimensional waveform data, and I want to exploit the correlation between nearby points before entropy-encoding it. The most basic operation that does this is delta encoding (the first-order finite difference), which is simply convolution with a [1, -1] filter. Since I'll be applying Golomb-Rice coding afterwards, I need the operation to minimize both the mean and the standard deviation of the resulting residual distribution. It also has to be exactly reversible and not too computationally expensive.
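For concreteness, here is a minimal sketch (Python/NumPy) of the baseline I mean: the [1, -1] difference filter and its exact inverse via a running sum. The test signal, function names, and the sine-plus-noise waveform are purely my own illustration, not tied to any particular codec.

```python
# Minimal sketch of the baseline operation: the [1, -1] finite-difference
# filter y[n] = x[n] - x[n-1], plus its exact inverse (cumulative sum).
# The test signal below is a hypothetical example, not real data.
import numpy as np

def delta_encode(x: np.ndarray) -> np.ndarray:
    """Apply the [1, -1] filter; keep x[0] unchanged so the mapping is reversible."""
    residual = np.empty_like(x)
    residual[0] = x[0]
    residual[1:] = x[1:] - x[:-1]
    return residual

def delta_decode(residual: np.ndarray) -> np.ndarray:
    """Invert delta encoding by cumulative summation."""
    return np.cumsum(residual)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical smooth waveform with a little noise, quantized to integers.
    x = np.round(1000 * np.sin(np.linspace(0, 8 * np.pi, 4096))
                 + rng.normal(0, 2, 4096)).astype(np.int64)
    r = delta_encode(x)
    assert np.array_equal(delta_decode(r), x)      # exact reversibility
    print("raw   mean/std:", x.mean(), x.std())
    print("delta mean/std:", r.mean(), r.std())    # residuals are much more concentrated
```

Any more general operation I adopt would need to satisfy the same checks: an exact inverse and residuals concentrated near zero.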
I plan to set up an AI system that will search for the best operation that achieves this. I am asking for your mathematical and computational intuition: is there a more general operation with these properties?
I would really appreciate any insights!