What I am trying to do:
I have a nearly identical situation to this post, and this one (I’d rather not use 4 separate lag fields), and I even found this example that says I basically need a range of rows. None of them produces the right results.
The objective is to create a series of before and after error rates, where "before" is all data from the 4 days before the given date and "after" is all data after it; I am trying to find spikes in the error rate.
Here are some fiddle examples of what I have tried:
My ideal results would look like this:
DATE | DEFINITION | SYSTEM_ERRORS | TOTAL_RUN | BEFORE | AFTER |
---|---|---|---|---|---|
7/6/24 | BAR | 134 | 18487 | null | 0.0061 |
7/6/24 | FOO | 35 | 256353 | null | 0.0067 |
7/6/24 | BAT | 1920 | 6258 | null | 0.0074 |
7/7/24 | BAR | 132 | 18492 | null | 0.006 |
7/7/24 | FOO | 10 | 236304 | 0.0074 | 0.0066 |
7/7/24 | BAT | 1972 | 4529 | 0.0041 | 0.0073 |
7/8/24 | BAR | 120 | 18008 | 0.0152 | 0.0064 |
7/8/24 | FOO | 46 | 308912 | 0.0081 | 0.0064 |
7/8/24 | BAT | 1927 | 7972 | 0.0038 | 0.0074 |
7/9/24 | BAR | 104 | 17870 | 0.0120 | 0.0064 |
7/9/24 | FOO | 6 | 320445 | 0.0062 | 0.0064 |
7/9/24 | BAT | 1853 | 8494 | 0.0032 | 0.0077 |
7/10/24 | BAR | 1236 | 18980 | 0.0110 | 0.0066 |
7/10/24 | FOO | 35 | 319427 | 0.0087 | 0.0059 |
7/10/24 | BAT | 1824 | 7571 | 0.0047 | 0.0074 |
7/11/24 | BAR | 110 | 17972 | 0.0140 | 0.0059 |
7/11/24 | FOO | 5 | 322179 | 0.0088 | 0.0059 |
7/11/24 | BAT | 1858 | 8143 | 0.0030 | 0.0081 |
7/12/24 | BAR | 108 | 18082 | 0.0107 | 0.006 |
7/12/24 | FOO | 3 | 318872 | 0.0057 | 0.006 |
7/12/24 | BAT | 1833 | 8011 | 0.0030 | 0.0096 |
7/13/24 | BAR | 104 | 18151 | 0.0108 | 0.0063 |
7/13/24 | FOO | 29 | 255573 | 0.0056 | 0.0063 |
7/13/24 | BAT | 1533 | 4982 | 0.0033 | 0.0123 |
7/14/24 | BAR | 108 | 18087 | 0.0122 | 0.0066 |
7/14/24 | FOO | 1 | 233627 | 0.0060 | 0.0066 |
7/14/24 | BAT | 1564 | 2920 | 0.0033 | 0.5356 |
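
For reference, this is roughly the direction I have been trying (a simplified sketch, not my actual query: the table name error_log, the column names, the PostgreSQL dialect, the PARTITION BY, and the exact frame boundaries are all placeholders/assumptions here):

```sql
-- Sketch only: error_log, run_date, definition, system_errors, total_run are
-- placeholder names, and partitioning by definition is an assumption about
-- how the before/after rates should be grouped.
SELECT
    run_date,
    definition,
    system_errors,
    total_run,
    -- "before": errors / runs over the 4 calendar days preceding this date
    ROUND((SUM(system_errors) OVER w_before)::numeric
          / NULLIF(SUM(total_run) OVER w_before, 0), 4) AS before_rate,
    -- "after": errors / runs from this date through the end of the data
    ROUND((SUM(system_errors) OVER w_after)::numeric
          / NULLIF(SUM(total_run) OVER w_after, 0), 4) AS after_rate
FROM error_log
WINDOW
    w_before AS (
        PARTITION BY definition
        ORDER BY run_date
        RANGE BETWEEN INTERVAL '4 days' PRECEDING AND INTERVAL '1 day' PRECEDING
    ),
    w_after AS (
        PARTITION BY definition
        ORDER BY run_date
        RANGE BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING
    )
ORDER BY run_date, definition;
```

Whether "before" should stop at 1 day preceding (i.e. exclude the current date) and whether "after" should include the current row are the boundary details I am least sure of.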