I am feeding measurements into a GA4 property using the Measurement Protocol, and then viewing the data from these events in Looker Studio.
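For context, each measurement is sent roughly like this (a minimal sketch; the IDs, event name, and parameter name are placeholders rather than my real configuration):

```python
import requests

# Placeholders, not my real IDs:
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "xxxxxxxxxx"

payload = {
    "client_id": "555.666",
    "events": [
        {
            "name": "measurement",      # placeholder event name
            "params": {"value": 0.17},  # the value I later chart
        }
    ],
}

# Standard Measurement Protocol collect endpoint
requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
)
```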
The raw data is as follows:
time | Value |
---|---|
21:56:29 | 0 |
21:46:07 | 1.01 |
21:35:46 | 0 |
21:25:25 | 0.67 |
21:15:05 | 0 |
21:04:14 | 0.17 |
The mean of these six values is (0 + 1.01 + 0 + 0.67 + 0 + 0.17) / 6 = 1.85 / 6 ≈ 0.308.
However, when I plot the same period on an hourly graph in Looker Studio, I see a value of 0.4625.
I see incorrect averages for almost every hour when compared against the raw data.
It appears that the values from Looker Studio can be obtained by averaging only the distinct values within each period, i.e. the table above effectively becomes:
time | Value |
---|---|
21:56:29 | 0 |
21:46:07 | 1.01 |
21:35:46 | |
21:25:25 | 0.67 |
21:15:05 | |
21:04:14 | 0.17 |
The mean of these four remaining values is (0 + 1.01 + 0.67 + 0.17) / 4 = 1.85 / 4 = 0.4625. Applying this logic to my raw data consistently reproduces the Looker Studio values.
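A quick sketch of the two calculations (plain Python, just to show the arithmetic):

```python
values = [0, 1.01, 0, 0.67, 0, 0.17]

# What I expect: the mean over all raw values
raw_mean = sum(values) / len(values)           # 0.30833...

# What Looker Studio appears to show: the mean over distinct values only
distinct = set(values)
distinct_mean = sum(distinct) / len(distinct)  # 0.4625
```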
If I plot the event count, it is correct for each period, so it doesn't appear that I am dropping any events.
I have tried several other charts and another GA4 property; the problem seems to be consistent.
The problem doesn't apply only to zeroes; it also occurs with other repeated values.
Is there a reason this would be happening?
Additional information:

- The value is a GA4 custom dimension.
- I have not changed the value in the data source:

  [screenshot: the Value field's configuration in the data source]

- I have applied only the following filter to the data:

  [screenshot: the filter configuration]