I’m building an API service. To use it, third-party users can sign up and generate their own API keys (similar to the Google Cloud console).
My problem is that I want to track usage statistics for these keys, for example how many requests a given key has made. I also want to be able to retrieve this data for rate limiting and analytics (in particular, to show each user their individual usage).
Initially, I was thinking about using a Prometheus counter with the API key as a label, but my understanding is that this scales poorly once there are many distinct keys (high cardinality).
Currently I’m thinking it might be better to store the statistics in my PostgreSQL database myself. Since the naive approach would mean one write per request, I suspect an in-memory accumulator combined with a batched flush every 5-15 seconds would perform better, though I’m not sure exactly how to implement this.
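Here’s a rough sketch of what I mean, in Go. The table name `api_key_usage`, the one-minute windows, and the `lib/pq` driver are just placeholder choices on my part:

```go
package usage

import (
	"database/sql"
	"log"
	"sync"
	"time"

	_ "github.com/lib/pq" // driver choice is arbitrary here
)

// Assumed table (names are placeholders):
//
//   CREATE TABLE api_key_usage (
//       api_key_id    text        NOT NULL,
//       window_start  timestamptz NOT NULL,
//       request_count bigint      NOT NULL,
//       PRIMARY KEY (api_key_id, window_start)
//   );

// UsageAggregator accumulates per-key request counts in memory and
// flushes them to PostgreSQL on a fixed interval, so each request
// costs a map increment instead of a database write.
type UsageAggregator struct {
	mu     sync.Mutex
	counts map[string]int64 // api_key_id -> requests since last flush
	db     *sql.DB
}

func NewUsageAggregator(db *sql.DB, flushEvery time.Duration) *UsageAggregator {
	a := &UsageAggregator{counts: make(map[string]int64), db: db}
	go func() {
		ticker := time.NewTicker(flushEvery)
		defer ticker.Stop()
		for range ticker.C {
			a.flush()
		}
	}()
	return a
}

// Record is called from the request handler; it only touches memory.
func (a *UsageAggregator) Record(apiKeyID string) {
	a.mu.Lock()
	a.counts[apiKeyID]++
	a.mu.Unlock()
}

// flush swaps out the in-memory map and upserts one row per key,
// bucketed into one-minute windows so per-key history stays queryable.
// If the flush fails, this batch of counts is dropped, which seems
// acceptable for approximate usage stats.
func (a *UsageAggregator) flush() {
	a.mu.Lock()
	batch := a.counts
	a.counts = make(map[string]int64)
	a.mu.Unlock()

	if len(batch) == 0 {
		return
	}
	window := time.Now().UTC().Truncate(time.Minute)
	tx, err := a.db.Begin()
	if err != nil {
		log.Printf("usage flush: %v", err)
		return
	}
	for keyID, n := range batch {
		if _, err := tx.Exec(
			`INSERT INTO api_key_usage (api_key_id, window_start, request_count)
			 VALUES ($1, $2, $3)
			 ON CONFLICT (api_key_id, window_start)
			 DO UPDATE SET request_count = api_key_usage.request_count + EXCLUDED.request_count`,
			keyID, window, n,
		); err != nil {
			log.Printf("usage flush: %v", err)
			_ = tx.Rollback()
			return
		}
	}
	if err := tx.Commit(); err != nil {
		log.Printf("usage flush: %v", err)
	}
}
```

One thing I’m unsure about with this layout is that the table grows with keys × minutes, so presumably I’d need to roll up or expire old windows at some point.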
However, I can’t help thinking that more standardized solutions must already exist for use cases like this. So I wonder:
- What is the best way to track per-key API usage efficiently while still being able to query the data within, say, 30 seconds of the request (the sketch after this list shows the kind of read I have in mind)?
- Are there any standardized solutions for this kind of high-cardinality tracking?
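For context, this is roughly the read I’d want to run for rate limiting and the per-user dashboard (continuing the sketch above, same package and placeholder table names):

```go
// RequestsInLastHour returns how many requests a key has made in the
// past hour, based on the per-minute windows written by the flusher.
// With a 5-15 s flush interval, the result is at most that much stale,
// which would fit the ~30 s freshness requirement.
func RequestsInLastHour(db *sql.DB, apiKeyID string) (int64, error) {
	var total int64
	err := db.QueryRow(
		`SELECT COALESCE(SUM(request_count), 0)
		 FROM api_key_usage
		 WHERE api_key_id = $1
		   AND window_start >= now() - interval '1 hour'`,
		apiKeyID,
	).Scan(&total)
	return total, err
}
```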