How are you?
I have been using memcached for a while now and love it. Behind the cache I typically use Postgres or some other relational database.
There are some cases where using a cache makes things more complex, and I am not sure about the performance cost in those cases, so I figured it is better to ask here.
Imagine a situation where I have two APIs:

- def all(filter_1: int = None, filter_2: int = None, filter_3: int = None) (to filter all the results)
- def update(id, data: dict) (to update a single item)
On all I will cache the results, for example:

- all() is cached under CACHE_KEY_ALL
- all(filter_1=11) is cached under CACHE_KEY_ALL_filter_1_11
- all(filter_1=11, filter_3=three) is cached under CACHE_KEY_ALL_filter_1_11_filter_3_three
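For concreteness, here is a minimal sketch of that caching scheme. It is not my exact code: make_cache_key and fetch_from_db are simplified placeholders, and I am using the python-memcached client here just as an example.

```python
import memcache

mc = memcache.Client(["127.0.0.1:11211"])

CACHE_KEY_ALL = "CACHE_KEY_ALL"


def fetch_from_db(filter_1, filter_2, filter_3):
    # Placeholder for the real Postgres query.
    return []


def make_cache_key(**filters):
    # all()            -> "CACHE_KEY_ALL"
    # all(filter_1=11) -> "CACHE_KEY_ALL_filter_1_11"
    parts = [CACHE_KEY_ALL]
    for name in sorted(filters):
        if filters[name] is not None:
            parts.append("%s_%s" % (name, filters[name]))
    return "_".join(parts)


def all(filter_1=None, filter_2=None, filter_3=None):
    key = make_cache_key(filter_1=filter_1, filter_2=filter_2, filter_3=filter_3)
    result = mc.get(key)
    if result is None:
        result = fetch_from_db(filter_1, filter_2, filter_3)
        mc.set(key, result)
    return result
```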
On every update call I will need to invalidate all of the cached all results.
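Continuing the sketch above, update would look roughly like this (update_in_db is again a placeholder). The core problem is that memcached has no "delete by prefix", so update cannot simply drop every key starting with CACHE_KEY_ALL:

```python
def update_in_db(id, data):
    # Placeholder for the real Postgres UPDATE.
    pass


def update(id, data: dict):
    update_in_db(id, data)
    # memcached has no wildcard or prefix delete, so there is no direct way
    # here to invalidate every cached key that starts with CACHE_KEY_ALL.
    # I would either have to track every filter combination that was ever
    # cached, or scan the server for matching keys (option 2 below).
```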
What I am doing today: in such cases I simply do not use any cache.
My question is a performance question. Which is faster:

- Not using any cache at all in these cases, or
- Calling stats items, fetching all the keys, looking for the keys that start with CACHE_KEY_ALL, and invalidating each one of them (roughly as in the sketch below)?
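For reference, this is roughly what the second option would look like with the python-memcached client. I am assuming the classic stats items + stats cachedump trick here: cachedump is an undocumented command, it only returns a limited amount of data per slab, and newer memcached versions may restrict it (they offer lru_crawler metadump instead), so treat this as a sketch.

```python
import memcache

mc = memcache.Client(["127.0.0.1:11211"])

PREFIX = "CACHE_KEY_ALL"


def invalidate_all_list_caches():
    # "stats items" tells us which slab classes currently hold items.
    for _server, item_stats in mc.get_stats("items"):
        slab_ids = {key.split(":")[1] for key in item_stats if key.startswith("items:")}
        for slab_id in slab_ids:
            # "stats cachedump <slab> 0" dumps (some of) the keys in that slab.
            for _server, dump in mc.get_stats("cachedump %s 0" % slab_id):
                for key in dump:
                    if key.startswith(PREFIX):
                        mc.delete(key)
```

Even when this works, it walks over the keys on the server on every update, so whether it actually beats "no cache at all" presumably depends on how expensive the underlying Postgres query is, which is exactly what I am unsure about.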
What do you think?