I have a Node.js backend service that currently fetches data directly from a Postgres database, and I am working on introducing a caching middle layer.
This is my first time using Redis, and I am facing a few challenges in using it efficiently for my complex queries. My confusion boils down to two things: how do we map from a query to a key in the Redis cache, and how should we update the Redis cache once update/create functions are invoked?
For example, I have the following function that joins multiple tables (only two are shown here for brevity):
async findAll(query: PaginateQuery, user: User): Promise<Paginated<any>> {
  const limit = query.limit && query.limit < MAX_LIMIT ? query.limit : MAX_LIMIT
  const page = query.page ?? DEFAULT_PAGE
  const skip = (page - 1) * limit

  const queryBuilder = getConnection()
    .getRepository(tableEntryX)
    .createQueryBuilder('tableX')
    .leftJoin('tableX.fk1', 'fk1')
    .addSelect(['fk1.id', 'fk1.gender'])
    .leftJoin('tableX.fk2', 'fk2')
    .addSelect(['fk2.id', 'fk2.weigth'])

  if (query.filter?.createdAt) {
    // update query based on this filter
  }
  // filter 2
  // filter 3

  // clone the builder before pagination so the count covers the full result set
  const queryBuilderCount = queryBuilder.clone()

  // fetch the requested page and the total count in parallel
  const [data, count] = await Promise.all([
    queryBuilder.skip(skip).take(limit).getMany(),
    queryBuilderCount.getCount(),
  ])

  // shallow-clone the template so the shared constant is not mutated between calls
  const res = { ...EMPTY_PAGINATED_DATA, meta: { ...EMPTY_PAGINATED_DATA.meta } }
  res.data = data
  res.meta.itemsPerPage = limit
  res.meta.currentPage = page
  res.meta.totalItems = count
  res.meta.totalPages = Math.ceil(count / limit)
  res.meta.sortBy = query.sortBy
  return res
}
The same function checks for a number of filters, and based on those filters we alter the query with andWhere/orWhere calls.
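For instance, the createdAt branch would end up as roughly this (simplified; the real conditions are more involved):

queryBuilder.andWhere('tableX.createdAt >= :createdAt', { createdAt: query.filter.createdAt })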
How should we cache this, while also taking into account that saving a new entry in the table might or might not imply a change in the cached data?
So my initial thought was to simply hash the query and cache the result under that hash. This is a fairly simple solution and probably not too bad in terms of memory consumption (acknowledging that such an approach might lead to duplicated cache entries, since some queries could partially overlap each other).
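To make that concrete, here is roughly the sketch I have in mind, using ioredis (buildCacheKey, findAllCached and CACHE_TTL_SECONDS are placeholder names I made up for this question, and findAll is the function from above):

import { createHash } from 'crypto'
import Redis from 'ioredis'

const redis = new Redis()
const CACHE_TTL_SECONDS = 60

// Derive a deterministic key from the query parameters. Note that
// JSON.stringify is key-order sensitive, so the filters would need to be
// normalized (e.g. keys sorted) before hashing for this to be reliable.
function buildCacheKey(query: PaginateQuery, user: User): string {
  const digest = createHash('sha256')
    .update(JSON.stringify({ userId: user.id, query }))
    .digest('hex')
  return `tableX:findAll:${digest}`
}

// Cache-aside wrapper: return the cached page if present, otherwise run
// the real query and store the serialized result with a TTL.
async function findAllCached(query: PaginateQuery, user: User): Promise<Paginated<any>> {
  const key = buildCacheKey(query, user)
  const hit = await redis.get(key)
  if (hit) return JSON.parse(hit)
  const res = await findAll(query, user)
  await redis.set(key, JSON.stringify(res), 'EX', CACHE_TTL_SECONDS)
  return res
}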
However, I find the more pressing problem with this approach is what happens when an update or a new entry takes place: what should we do with all our cached entries? It seems a waste to simply evict every cache and start from scratch, especially given that updates and creates happen quite regularly.
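Just to illustrate what I mean by evicting everything: with the key scheme from the sketch above, it would be something like this after every create/update, which throws away every cached page, including ones the write could not possibly have affected:

// Blunt invalidation: wipe the whole findAll namespace on each write.
async function invalidateTableXCache(): Promise<void> {
  const stream = redis.scanStream({ match: 'tableX:findAll:*', count: 100 })
  for await (const keys of stream) {
    if (keys.length > 0) await redis.del(...keys)
  }
}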