I have a Node.js application with the following code structure:
SqlUtils.js

const queryMap = new Map([
  // various query mappings => size over 20KB
]);
const lastPurchaseMap = new Map([
  // various last purchase mappings
]);

// Collects the mapped clause for every truthy key in `select`
// and joins the unique clauses into a comma-separated column list.
function generateSQLQuery(select) {
  const clauses = new Set();
  Object.keys(select).forEach((key) => {
    if (queryMap.has(key) && select[key]) {
      clauses.add(queryMap.get(key));
    }
  });
  return Array.from(clauses).join(",\n");
}

module.exports = {
  generateSQLQuery
};
The function generateSQLQuery is imported into a separate service via require, and that service is used on almost every request.
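For context, the consumer looks roughly like this (the service file name, the handler, and the purchases table are hypothetical, just to show the call pattern):

// orderService.js (hypothetical consumer, invoked on almost every request)
const { generateSQLQuery } = require('./SqlUtils');

function handleRequest(select) {
  // Build the column list from the caller's select flags.
  const columns = generateSQLQuery(select);
  return `SELECT ${columns} FROM purchases`;
}

module.exports = { handleRequest };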
I’ve noticed that when handling multiple requests (e.g., 400 concurrent requests), memory usage can be quite high. By moving queryMap and lastPurchaseMap to a separate module and importing them via require (relying on Node.js module caching), I observed a significant reduction in memory consumption: over 200 MB saved.
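For reference, here is roughly what the refactor looks like. This is a minimal sketch; the file name queryMaps.js is just for illustration. The maps are built once, the first time the module is required, and every later require() returns the same cached instance:

// queryMaps.js: evaluated once, then served from Node's module cache
const queryMap = new Map([
  // various query mappings => size over 20KB
]);
const lastPurchaseMap = new Map([
  // various last purchase mappings
]);

module.exports = { queryMap, lastPurchaseMap };

// SqlUtils.js: now pulls the shared maps from the module cache
const { queryMap } = require('./queryMaps');
// generateSQLQuery is unchanged; it now reads from the shared queryMap.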
My questions are:
- Does Node.js instantiate queryMap and lastPurchaseMap multiple times if generateSQLQuery is called concurrently without module caching?
- Are there any best practices for optimizing memory usage with large constant data structures in a Node.js application, besides moving them to a separate module?
Any insights or detailed explanations would be greatly appreciated.
Node.js runtime: 18.12.0
I have solved the problem, but I would like to understand why it was happening.