I have a PostgreSQL table named users which contains between 20 and 30 million records. The table has the following columns: id, tenant_id, name, company, location, and description.
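For reference, here is a simplified sketch of the schema (the column types are my shorthand and may not match the real table exactly):

-- Simplified schema sketch; actual types and constraints may differ.
CREATE TABLE users (
    id          bigint PRIMARY KEY,
    tenant_id   text NOT NULL,  -- tenant identifier used in search filters
    name        text,
    company     text,
    location    text,
    description text
);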
I need to perform generic search queries on this table with various combinations of columns. Here is an example of such a query:
SELECT *
FROM users
WHERE name = 'raju'
AND company LIKE '%kg%'
AND description LIKE '%ght%'
AND tenant_id = 'tenant_1234';
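The plan for such a query can be inspected like this (shown here only to illustrate how the timing is measured):

EXPLAIN (ANALYZE, BUFFERS)  -- reports actual row counts, timings, and buffer usage
SELECT *
FROM users
WHERE name = 'raju'
  AND company LIKE '%kg%'
  AND description LIKE '%ght%'
  AND tenant_id = 'tenant_1234';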
Because these searches can filter on many different combinations of columns, response times are currently very high. To address this, I am considering compound (multi-column) indexes, but I am concerned that adding many of them will slow down write operations.
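For example, one of the compound indexes I have in mind looks like this (the index name and column order are only illustrative):

-- Illustrative compound index; column order is a guess at a common access pattern.
CREATE INDEX idx_users_tenant_name ON users (tenant_id, name);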
My Question:
What are some effective strategies to optimize these search queries without significantly impacting the performance of write operations? Any suggestions or best practices would be greatly appreciated.