I’m trying to help optimize the performance of some .NET code that runs against a MSSQL db via EF Core, and I’ve never run into something like this before. The code takes a custom “query” string (proprietary syntax) and converts it into an Expression&lt;Func&lt;T, bool&gt;&gt; for use as an EF Where predicate. The query string can have 0–5 parameters. Alongside the query string there is a comma-delimited list of strings (currently this lives in a file), with one value per parameter in the query.
For example, there might be a custom query string such as:
"Car.Color = %1 and Car.NumDoors = %2"
And then a list of strings that contains all the parameter iterations for this query, such as:
"Red",2
"Blue",4
"Green",2
"Red",4
The goal here is to run the query once for each parameter combination listed and get back only the PK IDs of the records that match.
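For reference, the current flow looks roughly like the sketch below. Names like `BuildPredicate`, `Car`, and `Id` are placeholders: `BuildPredicate` stands in for the existing (unmodifiable) code that parses the custom query string and binds one parameter set into an expression.

```csharp
// Hypothetical sketch of the current per-combination flow (names are illustrative).
foreach (string[] paramSet in parameterSets)
{
    // Existing code: parse the custom syntax and bind this parameter set.
    Expression<Func<Car, bool>> predicate = BuildPredicate(queryString, paramSet);

    // One database round trip per parameter combination -- this is the bottleneck.
    List<int> matchingIds = db.Cars
        .Where(predicate)
        .Select(c => c.Id)   // PK only
        .ToList();
}
```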
Code already exists that turns this custom syntax string into an Expression, along with code that populates it with each parameter set. That code is fairly fast and doesn’t seem problematic (not that it matters, since I can’t modify that portion of the code anyway).
The real problem is that, because of the custom nature of these queries, they run individually…so if there are 100k lines of parameter combinations, this runs 100k queries against the db.
I’m trying to figure out how I can optimize this by batching these db calls somehow. I can’t make any assumptions about how complex the custom query string might be beyond the 5-parameter max…basically any simple boolean expression can appear here (including parentheses, etc.).
I was hoping to find some suggestions for this issue. Converting the Expression to a T-SQL string is trivial, but even using something designed for dynamic SQL like sp_executesql doesn’t seem to help much…as far as I can tell, it still executes one query per parameter set.
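For what it’s worth, the sp_executesql attempt looks roughly like this (the translated WHERE clause and parameter names below are just illustrative, taken from the Car example above; when you pass parameters to `SqlQueryRaw`, EF Core sends the statement through sp_executesql under the hood):

```csharp
// Hypothetical sketch of the dynamic-SQL attempt. The WHERE clause comes from
// the (trivial) Expression-to-T-SQL translation mentioned above.
foreach (string[] paramSet in parameterSets)
{
    List<int> ids = db.Database
        .SqlQueryRaw<int>(
            "SELECT Id FROM Cars WHERE Color = @p1 AND NumDoors = @p2",
            new SqlParameter("@p1", paramSet[0]),
            new SqlParameter("@p2", paramSet[1]))
        .ToList();

    // Still one round trip and one execution per parameter set --
    // the parameterization helps plan reuse but doesn't batch anything.
}
```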