So, I haven’t seen a thread that comes at this from this angle.
I’m trying to write some code that tracks the average time it takes to open an ODBC connection to a database.
I figured it’d be straightforward: create the connection, Stopwatch.Start, open the connection, Stopwatch.Stop.
What I’m finding is that even with Pooling=No in my connection string, the first Open takes roughly 700 ms, and every subsequent attempt in the same application run reports 0 ms.
I was planning on having a monitor service run this on a timer, but that’s pointless if I only get a “real” connection on the first try.
Shouldn’t each open take about the same amount of time? Why is it 0 after the first one?
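One thing I wanted to rule out first: ElapsedMilliseconds is an integer, so a pooled open that takes, say, 0.3 ms prints as 0. A quick sketch with fractional milliseconds (same placeholder connection string as the full worker example below) at least distinguishes “suspiciously fast but real” from “effectively free”:

using System.Data.Odbc;
using System.Diagnostics;

// Same placeholder connection string as the worker example below.
string connectionString = "Driver={ODBC Driver 17 for SQL Server};Server=blah;Database=blah;UID=blah;PWD=blah;Pooling=No";

for (int i = 0; i < 5; i++)
{
    using var connection = new OdbcConnection(connectionString);
    var stopwatch = Stopwatch.StartNew();
    connection.Open();
    stopwatch.Stop();

    // Elapsed.TotalMilliseconds keeps the fraction that ElapsedMilliseconds truncates.
    Console.WriteLine($"Open #{i + 1}: {stopwatch.Elapsed.TotalMilliseconds:F3} ms");
}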
Trivial example in a worker service. (I got rid of the using wrapper and tried explicit Close/Dispose, but it doesn’t seem to matter. I originally had Pooling=False and thought that spelling was the problem, but nothing changes the behavior.)
// Requires System.Data.Odbc and System.Diagnostics.
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
    while (!stoppingToken.IsCancellationRequested)
    {
        if (_logger.IsEnabled(LogLevel.Information))
        {
            _logger.LogInformation("Worker running at: {time}", DateTimeOffset.Now);
        }

        string connectionString = "Driver={ODBC Driver 17 for SQL Server};Server=blah;Database=blah;UID=blah;PWD=blah;Pooling=No";

        // Time only the Open call.
        Stopwatch stopwatch = new Stopwatch();
        OdbcConnection connection = new OdbcConnection(connectionString);
        stopwatch.Start();
        connection.Open();
        stopwatch.Stop();

        // Explicit teardown instead of a using block; the behavior is the same either way.
        connection.Close();
        connection.Dispose();

        Console.WriteLine($"Time taken to open the connection: {stopwatch.ElapsedMilliseconds} ms");

        // Block until a keypress so each iteration is a deliberate retry.
        Console.ReadKey();
    }
}
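The one other knob I’ve found in System.Data.Odbc is the static OdbcConnection.ReleaseObjectPool() method, which signals that the ODBC Driver Manager environment handle can be released once the last underlying connection closes. Whether that actually forces a cold open on the next attempt is exactly what I haven’t been able to confirm, so treat this as a sketch:

using System.Data.Odbc;

string connectionString = "Driver={ODBC Driver 17 for SQL Server};Server=blah;Database=blah;UID=blah;PWD=blah";

using (var connection = new OdbcConnection(connectionString))
{
    connection.Open();
}

// Signal that the driver manager's environment handle can be released
// now that the last connection is closed. Unverified whether this makes
// the next Open in this process a genuinely cold one.
OdbcConnection.ReleaseObjectPool();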
Anybody know why it’s doing this and/or how to make it definitively not reuse the connection?
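The only guaranteed workaround I can think of is process isolation: whatever layer is caching the connection, it can’t reuse a pooled handle from a process that has already exited, so a short-lived probe process per measurement should never hit the pool. In this sketch, OdbcProbe.exe is hypothetical: a one-shot console app containing the timing code above that opens a single connection, prints the elapsed time, and exits; the monitor service would just launch it on its timer. I’d rather understand the underlying behavior than resort to this, though.

using System.Diagnostics;

// OdbcProbe.exe is hypothetical: a one-shot console app with the timing
// code above, so every measurement happens in a fresh process.
var psi = new ProcessStartInfo("OdbcProbe.exe")
{
    RedirectStandardOutput = true,
    UseShellExecute = false
};

using var probe = Process.Start(psi)!;
string output = probe.StandardOutput.ReadToEnd();
probe.WaitForExit();

// Relay whatever timing line the probe printed.
Console.WriteLine(output);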