While monitoring our socket server application in Performance Monitor, I noticed that the "% Time in GC" counter is significantly high (averaging 25-30%) on one server only, while the other servers average just 0.8% at the same number of connected clients.
- All servers run the same .NET 4.5 server application, with the same version and the same configuration.
- All servers use Windows Server 2019, updated to the latest build; the issue persists after updating.
- The server with the issue has newer hardware, more CPU cores, and a higher clock speed.
- RAM usage is only about 32%, so there is plenty of memory available.
- CPU usage is only about 10%, so there is ample CPU capacity remaining.
- All servers use the same .NET Framework version, 4.5. I have also tried other .NET Framework versions on the problematic server (4.7.2 and 4.8), but the issue still persists.
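For reference, the GC-related settings on .NET Framework live in app.config, so one thing worth diffing between the good and bad servers is this section. Below is a minimal sketch of the relevant elements; the values shown are illustrative assumptions, not our actual production configuration:

```xml
<!-- app.config sketch: GC settings to compare between servers -->
<!-- values are illustrative, not our production config -->
<configuration>
  <runtime>
    <!-- server GC uses a dedicated GC heap and thread per core;
         the default for .NET Framework apps is workstation GC -->
    <gcServer enabled="true" />
    <!-- concurrent/background GC; enabled by default -->
    <gcConcurrent enabled="true" />
  </runtime>
</configuration>
```

Since the problematic server has more cores, a difference in GC flavor (workstation vs. server) between machines could plausibly change how "% Time in GC" behaves, which is why I want to rule this section out first.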
What should I consider first as a possible cause of this strange issue? Could it be related to a hardware problem, such as faulty RAM?