By default, timers on Windows tick at a resolution of 15.6 ms. So even if you set a timer to 1 ms, it can take up to 15.6 ms to fire (the exact delay depends on when within a tick the timer was started, but that is the gist of it).
Using timeBeginPeriod(1) (from the Windows multimedia API) you can raise the system timer resolution to 1 ms.
Unfortunately, on Windows 11 this higher resolution is no longer guaranteed once a window-owning process is minimized (or otherwise occluded), according to the documentation found here:
https://learn.microsoft.com/en-us/windows/win32/api/timeapi/nf-timeapi-timebeginperiod#:~:text=Starting with Windows 11%2C if a window-owning process becomes fully occluded%2C minimized%2C or otherwise invisible or inaudible to the end user%2C Windows does not guarantee a higher resolution than the default system resolution. See SetProcessInformation for more information on this behavior
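For reference, the SetProcessInformation mechanism that the documentation points to concerns process power throttling. A minimal, untested P/Invoke sketch of that call (constant values taken from the Windows SDK headers) would look like this:

```csharp
using System;
using System.Runtime.InteropServices;

static class PowerThrottling
{
    [StructLayout(LayoutKind.Sequential)]
    struct PROCESS_POWER_THROTTLING_STATE
    {
        public uint Version;
        public uint ControlMask;
        public uint StateMask;
    }

    // Values from the Windows SDK headers (processthreadsapi.h).
    const uint PROCESS_POWER_THROTTLING_CURRENT_VERSION = 1;
    const uint PROCESS_POWER_THROTTLING_IGNORE_TIMER_RESOLUTION = 0x4;
    const int ProcessPowerThrottling = 4; // PROCESS_INFORMATION_CLASS

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessInformation(
        IntPtr hProcess,
        int ProcessInformationClass,
        ref PROCESS_POWER_THROTTLING_STATE ProcessInformation,
        uint ProcessInformationSize);

    public static void OptOutOfTimerThrottling()
    {
        var state = new PROCESS_POWER_THROTTLING_STATE
        {
            Version = PROCESS_POWER_THROTTLING_CURRENT_VERSION,
            // Take manual control of the "ignore timer resolution" policy...
            ControlMask = PROCESS_POWER_THROTTLING_IGNORE_TIMER_RESOLUTION,
            // ...and clear it, asking Windows to keep honoring timeBeginPeriod.
            StateMask = 0,
        };

        SetProcessInformation(
            (IntPtr)(-1), // GetCurrentProcess() pseudo-handle
            ProcessPowerThrottling,
            ref state,
            (uint)Marshal.SizeOf<PROCESS_POWER_THROTTLING_STATE>());
    }
}
```

What I cannot tell from the documentation is whether clearing this state is actually sufficient to keep the 1 ms resolution while the window is minimized, which is part of why I am asking.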
Is there a way to guarantee this higher timer resolution on Windows 11 and later, even when the window is minimized? Or is there an alternative to timeBeginPeriod that achieves the same resolution?
The reason I am asking:
We want to poll high-precision instruments over a serial port at a 1 ms interval and collect that data, even while the polling application is minimized.
Some additional notes:
- It doesn’t have to be exactly 1 ms; 1.342 ms is also fine, as long as it stays in that ballpark and not at the default timer resolution of 15.6 ms.
- We are using C# and WinUI for the interface
- Polling is done off the main thread
- Yes, we are using timeEndPeriod to stop it. 🙂
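For completeness, the calls are made via P/Invoke, roughly like this (a simplified sketch of our setup; the actual serial-port polling loop is omitted):

```csharp
using System.Runtime.InteropServices;

static class WinMmTimer
{
    // timeBeginPeriod / timeEndPeriod live in winmm.dll.
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uPeriod);

    public static void Poll()
    {
        timeBeginPeriod(1); // request 1 ms system timer resolution
        try
        {
            // ... poll the instruments over the serial port,
            //     sleeping ~1 ms between reads ...
        }
        finally
        {
            timeEndPeriod(1); // matching call, as noted above
        }
    }
}
```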
I have searched for more information on this, but have not found anything that sheds further light on it.