Martin Storsjö martin@martin.st writes:
> Is there any reason to artificially limit the precision of the current function? And does it make sense to have a test that busyloops calling both functions, finding the minimum increment of each? Running it for a couple hundred milliseconds should be enough to get useful data, but such a test would still be somewhat brittle, susceptible to scheduler interruptions.
We don't care about the exact precision, so it would be enough to test that both functions always return the same time to within 100 ms or so, and that the GetSystemTimePreciseAsFileTime result changes within 1 ms or so.