I'm wondering what the precision of the Timer class in System.Timers is, since its Interval property is a double (which would seem to indicate that you can have fractions of a millisecond). What is it?
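
For reference, a minimal snippet of what I mean (illustrative only): the Interval property is typed as a double, so the setter happily accepts a fractional number of milliseconds.

    using System;

    // Illustrative only: Interval is typed as double, so a fractional
    // millisecond value is accepted by the API; whether it is honored
    // at runtime is exactly the question.
    var timer = new System.Timers.Timer();
    timer.Interval = 0.5; // half a millisecond, at least on paper
    timer.Elapsed += (s, e) => Console.WriteLine("tick");
    timer.Start();
    Console.ReadLine(); // keep the process alive to observe events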

1 Answer

Windows desktop OSes really aren't accurate below about 40ms. The OS simply isn't real-time and therefore exhibits significant non-deterministic jitter. That means that while it may report values down to the millisecond or even smaller, you can't count on those values to be meaningful. So even if the Timer's interval gets set to some sub-millisecond value, you can't rely on the time between setting it and firing to actually be what you asked for.

Add to this the fact that the entire framework you're running under is non-deterministic (the GC could suspend you and do a collection during the time when the Timer should fire), and you end up with loads of risk trying to do anything that is time-critical.
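
A quick way to see the jitter for yourself is to request a short interval and measure the real gaps between Elapsed events. The sketch below is a minimal demo of mine, not guaranteed behavior: it asks for 1 ms and prints the measured gap each time the timer fires.

    using System;
    using System.Diagnostics;
    using System.Threading;

    // Request a 1 ms interval, then measure the actual gap between
    // Elapsed events with a Stopwatch.
    var stopwatch = Stopwatch.StartNew();
    double lastMs = 0;

    using var timer = new System.Timers.Timer(1.0); // ask for 1 ms
    timer.Elapsed += (s, e) =>
    {
        double nowMs = stopwatch.Elapsed.TotalMilliseconds;
        // On a default Windows setup the printed gaps tend to cluster
        // around the ~15 ms scheduler tick, not the requested 1 ms.
        Console.WriteLine($"gap: {nowMs - lastMs:F3} ms");
        lastMs = nowMs;
    };
    timer.Start();

    Thread.Sleep(200); // let a handful of events fire
    timer.Stop();

Note that Stopwatch is fine for the measurement side, since it uses the high-resolution performance counter; it's the timer's firing, not the measuring, that lacks precision.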

