2 hours ago, Kylotan said:
Don't use ticks - measure time in seconds. You don't need sub-millisecond precision and the timer doesn't truly have that level of resolution anyway. Time in seconds stored as a float is adequate for gaming purposes.
This depends highly on the game genre. If you've got a race car going 300 km/h, that's about 8 cm per millisecond, which could show up as very noticeable jitter if that's your timer precision. Another way to look at it: a 1 ms imprecision at 60 Hz is 6% of a frame, which is a fairly high error rate.
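A quick back-of-envelope check of those two numbers, as a standalone C++ snippet (the 300 km/h speed and 60 Hz frame rate are just the example values from above):

```cpp
#include <cstdio>

int main() {
    const double speed_m_per_ms = 300.0 * 1000.0 / 3600.0 / 1000.0; // 300 km/h -> metres per millisecond
    const double frame_ms       = 1000.0 / 60.0;                    // length of one 60 Hz frame

    std::printf("distance covered in 1 ms at 300 km/h: %.1f cm\n", speed_m_per_ms * 100.0); // ~8.3 cm
    std::printf("1 ms as a fraction of a 60 Hz frame:  %.1f%%\n", 100.0 / frame_ms);        // ~6%
    return 0;
}
```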
Hardware timers will likely be closer to nanosecond precision than microsecond. So they're definitely capable of under 0.006% per-frame error if you use ticks (or time in seconds as a double-precision float or 64-bit fixed point). With 32-bit floats, on the other hand, it only takes an hour of uptime to reach ~0.25 ms of quantisation (1.5% of a 60 Hz frame), because the float's step size grows with the magnitude of the value. If someone leaves their PC on overnight, so the timer has been running for around a day, floating-point time in seconds quantises to roughly 8 ms, which is more like half a frame.
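For anyone who wants to verify that, here's a minimal C++ sketch that prints the quantisation step of a 32-bit float "seconds since start" value at a few uptimes and compares it to a 60 Hz frame (the specific uptime values and frame rate are just assumptions for illustration):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double frame_ms   = 1000.0 / 60.0;                 // 60 Hz frame length in ms
    const float  uptimes_s[] = { 60.0f, 3600.0f, 86400.0f }; // 1 minute, 1 hour, ~1 day

    for (float t : uptimes_s) {
        // std::nextafterf gives the next representable float above t,
        // so the difference is the quantisation step (ULP) at that uptime.
        const float  step_s  = std::nextafterf(t, INFINITY) - t;
        const double step_ms = step_s * 1000.0;
        std::printf("uptime %8.0f s: quantisation %.3f ms (%.1f%% of a 60 Hz frame)\n",
                    t, step_ms, 100.0 * step_ms / frame_ms);
    }
    return 0;
}
```

At 3600 s the step comes out at ~0.24 ms (~1.5% of a frame), and at 86400 s it's ~7.8 ms (roughly half a frame), which is where the numbers above come from.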