
System timer resolution

Started by Ironblayde April 29, 2001 09:17 PM
6 comments, last by Ironblayde 23 years, 9 months ago
Maybe this is a dumb question, I don't know. Anyway, I was looking at the MSDN entry for GetTickCount() for whatever reason, and it said that it is limited to the resolution of the system timer, and that the system timer on Windows 95 and 98 has a resolution of 55 ms. This is the old 18.2 Hz I remember from DOS. But I could swear GetTickCount() reported finer intervals than once every 55 ms, so I added some code to one of my simple demo programs to write the time to a file each frame (see the sketch below), and got things like this:

Tick count: 40954588
Tick count: 40954593
Tick count: 40954598
Tick count: 40954605
Tick count: 40954608
Tick count: 40954613
Tick count: 40954618
Tick count: 40954624
Tick count: 40954633
Tick count: 40954638

I am running Windows 98 SE. There must be something stupid I am overlooking here. Did I misinterpret what's in MSDN, or is Microsoft lying to me again?

-Ironblayde
 Aeon Software

Down with Tiberia!
"All your women are belong to me." - Nekrophidius
"Your superior intellect is no match for our puny weapons!"
Well, I suppose faster processors have faster system timers?
Faster processors mean faster master clock generators on the processor, but it's not the same thing. They wouldn't have included a value based on the OS in MSDN if that were the case. A system timer reporting the status of the master clock generator (on a 500 MHz PIII) would give a resolution of 2 nanoseconds, which is orders of magnitude finer than anything you can get out of GetTickCount(). Still, it's obviously not fixed at 18.2 Hz like MSDN says, so I have no idea what's going on. (See the quick check below.)

-Ironblayde
 Aeon Software

Down with Tiberia!
"All your women are belong to me." - Nekrophidius
"Your superior intellect is no match for our puny weapons!"
Perhaps they changed the resolution in Windows 98SE? Do you have access to a Windows 95 machine to test it?

Another possibility is that the doc was written in the days when Windows 95 was sloooowww, and that it has something to do with supporting low-end processors. In other words, you shouldn't count on it always being a high-resolution timer, even though it might be.
If you want higher resolution, use QueryPerformanceCounter (a minimal example follows below). Could you post a link to where they say the system timer has a resolution of 55 ms?

I get about 176 interrupts per second when idle, which works out to about 5.7 ms. I only see one time in that list that is inconsistent with that resolution: there is 7 ms in the interval before it and 3 ms in that interval. I'm not sure of the details, but my guess would be that the interrupts are generated at precise intervals relative to each other, and the time is read when each interrupt is serviced. So say the interrupt period was 5 ms, giving interrupts at 0, 5 and 10, but you are delayed 2 ms servicing the interrupt at 5, so you read 7 instead of 5. Then there is no delay at 10, so you read 10. That makes 3 ms between readings of the time, but still 5 ms between interrupts. At a macro level the interrupt occurs and is serviced; at a micro level the processor might be busy doing something else.
Keys to success: Ability, ambition and opportunity.
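For anyone who wants to try QueryPerformanceCounter, a minimal sketch of the usual pattern (the Sleep call is just something to time):

#include <windows.h>
#include <stdio.h>

int main()
{
    LARGE_INTEGER freq, start, stop;

    if (!QueryPerformanceFrequency(&freq))
        return 1;   /* no performance counter on this hardware */

    QueryPerformanceCounter(&start);
    Sleep(100);     /* the work being timed */
    QueryPerformanceCounter(&stop);

    /* Convert counter ticks to milliseconds. */
    double ms = (double)(stop.QuadPart - start.QuadPart) * 1000.0
              / (double)freq.QuadPart;
    printf("Elapsed: %.3f ms\n", ms);
    return 0;
}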
I have Win98/Celeron464, and when I use the _ftime function it has a resolution of about 20 Hz (steps of roughly 50 ms), so it is not because of Win95 or super low-end processors... I don't know why it happens, but for example I use this code to work around the problem:

#include <sys/timeb.h>

/* runs once per frame */
static struct _timeb _tm;
static __int64 tm, _delta;
static __int64 last_tm = 0;
static int tm_count = 0;
static float __delta = 0;   /* average seconds per frame; "(F)" was presumably a float typedef */

tm_count++;
_ftime( &_tm );
/* milliseconds since the epoch -- I think it is the same idea as GetTickCount??? */
tm = ((__int64)_tm.time)*1000 + _tm.millitm;
_delta = tm - last_tm;
if( _delta > 100 )
{
    __delta = (((float)_delta)/1000)/tm_count;
    tm_count = 0;
    last_tm = tm;
}
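A related note, not from the post above: on NT-based systems timeGetTime's default resolution can be several milliseconds, and the documented way to tighten it is timeBeginPeriod from winmm. A minimal sketch, assuming you link with winmm.lib:

#include <windows.h>
#include <mmsystem.h>   /* link with winmm.lib */
#include <stdio.h>

int main()
{
    /* Request 1 ms timer resolution for this process. */
    timeBeginPeriod(1);

    DWORD start = timeGetTime();
    Sleep(50);                          /* something to time */
    DWORD elapsed = timeGetTime() - start;
    printf("Elapsed: %lu ms\n", elapsed);

    /* Every timeBeginPeriod must be matched by a timeEndPeriod. */
    timeEndPeriod(1);
    return 0;
}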
I know I can use a higher-resolution timer, I was just wondering why this disagreement with MSDN arises. I finally got to test it on a Win95 machine, and it was reporting intervals of 1 ms with perfect consistency. So I'll be damned if I know why that is, heh. Anyway, here is the text from MSDN I was referring to:

The following table describes the resolution of the system timer.

System                      Resolution
Windows NT 3.5 and later    The system timer runs at approximately 10 ms.
Windows NT 3.1              The system timer runs at approximately 16 ms.
Windows 95 and later        The system timer runs at approximately 55 ms.

Thanks for the responses, guys.

-Ironblayde
 Aeon Software

"In C we had to code our own bugs. In C++ we can inherit them." - Unknown
"Your superior intellect is no match for our puny weapons!"
I was wondering if anyone has any information about the relative strengths/weaknesses of these various ways of getting the system time...

GetTickCount
QueryPerformanceCounter
timeGetTime

I always use timeGetTime, but after reading your post I am starting to wonder if QueryPerformanceCounter would be better.

Also, my understanding of the timer issues in Windows is that even though you may run a test and get 1 ms resolution, that doesn't mean you will always get it; it depends on what other processes are running, what their priorities are versus the priority of your process, and so on. Of course, in full-screen exclusive mode I would imagine that DX ensures you get more of a flat response curve, but in windowed mode all bets are off. So I'm guessing that maybe the 55 ms is a worst-case number, like your process is guaranteed to get at least that amount of processor time no matter what.
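One way to check what you actually get on a given machine is to spin until each timer's value changes and record the smallest step observed. A rough sketch (QueryPerformanceCounter is omitted because its granularity is simply 1/QueryPerformanceFrequency):

#include <windows.h>
#include <mmsystem.h>   /* link with winmm.lib for timeGetTime */
#include <stdio.h>

/* Busy-wait until the timer's value changes, several times over, and
   return the smallest step seen. DWORD wraparound is ignored here. */
static DWORD SmallestStep(DWORD (WINAPI *readTimer)(void))
{
    DWORD smallest = (DWORD)-1;
    for (int i = 0; i < 20; ++i)
    {
        DWORD t0 = readTimer();
        DWORD t1;
        do { t1 = readTimer(); } while (t1 == t0);
        if (t1 - t0 < smallest)
            smallest = t1 - t0;
    }
    return smallest;
}

int main()
{
    printf("GetTickCount step: %lu ms\n", SmallestStep(GetTickCount));
    printf("timeGetTime step:  %lu ms\n", SmallestStep(timeGetTime));
    return 0;
}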

