
Timing for beginners (minitut)

Started by May 18, 2001 11:16 AM
2 comments, last by myster0n 23 years, 6 months ago
Hi there,

Another newbie here. I have just finished the blending tutorial, and one thing that bothered me was that when I ran the resulting program, the cube rotated slower when I zoomed in. Obviously this is because it takes longer for my PC to render a close-up, and because the program doesn't take the time between two redraws into account. So I've been looking for a way to correct this. The answer is obvious: you need a timer. But which function, which library? That wasn't so obvious to me, and I guess it also isn't obvious for people who, like me, have only limited programming experience. But I found the answer, and I'm about to show it here. If anyone has a better solution, feel free to correct me.

I found a function DWORD timeGetTime(void) in the winmm library that retrieves the system time in milliseconds. First of all, you have to add winmm.lib to your project settings (you know: Project - Settings - Link tab - Object/library modules: add winmm.lib).

Now for the modified code (from tutorials 7-8). We need another include:
  
#include <mmsystem.h>    // declares timeGetTime(); include it after <windows.h>
  
Next we're going to define some global variables: Timebuf, which will contain the old time, and Factor, the factor by which we multiply the rotation speed and z position.
  
DWORD	Timebuf;	// this will contain the old time

float	Factor=0.0f;	// timing factor

  
In the function InitGL we set Timebuf to the current time.
  
Timebuf=timeGetTime();
  
In DrawGLScene, right at the beginning, we check what the current time is and use it to calculate Factor; then we can put the current time into Timebuf:
  
DWORD  Timenow=timeGetTime();
Factor=(float)(Timenow-Timebuf)/20;  // I divide by 20 because it approximates the original speed IMHO

Timebuf=Timenow;
xrot+=xspeed*Factor;
yrot+=yspeed*Factor;
  
I put the xrot and yrot additions there instead of at the end of DrawGLScene because it looks more logical to me that way. Finally we go to WinMain, and we change
  
if (keys[VK_PRIOR])
{
	z-=0.02f;
}
if (keys[VK_NEXT])
{
	z+=0.02f;
}
  
into this
  
if (keys[VK_PRIOR])
{
	z-=0.02f*Factor;
}
if (keys[VK_NEXT])
{
	z+=0.02f*Factor;
}
  
And that's it. I hope it was helpful to at least one person.
I'm sure I'm not the 'one person', but it sure was helpful for me. Thank you for sharing the wisdom!


khrob
_________________________________
I used to be indecisive. Now I'm not so sure...
Not exactly a better solution, but more of a suggestion. timeGetTime() gives you the current time in milliseconds, so the highest precision you can get out of it is 1 millisecond, which might be good enough for most things, but in some cases you may want higher precision, especially if you're running at a high frame rate. At 150 fps the time between frames is 0.0066667 seconds, or 6.6667 milliseconds.
QueryPerformanceCounter() has a much higher resolution. To get the frequency of the counter, you have to use the accompanying function QueryPerformanceFrequency().

Hope this helps !

J.W.
Hey, that's great. Thanks for the info. Like I said in the minitut: the problem is finding the right function. And when you're looking for a timing function, it doesn't often occur to you to look for names like QueryPerformanceCounter. Thanks again. One step closer to world domination.

This topic is closed to new replies.
