
hi! is this how you get the FPS for your game?....

Started by April 19, 2002 06:32 AM
14 comments, last by mickey 22 years, 8 months ago
You get zero because you're doing integer division. When tickDiff / 1000 is evaluated, the type of the result is the same as the type of the operands, which is DWORD. Since there's no way of storing fractional values in a DWORD, the fractional part is discarded and you get zero. Only when the assignment is evaluated is the zero converted to a floating-point 0.0. To do floating-point division, you have to make one of the operands a float before the division takes place.
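For instance, a minimal sketch of the difference (assuming tickDiff is the elapsed milliseconds as a DWORD; the frames value is made up for illustration):

DWORD tickDiff = 500;              // e.g. half a second elapsed
DWORD frames   = 30;               // hypothetical frame count over that interval

float wrong = tickDiff / 1000;     // integer division: result is 0, only then converted to 0.0f
float right = tickDiff / 1000.0f;  // one operand is a float, so the division keeps 0.5f
float fps   = frames / right;      // 60.0f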
---visit #directxdev on afternet <- not just for directx, despite the name
if (timeGetTime() > LastTime + 1000)
{
    FrameRate    = FrameCounter;
    FrameCounter = 0;
    DeltaTime    = 1.0f / (float)FrameRate;  // float division, avoids the integer-division trap above
    LastTime     = timeGetTime();
}



Use FrameCounter++; just after each page flip or Present() call, etc.

SpeedPerFrame = SpeedPerSecond*DeltaTime;

This will work through slow-downs and average out per-frame blips. It's known as 'variable step timing'.
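Put together, it might look roughly like this (just a sketch; the declarations, the SpeedPerSecond value and playerX are made up for illustration, not from Jay's post):

#include <windows.h>
#pragma comment(lib, "winmm.lib")        // timeGetTime() lives in winmm

DWORD LastTime       = timeGetTime();
int   FrameCounter   = 0;
int   FrameRate      = 1;                // start at 1 to avoid dividing by zero
float DeltaTime      = 0.0f;
float SpeedPerSecond = 100.0f;           // hypothetical: 100 units per second
float playerX        = 0.0f;             // hypothetical position

void OnFrame()                           // call once per frame, right after the page flip / Present()
{
    FrameCounter++;

    if (timeGetTime() > LastTime + 1000)
    {
        FrameRate    = FrameCounter;     // frames rendered over the last second
        FrameCounter = 0;
        DeltaTime    = 1.0f / (float)FrameRate;
        LastTime     = timeGetTime();
    }

    // Variable step timing: movement is scaled by frame time, so speed per second stays constant.
    float SpeedPerFrame = SpeedPerSecond * DeltaTime;
    playerX += SpeedPerFrame;
}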

,Jay


Should also mention:

The performance counter is buggy on some chipsets (search MSDN for the list); it's to do with the PCI bridge causing it to jump forward by up to several seconds at once.

The performance counter is not guaranteed to be available on all hardware, although x86 PCs all have one.
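If you do want the performance counter anyway, the usual idiom is to check the return value of QueryPerformanceFrequency first; zero means no high-resolution counter. A sketch (not from anyone's post here):

#include <windows.h>

LARGE_INTEGER freq, start, now;

if (!QueryPerformanceFrequency(&freq))
{
    // No high-resolution counter on this hardware: fall back to timeGetTime().
}
else
{
    QueryPerformanceCounter(&start);
    // ... do a frame's worth of work ...
    QueryPerformanceCounter(&now);
    double elapsedSeconds = (double)(now.QuadPart - start.QuadPart) / (double)freq.QuadPart;
}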

I'd use timeGetTime; it's accurate to approximately 10ms (MS says 1ms) and should be enough for any game.
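One caveat worth adding (my assumption, not something Jay said): timeGetTime's default resolution can be coarser than 1ms on some Windows versions, and calling timeBeginPeriod(1) at startup requests the finer resolution. Something like:

#include <windows.h>
#pragma comment(lib, "winmm.lib")

timeBeginPeriod(1);                      // request 1 ms timer resolution
DWORD start = timeGetTime();
// ... run the game loop ...
DWORD elapsedMs = timeGetTime() - start;
timeEndPeriod(1);                        // restore the previous resolution when done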

,Jay
indirectX, ahh I see, thanks!

Well, thanks so much for all your help, guys! And yeah, when I compared timeGetTime to a real second, it moves slightly faster.
http://www.dualforcesolutions.com
Professional website designs and development, customized business systems, etc.
Are we forgetting GetTickCount()?

CEO Plunder Studios
esheppard@gmail.com
GetTickCount is even less accurate.
---visit #directxdev on afternet <- not just for directx, despite the name

