
SDL_GetTicks

Started by July 18, 2004 04:35 AM
14 comments, last by Ra 20 years, 2 months ago
Quote:
Original post by graveyard filla
I'm facing the same problem too. I want to stay cross-platform but need a high-precision timer. Surely someone out there must have created a simple function / "library" that can do this. I could probably write it myself using preprocessor #if's, but that's a huge pain if someone's already done it for me =)


What for?
Not giving is not stealing.
The alternative event timer is said to be thread-safe; what about the GetTicks timer?
And if it is, is there any reason to use the event timer at all? I don't really like it, since it somehow breaks the code design by taking the timer handling away from the object, which becomes annoying when there are lots of them.
------------------------------------------------------------Jawohl, Herr Oberst!
Quote:
Original post by Lode
the FPS counter switches between 333.33333fps and 250.0000fps constantly, because it sees that the frame either took 3 milliseconds or 4 milliseconds.

Is the only reason you want a high-performance timer to keep the fps from jumping by nearly 100 fps each frame? If that's the case, then I would agree with what other posters have said and average the fps over a few frames.

Personally, I would do something like this:
...globals
int start_time, current_time, fps;
...in main()
start_time = SDL_GetTicks();
...in your Main Game Loop
static int frames_passed = 0;
frames_passed++;
current_time = SDL_GetTicks();
if( current_time - start_time > 1000 )
{
fps = frames_passed;
start_time = current_time;
frames_passed = 0;
}

The code may have some errors in it because I just wrote it off the top of my head. By sampling fps this way, you actually get the number of frames executed each second, instead of basing the fps on each individual frame.
For the timer you want, I suggest you just rip the code out of GLFW: http://glfw.sourceforge.net/
it returns a double
it uses the best timer available
it has every timer I know of
the awesome license (zlib) allows you to go ahead and lift it right out
cross-platform and easy-to-browse source (look in *platform*_time.c - how easy is that)

I would use this in conjunction with averaging frames; framerates vary, and an average is a more useful number in many situations.
The truth is that you don't really need insanely high precision to achieve what you're trying to do. In fact, you can just use a simple timer with a 1 second resolution.

The idea is that you pump out frames until the timer expires and count how many were rendered in that amount of time. If the timer is set to 1 second then nothing needs to be modified and you can use the raw count as the FPS.

EDIT: What JonnyQuest and wyrzy said.
Ra

This topic is closed to new replies.
