GLFW: Two questions about glfwSwapInterval

Started by March 08, 2019 03:22 AM
3 comments, last by lawnjelly 5 years, 8 months ago

If I understood the documentation correctly: glfwSwapInterval sets the number of screen updates to wait before swapping the buffers.

Question #1

While profiling, I noticed that glfwSwapBuffers takes almost 16 ms, and my game occasionally drops a few noticeable frames. If I restart the game, I sometimes get a solid, smooth experience, and other times I get random dips in my framerate. I also read that glfwSwapInterval is set to 1 by default, so I'm guessing those 15.x ms are due to the default interval.

On the other hand, when I set the interval to 0, glfwSwapBuffers() only takes 0.1 ms, and my game runs smoothly every time, no matter how many times I restart it. I'm enjoying the smooth framerate I'm getting now, but isn't it recommended to set the interval to 1 so you only update when the screen is ready?

Question #2 (also related to glfwSwapInterval)

I noticed that if I set it (once) to 0 outside of my game loop, the CPU usage rises a lot more than usual. But if I set it at the beginning of every frame, the CPU usage is much lower and stable. It seems to me that I should set it to 0 within my game loop, but am I using it right?

Thanks!

Btw, maybe not too related, but I'm using a fixed-step update loop and interpolating frames at render time using the leftover accumulated delta time. Just in case it helps.

Somebody at the GLFW forums helped out with some explanations. I'll share what I found, just in case somebody is questioning the same things.

You should not call glfwSwapInterval every frame. In fact, an interval of 1 should be the default, and the user should be able to toggle it on or off.

So my question about why the function uses more CPU time in different parts of the game loop doesn't even matter; just call the function once.

I still don't know why my game stutters when the interval is set to 1 though, but if I go fullscreen the stuttering goes away. I'll work with that for now.

Some useful links: glfwSwapInterval and buffer swapping.

 

The problem could be related to the timing code in your game loop and how it interacts with the windowing system.

I'd read https://medium.com/@tglaiel/how-to-make-your-game-run-at-60fps-24c61210fe75 and compare with your code.

 

Turning off vsync will make the render loop run as fast as possible, which will result in very high CPU usage. If you print out the fps, you might find it is rendering at 400+ fps.

With vsync on, you are assuming vsync is doing what you think (i.e. your game is being given time to run every 1/60th of a second). In fullscreen this tends to roughly work, but in windowed mode different operating systems can give wildly different behaviour, and 3D rendering has to interact with the window compositor (shadows, etc.).

First, I'd advise isolating whether the problems are due to a bug in your fixed-timestep interpolation or to fullscreen/windowed mode issues; these are two totally separate things. You can do this by simply moving a box across the screen, advancing it with raw delta time instead of your fixed timestep. Provided your interpolation is bug-free, you will often find the problem occurs even when simply using delta time.

There is some discussion of the issue here, towards the end of the article; follow the links and, in particular, read the comments:

A key point is that requesting the time from the OS during your loop gives you the submission time of a frame, NOT the render time. So in many cases you cannot trust spot timings taken during your render loop; or rather, they are not directly linked to how far you might advance your simulation.

You can also have issues when you are on the boundary of dropping frames, where the difference between the submit and render times can be wildly variable.

This topic is closed to new replies.
