Temporal sampling frequency (aka 'framerate')

Published November 19, 2011

Game player: "d00d, teh framerate totally sux0rz!"
Game developer: "we are experiencing aliasing due to low sampling frequency along the temporal axis..."

One of the many decisions that goes into making a game is choosing a framerate. Is it better to run at a high framerate, drawing many frames per second and thus having little time to spend on each, or to choose a lower framerate, get more time per frame, and thus be able to draw more objects at higher quality? The perfect balance is a matter of heated debate, varying with the game, genre, and personal preference, but there are some widely accepted truths:

  • Dropping below 30 fps is rarely a good idea
  • 30 fps is generally considered ok, but 60 is better
  • Higher than 60 fps offers rapidly diminishing returns

Yet movies are animated at a mere 24 fps! Why has Hollywood settled on a framerate so much lower than most game developers consider acceptable?
Perhaps this is just a historical legacy, preserved for backward compatibility with decisions made in the 1920s? But when IMAX was designed in the late 1960s, its creators specified new cameras, film, projectors, and screens, yet kept the 24 fps sampling frequency. And in the early 21st century, Blu-ray and HD DVD fought an entire format war during the transition to high definition digital video, but oh look, still 24 fps. It sure looks like the movie world just doesn't see any reason to go higher, similar to how few game developers care to go above 60 fps.
Ok, next theory: perhaps the difference is because games are interactive, while movies are just prerecorded entertainment? The lower the framerate, the more latency there will be between providing an input and seeing the resulting change on screen. The pause button on my DVR remote has ~0.5 sec latency, which is irrelevant when watching my favorite romantic comedy but would be a showstopper when trying to nail a Halo headshot.
And a final theory: perhaps the difference is due to aliasing? Realtime graphics are usually point sampled along the time axis, as we render individual frames based on the state of the game at a single moment in time, with no consideration of what came before or what will happen next. Movies, on the other hand, are beautifully antialiased, as the physical nature of a camera accumulates all light that reaches the sensor while the shutter is open. We've all seen the resulting blurry photos when we try to snap something that is moving too quickly, or fail to hold the camera properly still. Motion blur is usually considered a flaw when it shows up uninvited in our vacation snapshots, but when capturing video it provides wonderfully high quality temporal antialiasing.
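To make that accumulation idea concrete, here is a minimal sketch of temporal antialiasing by averaging sub-frame samples across the interval a shutter would be open, instead of sampling the scene at a single instant. This is only a toy CPU illustration (a 1-D scene, with made-up sizes, speeds, and sample counts), not a production motion blur technique.

```cpp
// Toy illustration of temporal antialiasing by accumulation.
// A 1-D "scene" (a bright box moving across a row of pixels) is rendered
// two ways: point sampled at one instant, and averaged over several
// sub-frame samples spread across an idealized open-shutter interval.
#include <cstdio>
#include <vector>

const int   kWidth    = 32;     // pixels in our 1-D framebuffer
const float kBoxWidth = 4.0f;   // width of the moving box, in pixels
const float kSpeed    = 8.0f;   // pixels the box moves per frame

// Brightness of one pixel given the box position at a single instant.
float SampleScene(int pixel, float boxLeft)
{
    return (pixel >= boxLeft && pixel < boxLeft + kBoxWidth) ? 1.0f : 0.0f;
}

// Point sampled: the whole frame uses the scene state at time t.
std::vector<float> RenderPointSampled(float t)
{
    std::vector<float> frame(kWidth);
    float boxLeft = t * kSpeed;
    for (int x = 0; x < kWidth; ++x)
        frame[x] = SampleScene(x, boxLeft);
    return frame;
}

// Accumulated: average many samples spread across [t, t + shutter),
// the way film integrates light for as long as the shutter stays open.
std::vector<float> RenderAccumulated(float t, float shutter, int numSamples)
{
    std::vector<float> frame(kWidth, 0.0f);
    for (int s = 0; s < numSamples; ++s)
    {
        float subT    = t + shutter * (s + 0.5f) / numSamples;
        float boxLeft = subT * kSpeed;
        for (int x = 0; x < kWidth; ++x)
            frame[x] += SampleScene(x, boxLeft);
    }
    for (float& p : frame)
        p /= numSamples;
    return frame;
}

void Print(const char* label, const std::vector<float>& frame)
{
    printf("%-14s", label);
    for (float p : frame)
        putchar(p > 0.66f ? '#' : p > 0.0f ? '+' : '.');
    putchar('\n');
}

int main()
{
    // One frame of an imaginary animation, with a 50% (180 degree) shutter.
    Print("point sampled", RenderPointSampled(1.0f));
    Print("accumulated",   RenderAccumulated(1.0f, 0.5f, 16));
}
```

The accumulated version smears the box across the pixels it passed through while the shutter was open, which is exactly the blur a real camera would record.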
So which theory is correct? Do games care more than movies about framerate because of latency, or because of aliasing?
We can find out with a straightforward experiment. Write a game that runs at 60 fps. Make another version of the same game that runs at 30 fps. Make a third version at 30 fps with super high quality temporal antialiasing (aka motion blur). Get some people to play all three versions. Get more people to watch the first people playing. Compare their reactions.

  • If aliasing is a significant factor, the watchers will be able to distinguish 60 fps from 30 fps, but unable to distinguish 60 fps from motion blurred 30 fps
  • If latency is a significant factor, people playing the game will be able to distinguish 60 fps from motion blurred 30 fps, even though the watchers cannot
  • If everybody can tell the versions apart, both theories must be wrong
  • If nobody can tell any difference, we might as well just leave the whole game at 30 fps and be done with it :-)

If you try this experiment, you will find the results depend on which game you choose to test with. Many observers do indeed think motion blurred 30 fps looks the same as 60 fps, so temporal aliasing is surely important. Players also find the two equivalent with some games, while reporting a big difference with other games. So the significance of latency depends on the game in question.
Sensitivity to latency is directly proportional to how hands-on the input mechanism is. When you move a mouse, even the slightest lag in cursor motion will feel very bad (which is why the mouse has a dedicated hardware cursor, allowing it to update at a higher framerate than the rest of whatever app is using it). Likewise for looking around in an FPS, or pinch zooming on a touch screen. You are directly manipulating something, so you expect it to respond straight away and the motion to feel pinned to your finger. Less direct control schemes, such as pressing a fire button, moving around in an FPS, driving a vehicle, or clicking on a unit in an RTS, can tolerate higher latencies. The more indirect things become, the less latency matters, which is why third person games can often tolerate lower framerates than would be acceptable in an FPS.
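To put rough numbers on that, end-to-end latency is (to a first approximation) some number of pipelined frames multiplied by the frame time, so halving the framerate roughly doubles it. The sketch below is back-of-the-envelope arithmetic; the three-frame pipeline depth is an assumed illustration value, not a measurement of any real engine or display chain.

```cpp
// Back-of-the-envelope input-to-display latency estimate.
// The pipeline depth (frames between sampling an input and seeing its
// effect on screen) is an assumed illustration value, not a measurement.
#include <cstdio>

double FrameTimeMs(double fps)
{
    return 1000.0 / fps;
}

int main()
{
    const int kPipelineDepthFrames = 3;   // assumption: input -> simulate -> display
    const double rates[] = { 60.0, 30.0, 24.0 };

    for (double fps : rates)
    {
        double frameMs   = FrameTimeMs(fps);
        double latencyMs = kPipelineDepthFrames * frameMs;
        printf("%4.0f fps: %5.1f ms per frame, roughly %5.1f ms from input to display\n",
               fps, frameMs, latencyMs);
    }
}
```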
What can we learn from all this rambling?


  • If we have a game at 60 fps, and are trying to find room to add more sophisticated graphics, an interesting option might be to drop down to 30 fps while adding motion blur. As long as we can implement a good blur for less than the cost of drawing one 60 fps frame, we may be able to achieve equivalent visual quality while freeing up a bunch of GPU cycles (a back-of-the-envelope sketch of this budget follows the list).
  • If we have a game at 30 fps or lower, we should avoid the sort of input behaviors that will make this latency objectionable to the player. Conversely, if we have a game that uses only the sort of input not sensitive to latency, there is less point bothering to make it run at 60 fps!
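Here is the back-of-the-envelope arithmetic behind that first bullet, with a made-up blur cost standing in for whatever a real implementation would actually measure:

```cpp
// Rough GPU budget for trading 60 fps for 30 fps plus motion blur.
// kSceneCostMs assumes the current scene fills an entire 60 fps frame;
// kBlurCostMs is a hypothetical cost for the blur pass (illustration only).
#include <cstdio>

int main()
{
    const double kBudget60Ms  = 1000.0 / 60.0;   // ~16.7 ms per frame at 60 fps
    const double kBudget30Ms  = 1000.0 / 30.0;   // ~33.3 ms per frame at 30 fps
    const double kSceneCostMs = kBudget60Ms;     // assume the scene uses the whole 60 fps budget
    const double kBlurCostMs  = 5.0;             // hypothetical motion blur cost

    double freedMs = kBudget30Ms - kSceneCostMs - kBlurCostMs;

    printf("60 fps frame budget:           %5.1f ms\n", kBudget60Ms);
    printf("30 fps frame budget:           %5.1f ms\n", kBudget30Ms);
    printf("scene + blur at 30 fps:        %5.1f ms\n", kSceneCostMs + kBlurCostMs);
    printf("left over for fancier visuals: %5.1f ms per frame\n", freedMs);
}
```

As long as the blur pass costs less than one 60 fps frame's worth of time, the 30 fps frame still comes out ahead.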

Yeah, yeah, so I should talk about how to actually implement motion blur. Next time...
