
high fps == bad animation?

Started by May 13, 2002 04:27 PM
26 comments, last by endo 22 years, 9 months ago
I have just upgraded my OS to Windows XP and now I can see how bad my game is. After struggling for ages to get smooth animation in the paddle of my breakout game, it seems my really low frame rate disguised a major problem: the paddle jumps when it first starts to move and only then moves smoothly, which looks terrible. Any ideas how I can fix this? The animation currently works like this:

1. The user pushes left or right, which causes a call to movePaddle, see below.
void Paddle::movePaddle( GLfloat movement, DIRECTION dir )
{
	move = movement;	// store the requested movement amount...
	direction = dir;	// ...and direction; the timer callback applies them
}

2. movePaddle updates class data members which are used to update the position of the paddle in the timer callback function (GLUT), like so:
if( !left || !right )
{
	paddle.currentLoc->triple[ 0 ] += paddle.move * paddle.direction;
	paddle.move -= 0.2f;
}

This appears to cause a very rigid movement which looks terrible, but it is not so bad at lower frame rates. Any advice would be greatly appreciated, and feel free to ask more questions, because I'm sure the code above isn't the clearest in the world.
You need to use timers to calculate the speed of your object; then it is FPS independent.
quote:
Original post by Rickwi22
You need to use timers to calculate the speed of your object; then it is FPS independent.


yep.

example equation:
position = velocity * timeSinceLastLoop + position

i think i'm going to hotkey that equation. i feel like i type it on these forums at least 2x/week.

-me
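
A minimal sketch of that equation as code (C++, hypothetical names; velocity is in units per second and timeSinceLastLoop is in seconds):

// Frame-rate-independent movement: the distance covered depends on how much
// real time passed since the last loop, not on how many frames were drawn.
float position = 0.0f;
float velocity = 5.0f;                  // units per second

void update( float timeSinceLastLoop )  // seconds since the previous loop
{
    position = velocity * timeSinceLastLoop + position;
}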
if you ever read something about frame-based vs. time-based, this is why. your code is frame-based: it assumes a fixed frame rate and so makes a fixed position change of 0.2 units every frame. however, the time from one frame to the next can vary. if you use a timer like what was previously mentioned, the animation becomes independent of how long each frame takes. with time-based animation, you create a timer, take a snapshot of the current time, and then on the next frame take another snapshot; the difference between the two is the time that passed between frames, called the "delta" time. take your -= n; and make it -= n * dtime; and it will move smoothly, independent of machine speed.

a2k
------------------General Equation, this is Private Function reporting for duty, sir!a2k
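
Since the original poster is using GLUT, one way to take those time snapshots is glutGet( GLUT_ELAPSED_TIME ), which returns the number of milliseconds since glutInit. A rough sketch (hypothetical names, registered with glutIdleFunc; n becomes units per second):

#include <GL/glut.h>

static int lastTime = 0;                         // milliseconds at the previous frame

void idle()
{
    int now = glutGet( GLUT_ELAPSED_TIME );      // milliseconds since glutInit
    float dtime = ( now - lastTime ) / 1000.0f;  // "delta" time, in seconds
    lastTime = now;

    // frame-based:  position -= n;
    // time-based:   position -= n * dtime;      // n is now units per second

    glutPostRedisplay();                         // ask GLUT to redraw with the new state
}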
the key here is: don't push the game frame every chance you get. push it 10 times a second, for example. then your game will play with a steadier animation. i would contend more games use a server-frame velocity than a time-elapsed velocity. it's good and reliable, it just has to be implemented right.





[edited by - declspec on May 13, 2002 7:33:51 PM]
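
A sketch of that approach: step the game logic at a fixed 10Hz while rendering as often as the machine allows (hypothetical names; an accumulator loop, not taken from the posts above):

void UpdateGamePlay();               // advance the game state by one fixed step (assumed)
void Render();                       // draw the current state (assumed)

const float GAME_STEP = 0.1f;        // fixed game frame: 10 updates per second
static float accumulator = 0.0f;

void frame( float dtime )            // dtime = real seconds since the last render
{
    accumulator += dtime;
    while( accumulator >= GAME_STEP )
    {
        UpdateGamePlay();            // velocities can be expressed per game frame
        accumulator -= GAME_STEP;
    }
    Render();                        // rendering is not tied to the 10Hz step
}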
but limiting the frame rate means that with my new 120hz monitor and badass graphics card i can still only get 60fps, and thus the animation isn't as smooth as it could be on my machine. the difference between 60 and 120hz is _definitely_ noticeable.

it may be the case that there are a great number of games using frame-limited animation, but that doesn't mean it's the better way.

i want whoever plays my game to get the highest possible frame rate their system allows, so that my game will look as good as possible on whatever machine it plays on.

the implementation of time-based movement is so easy and the payoff is greater than with frame-limited movement. i don't understand why you'd go for frame-limited movement other than not knowing how to implement it. it's literally one additional line of code and one additional parameter to add to your move function.

and you can guarantee that everything moves at the same rate on any system. frame-limited movement doesn't do anything to take care of people who get LESS than your desired framerate. if i'm running through a 3D map and all of a sudden my framerate drops for a couple of seconds below your frame limit, it looks awful if my character slows down.

use time-based movement.

-me
doing velocity by game-frame ticks doesn't imply you render only once per game frame. i'm still thinking about this. not disagreeing.

look, when i was modding q2 i know for a _fact_ that game ran internally at 10 frames per second. all velocities were expressed in frames.

the key question, however: it seems like you would notice if something only changed position ten times a second? so i'm still debating what my actual experience was.

maybe q2's engine interpolated between server frames. i'm still thinking.



[edited by - declspec on May 13, 2002 8:02:52 PM]
Q2's game code runs at 10fps, however the graphics do not. When something moves, its movement is lerped... as are model animations. This gives the effect of smoother movement throughout the game.

It's basically an affair like:

if (one tenth of a second has passed)
    UpdateGamePlay();
UpdateGraphics();

-----------------------
"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else''s drivers, I assume it is their fault" - John Carmack
-----------------------"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault" - John Carmack
quote:
Original post by a2k
if you ever read something about frame-based vs. time-based, this is why. your code is frame-based: it assumes a fixed frame rate and so makes a fixed position change of 0.2 units every frame. however, the time from one frame to the next can vary. if you use a timer like what was previously mentioned, the animation becomes independent of how long each frame takes. with time-based animation, you create a timer, take a snapshot of the current time, and then on the next frame take another snapshot; the difference between the two is the time that passed between frames, called the "delta" time. take your -= n; and make it -= n * dtime; and it will move smoothly, independent of machine speed.

a2k



I know about keyframe animation but am confused as to how this is set up. I am currently using GLUT with glutIdleFunc(...) running the animations, but this doesn't account for the passage of time like you say. I now have a simple Timer class based on the multimedia or performance timer functions, and this has getTime(...) and reset(...) functions, as is sensible. But how do I use it to implement keyframe animation? Do I put a timer in the idle callback function? Or inside the main rendering function?
What you need to do is extract the time passed in milliseconds during the last pass. So:

GetTime
Update
Render
GetTime

Calculate the difference between the two GetTime calls and you have your elapsed_time. (You could use the average of the last 5 frames instead... your choice.)

Now, for the keyframe animation there's something like this:

RenderModel(frame, frameNext, interpolation).

and a function that runs every frame (or pass):

interpolation += x;
if (interpolation>1) {interpolation=0; increase frame;}

You need to change it to:

interpolation += x*elapsed_time;

Make sure you adjust x! This should result in machine-independent, smooth animation at the maximum framerate.
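
Putting those pieces together, a rough sketch (hypothetical names; RenderModel, the keyframe count, and the value of x are assumptions; elapsed_time is in milliseconds as above):

void RenderModel( int frame, int frameNext, float interpolation );  // assumed, as in the post above

const float x = 0.005f;            // keyframes per millisecond (5 per second, for example)
const int   NUM_FRAMES = 40;       // assumed number of keyframes in the model

static float interpolation = 0.0f;
static int   frame = 0, frameNext = 1;

void Animate( float elapsed_time ) // milliseconds since the last pass
{
    interpolation += x * elapsed_time;
    while( interpolation > 1.0f )
    {
        interpolation -= 1.0f;     // keep the fractional part rather than resetting to 0
        frame = frameNext;
        frameNext = ( frameNext + 1 ) % NUM_FRAMES;
    }
    RenderModel( frame, frameNext, interpolation );
}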

Sander Marechal [Lone Wolves] [Hearts for GNOME] [E-mail] [Forum FAQ]
