How to do the main loop?
I've seen that a common way to update your game objects' state is to calculate the time passed since your last update and increment the position (or whatever) by multiplying the speed by the amount of time elapsed.
Now, I'm planning to move a great deal of objects, and I thought of having a timer that, at a specific rate (i.e., 25 times per second), jumps to a routine that updates the objects using a constant value.
Question: Is this a good approach? How do RTS games usually update object states? Has anyone seen articles on the subject, or does anyone know of alternative/better ways?
it is common to have an infinite message loop and a timer function that calls the object update functions (blitting, animation, sound, everything) every tick. if, for instance, you have a list of classes with the base class TObject, including, say, TCharacter, TPlayer, TAnimObj, whatever, each having their own update function, a sample timer function would look like this:
void Timer()
{
    // walk the object list: update everything, then draw whatever is visible
    for (TObject *obj = ObjList; obj; obj = obj->Next)
    {
        obj->Update();
        if (obj->isVisible())
            obj->Draw();
    }
}
--
Float like a butterfly, bite like a crocodile.
With all due respect: aren't timers very slow?
Maybe it's best to calculate the time passed since the last procedure call, and then multiply the distance by the time passed.
Also: if you use a timer for every object... they can all draw themselves at the same time, and sending your display buffer to the monitor can occur halfway through drawing the new object positions...
Correct me if I'm wrong...
timers? slow? i'm not sure i understand.
the drawing occurs in the timer function as well, after everything is updated, which also has the benefit of giving you a constant framerate.
--
Float like a butterfly, bite like a crocodile.
Timers are not that accurate either way. I think the approach you are trying will not work as well as the traditional way.
!!!!!!!!!!!!! Arg!! No no no. Please note that the following is only my own opinion; however...
Timers SUCK! How the hell do you do them so that they work?
Basically, you end up with something like:
get time
do stuff
get time again
Now either: CRAP! We were too slow... um... well... just keep going and hope it doesn't screw up...
Or: it was too fast! OK, either:
while (not right time)
{ do bugger all }
(wastes CPU; bad)
or:
sleep (till we should be now)
(VERY BAD, esp. if you're writing on a system like NT, which does process priorities; as far as I'm aware, sleeping for short periods will not be accurate (the best I've seen is to the nearest ~15 ms)).
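A sketch of the kind of loop I mean, so we're all talking about the same thing (GetTimeMs and DoFrame are hypothetical placeholders for your clock and per-frame work, not any particular API):

// the fixed-interval loop being criticised above
long GetTimeMs();                    // hypothetical: wall-clock time in ms
void DoFrame();                      // hypothetical: one update + draw pass

const long INTERVAL_MS = 40;         // 25 updates per second

void FixedIntervalLoop()
{
    long next = GetTimeMs() + INTERVAL_MS;
    for (;;)
    {
        DoFrame();
        long now = GetTimeMs();
        if (now > next)
        {
            // "we were too slow": the frame overran the interval;
            // just keep going and hope it doesn't screw up
            next = now + INTERVAL_MS;
        }
        else
        {
            // "it was too fast": busy-wait until the next tick
            // (this is the branch that wastes CPU)
            while (GetTimeMs() < next) { /* do bugger all */ }
            next += INTERVAL_MS;
        }
    }
}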
Ok, I admit, I'm very used to coding on Linux, where this is a serious issue; and on Windows it's not quite soooo bad; but NO! Do not use a timer!
I don't quite understand what goltrpoat wrote... but basically the second way is better. Trust me on this; I've been writing an RTS myself, and the other way is _much_ more of a hassle to do.
As to the frame rate... well, basically, I can't see how you can do that. If you have a fixed time interval, your time frame MUST be as large as the maximum possible time to draw that frame... and as polygons and screen detail increase, this number increases!
So... how do you have a fixed time to draw the frame?
NB: timers _also_ suck because you have to calculate the optimum time interval for any particular machine you run on: because of the massive variation in machine speed, the time interval will be different for each machine...
However, do as you please. Basically, people use both. As near as I can tell, though, the "traditional" approach (which, incidentally, according to an article I've read somewhere, is actually the newer approach, while the timer idea is the old-fashioned one...) works better and is much easier to code.
Hope what I've said helped. =)
Don't use timers, and whatever you do, DON'T use a delay loop! It's a complete waste of time - literally!
The best way is to get the time (using the performance counter, or the multimedia timer on systems with no performance counter) at the start of every frame. Also record the time at the start of the previous frame, and then calculate the difference. This gives how long the last frame took to execute:
OldTime=CurrentTime;
CurrentTime=[Function to get the time from the performance counter];
FrameTime=CurrentTime - OldTime;
Now you use this 'FrameTime' to determine how far the objects have moved in the last frame, and also how much they have accelerated, etc. Here is some code which is approximately correct (it will do for most games). Note the accumulation: each frame adds to the velocity and position rather than replacing them:
ObjectVelocity += ObjectAccel * FrameTime;    // integrate acceleration
ObjectPosition += ObjectVelocity * FrameTime; // integrate velocity
NOTE: ObjectAccel, ObjectVelocity and ObjectPosition are either vectors or their individual components.
This method automatically makes the game run at the same speed on all machines. The faster the machine, the smoother the motion (i.e., higher FPS). And it doesn't waste any processor time!
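To make that concrete, here is a minimal sketch of such a loop using the Win32 performance counter (GameIsRunning, UpdateObjects and Render are hypothetical placeholders for your own code):

#include <windows.h>

// hypothetical placeholders for the game's real work
bool GameIsRunning();
void UpdateObjects(double frameTime);
void Render();

void MainLoop()
{
    LARGE_INTEGER freq, oldTime, currentTime;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&oldTime);

    while (GameIsRunning())
    {
        QueryPerformanceCounter(&currentTime);
        // elapsed wall-clock time for the last frame, in seconds
        double frameTime = double(currentTime.QuadPart - oldTime.QuadPart)
                         / double(freq.QuadPart);
        oldTime = currentTime;

        UpdateObjects(frameTime);   // e.g. velocity += accel * frameTime
        Render();
    }
}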
ok, i think there's a certain amount of confusion reigning here regarding timers. here's what i mean by a timer function: a callback that gets called automatically (read: by operating system or BIOS services) n times a second. trapping interrupt 1Ch under DOS does that, or SetTimer() under win32.
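for the win32 case, a minimal sketch of what i mean (UpdateAndDrawObjects is a placeholder for the update/draw walk shown earlier):

#include <windows.h>

// hypothetical placeholder for walking the object list
void UpdateAndDrawObjects();

// TIMERPROC callback: the OS calls this roughly every uElapse ms
VOID CALLBACK TimerProc(HWND hwnd, UINT uMsg, UINT_PTR idEvent, DWORD dwTime)
{
    UpdateAndDrawObjects();
}

int main()
{
    // fire about 25 times per second (40 ms interval)
    SetTimer(NULL, 0, 40, TimerProc);

    // the "infinite message loop": WM_TIMER messages dispatched
    // here end up invoking TimerProc
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return 0;
}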
there IS a word for what shadow mint wrote, but it's not "timer," it's "shooting yourself in the foot with a .45 hollowpoint." ok, that's more than just a word.
--
Float like a butterfly, bite like a crocodile.
and, to everyone who is so vehemently opposed to timers, i would like to say this - (a) how many commercial game titles have you worked on, (b) have you ever given any thought to latency issues in multiplayer and (c) have you ever had to synchronize to sound?
--
Float like a butterfly, bite like a crocodile.
There is a minimum time required to do all the calculations needed per frame. The method I gave above adds only the overhead of a single reading of the performance counter, so it will take a time that is very close to this minimum, and this means a MINIMUM response time to network traffic and player input.
Timers ARE useful for SOME parts of a game, e.g. synchronising sound to a prerendered animation, or when something MUST happen at a FIXED interval. The main game loop, object movement, network packet processing, general sound effects, and rendering are NOT in this group. You want them done as often as possible! (Well, perhaps not the rendering, but you can just test the 'FrameTime' and if it is below a certain value (corresponding to a maximum FPS) then only render every other frame.)
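One way to write that frame-rate cap, as a sketch (the 60 FPS threshold and the function names are placeholders, not from any particular engine):

// render only when enough time has accumulated; logic still runs
// every iteration
void UpdateObjects(double frameTime);       // hypothetical
void Render();                              // hypothetical

const double MIN_FRAME_TIME = 1.0 / 60.0;   // cap rendering at 60 FPS

void Frame(double frameTime)
{
    static double sinceLastRender = 0.0;

    UpdateObjects(frameTime);               // game logic every frame
    sinceLastRender += frameTime;
    if (sinceLastRender >= MIN_FRAME_TIME)  // time to draw again?
    {
        Render();
        sinceLastRender = 0.0;
    }
}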
Timers introduce an extra overhead, and they require that you know IN ADVANCE how long the calculations in the main game loop are going to take. If you guess too big, you waste time after the game loop; and if you guess too small, the timer fires partway through your game loop and you miss it (meaning you have to wait for the next tick, again wasting time)! Trying to adapt the timer to match the game-loop execution time will do EXACTLY the same as the method I gave above, only with the extra overhead of timers. Think about it!