
Frame rate limiting?

Started by August 26, 2003 03:28 PM
5 comments, last by m_e_my_self 21 years, 6 months ago
Currently my program uses 99% of the CPU, so I wanted to try to limit the FPS. My current code is:

In the beginning:

    LastTime = timeGetTime();

Then each pass of the loop:

    if (timeGetTime() - LastTime >= 17)
    {
        // ... draw scene ...
        LastTime = timeGetTime();
    }

timeGetTime returns the time since startup in milliseconds, so drawing every 17 ms should mean about 60 fps, right?

When I implement it, I get a flickering image, but not just flickering: parts of the model show for a second, then other parts, but never the whole model. The more I increase the 17, the more it flickers (fewer fps); the more I decrease it, the less. The first problem is that it still uses 99% CPU! The second problem is the flickering... Is there a way to fix it, or do I have to go for another solution?
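A likely cause of the partial frames, though the full loop isn't shown here: if the screen clear or the buffer swap runs on every pass of the loop while the drawing only happens inside the timing check, half-finished frames get presented. Below is a minimal sketch that keeps clear, draw, and swap together, assuming a double-buffered Win32/OpenGL setup; Frame(), hDC, and the DrawGLScene() prototype are placeholders for your own loop, device context, and draw routine:

    #include <windows.h>
    #include <GL/gl.h>
    #pragma comment(lib, "winmm.lib")   // timeGetTime() lives in winmm

    int DrawGLScene(GLvoid);            // your existing draw function

    DWORD LastTime = 0;

    void Frame(HDC hDC)                 // called every pass of the main loop
    {
        if (timeGetTime() - LastTime >= 17)   // ~60 fps
        {
            // Clear, draw, and swap as one unit. If the clear or the
            // SwapBuffers call runs on passes where the draw is skipped,
            // the screen shows half-finished frames (the flickering).
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            DrawGLScene();
            SwapBuffers(hDC);
            LastTime = timeGetTime();
        }
    }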
Don't worry about 99% CPU, it's perfectly normal.

Just keep in mind to make the app idle when it's, say, minimized.
The problem is that you're still looping continually even if you're not drawing every time, i.e. your CPU will loop as fast as it possibly can, using 100% of its resources.
The fastest way to reduce CPU usage is to throw a Sleep(1); somewhere in your loop. But like llyod says, you only need to reduce CPU usage when the program is minimized or, if you want, when windowed.
_______________________________________Pixelante Game Studios - Fowl Language
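A minimal sketch of where that Sleep(1) might go, assuming a typical Win32 message pump of the time; the active flag (kept updated from WM_ACTIVATE in your window procedure) and the RunMainLoop name are assumptions, not the poster's actual code:

    #include <windows.h>
    #include <GL/gl.h>

    extern bool active;        // assumed flag, updated in WM_ACTIVATE
    int DrawGLScene(GLvoid);   // the existing draw function

    void RunMainLoop(void)     // hypothetical name for the main loop
    {
        MSG  msg;
        bool done = false;

        while (!done)
        {
            // Drain pending window messages without blocking.
            while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
            {
                if (msg.message == WM_QUIT)
                    done = true;
                TranslateMessage(&msg);
                DispatchMessage(&msg);
            }

            if (active)
            {
                Sleep(1);      // hand the rest of the timeslice back to the OS
                DrawGLScene();
            }
            else
            {
                Sleep(100);    // minimized or inactive: barely tick at all
            }
        }
    }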
Yeah, I read that it was normal, but I want to be able to do other things while it's running. About the Sleep function... is the argument in ms or s? And when 'sleeping', does the program still show the last rendered screen?
I think it's ms. I never really checked, I just put in 1 and my fps stays around 64, so that's good enough for me :D And yes, it will show the last rendered frame.
_______________________________________Pixelante Game Studios - Fowl Language
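For the record, Sleep() does take milliseconds, and on a default Windows timer it usually rounds up to the scheduler tick of roughly 15.6 ms, which is likely why Sleep(1) in a loop lands near 64 fps (1000 / 15.6 ≈ 64). A minimal illustration:

    #include <windows.h>

    int main(void)
    {
        Sleep(1);     // pauses at least 1 ms, often a full ~15.6 ms timer tick
        Sleep(1000);  // pauses for roughly one second
        return 0;
    }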
I still have some problems...
    #define MaxFPS 17  // 1000 milliseconds divided by how many fps you want max (about 60 here)

    float StartTime = GetTickCount() * 0.001f;
    int drawnFrames = 0, movedFrames = 0;

    bool UpdateLFPS(void)  // returns true if it's not going too fast and it's allowed to move,
    {                      // false if there are too many fps and it shouldn't move the stuff
        char text[50];
        float temp = GetTickCount() * 0.001f;  // convert it to seconds

        ++drawnFrames;
        if (temp - StartTime > 1.0f)  // if more than a second passed, update the fps in the title bar
        {
            sprintf(text, "%d fps", drawnFrames);
            SetWindowText(hWnd, text);
            drawnFrames = 0;
            movedFrames = 0;
            StartTime = temp;
        }
        if ((temp - StartTime) * 1000 < MaxFPS * movedFrames)
        {
            // If fewer milliseconds have passed this second than the current
            // number of moved frames should have taken, don't move.
            return false;
        }
        return true;
    }


and then

    int DrawGLScene(GLvoid)
    {
        // Set up OpenGL, cameras etc.
        // Draw everything
        if (UpdateLFPS())
        {
            ++movedFrames;
            // Move the things here!
        }
        return TRUE;
    }

It may not be perfect, but it doesn't flicker, it shows the actual fps, and it makes all movement equally fast on all PCs (unless they're too slow, of course).

There's my code.
The problem is that, despite the fact that a fast enough PC does 60 moves a second, it's still not equally fast :S. If I make it log every time it moves stuff, it's 59-60 times a second on both PCs I tried it on. Yet moving an object from one side of the screen to the other takes 10 seconds on one PC and 20 on the other? Any help?

[edited by - m_e_my_self on September 6, 2003 5:29:34 PM]
Alright, I finally got it.
The code I posted above is correct; it was the loop that catches keystrokes and my physics engine that caused the difference.
So now I finally have the same speed on all PCs :D
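For anyone reading this later: the other common way to get identical speed on every PC is to scale each movement by the elapsed time instead of counting moves per second. This is not what the poster did, just an alternative sketch; MoveThings, position, and speed are placeholder names:

    #include <windows.h>

    DWORD prevTime = 0;   // set to GetTickCount() once at startup

    void MoveThings(float *position, float speed)
    {
        DWORD now = GetTickCount();
        float dt  = (now - prevTime) * 0.001f;  // seconds since the last update
        prevTime  = now;

        // speed is in units per second, so the distance covered is the
        // same no matter how many updates actually run per second.
        *position += speed * dt;
    }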

