
Why is this so slow?

Started by October 22, 2004 11:06 PM
9 comments, last by backbone 20 years, 1 month ago
Why am I getting such a bad FPS? My application only draws 2 quads to the screen and loads 3 64x64 textures, so why am I only getting 4 frames per second? My video card is a 32MB NVIDIA, and I'm running in 24-bit color at a resolution of 1024x768. How can I get a better FPS? I know changing the resolution and color mode will help, but are there any other ways?
Yeah, that has been happening to me too. I made 2 simple games, nothing special: 2D, a few textures, blending, and that's it!
They run smoothly on my computer, but on some others, some of which have more powerful CPUs, the games run very, very slowly. I made a timer: drawing a scene takes about 1 ms on my computer and about 160 ms on the others. So I'd like the answer to your question too. Anybody? Please!?
(2B)||(!2B)
First of all, you might want to check if your FPS counter is working properly (test on another computer, for instance).
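Something like this is hard to get wrong (a minimal sketch in plain Win32 C, since that's easy to drop into any test app; CountFrame is just an illustrative name, call it once per frame, and link winmm.lib for timeGetTime):

#include <windows.h>

static DWORD lastTime = 0;   /* start of the current 1-second window */
static int   frames   = 0;   /* frames rendered in that window       */
static float fps      = 0.0f;

void CountFrame(void)
{
    frames++;
    DWORD now = timeGetTime();          /* milliseconds since boot  */
    if (now - lastTime >= 1000)         /* a full second has passed */
    {
        fps      = frames * 1000.0f / (float)(now - lastTime);
        frames   = 0;
        lastTime = now;
    }
}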

Second, you might be lacking proper graphics drivers (are you running WinNT/2k/Linux?); try downloading new ones from here: NVidia Drivers
Also, some 3D cards have a hard time rendering blended polygons up close.

I see it in games that use a lot of particle effects (which are blended most of the time): the framerate drops once you get too close to the particles.

You might want to post the code you are using to draw your scene.
Thanks for the help. The FPS counter is accurate to within 2 frames, I think. I tested the application on other systems, and here are the results (all tests done on Windows XP at a resolution of 1024x768):

my computer:
NVIDIA Riva TNT 32MB video card, 4 FPS, color depth: 32-bit

computer number two:
onboard video, 30 FPS, color depth: 24-bit

computer number three:
some generic 32MB video card, 120 FPS, color depth: 32-bit

The application uses just two polys. I am using blending, but they aren't too close to the screen; I actually translate -13 units before I draw them. Here is the code I am using (it is in Visual Basic .NET using CsGL):

' drawing loop
While Me.dodraw = True
    OpenGLControl1.glDraw()   ' render one frame
    frate += 1                ' count frames for the FPS timer
    OpenGLControl1.roty += 0.5
    info1.Text = "FPS: " & fps & " X: " & OpenGLControl1.rotx & " Y: " & OpenGLControl1.roty
    Me.Refresh()
    Application.DoEvents()    ' keep the UI responsive
End While

' draw the scene
GL.glClear(Convert.ToUInt32(GLFlags.GL_COLOR_BUFFER_BIT Or GLFlags.GL_DEPTH_BUFFER_BIT))
GL.glLoadIdentity()
' do drawing surface rotations
GL.glTranslatef(0.0F, 0.0F, -13.0F)
GL.glRotatef(rotx, 1, 0, 0)
GL.glRotatef(roty, 0, 1, 0)
' draw some shapes...
GL.glEnable(Convert.ToUInt32(GLFlags.GL_BLEND))
GL.glDisable(Convert.ToUInt32(GLFlags.GL_DEPTH_TEST))
' first quad: multiplicative blend (destination * source)
GL.glBlendFunc(Convert.ToUInt32(GLFlags.GL_DST_COLOR), Convert.ToUInt32(GLFlags.GL_ZERO))
GL.glBindTexture(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D), texture(2))
GL.glBegin(Convert.ToUInt32(GLFlags.GL_POLYGON))
GL.glTexCoord2f(0, 0)
GL.glVertex3f(-1, -1, -1)
GL.glTexCoord2f(0, 1)
GL.glVertex3f(1, -1, -1)
GL.glTexCoord2f(1, 1)
GL.glVertex3f(1, 1, -1)
GL.glTexCoord2f(1, 0)
GL.glVertex3f(-1, 1, -1)
GL.glEnd()
' second quad: additive blend (source + destination)
GL.glBlendFunc(Convert.ToUInt32(GLFlags.GL_ONE), Convert.ToUInt32(GLFlags.GL_ONE))
GL.glBindTexture(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D), texture(1))
GL.glBegin(Convert.ToUInt32(GLFlags.GL_POLYGON))
GL.glTexCoord2f(0, 0)
GL.glVertex3f(-1, -1, -1)
GL.glTexCoord2f(0, 1)
GL.glVertex3f(1, -1, -1)
GL.glTexCoord2f(1, 1)
GL.glVertex3f(1, 1, -1)
GL.glTexCoord2f(1, 0)
GL.glVertex3f(-1, 1, -1)
GL.glEnd()
This is usually caused by asking for a pixel format that is not hardware accelerated, in which case your rendering is done in software.

Post your pixel format creation code.
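In the meantime, here is a rough sketch (Win32/C; CheckAcceleration is just an illustrative name) of how to test whether the format you actually got is accelerated - the generic flags come from wingdi.h:

#include <windows.h>
#include <stdio.h>

void CheckAcceleration(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format = GetPixelFormat(hDC);   /* format currently bound to the DC */
    DescribePixelFormat(hDC, format, sizeof(pfd), &pfd);

    /* generic format without generic acceleration means Microsoft's
       software renderer is doing all the work */
    if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
        !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
        printf("Software rendering - no hardware acceleration!\n");
    else
        printf("Hardware accelerated pixel format.\n");
}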

Y.
Public Function glLoadBitmaps(ByVal num As Integer) As Boolean
    GL.glEnable(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D))
    GL.glGenTextures(num, texture)
    Dim loopvar As Integer
    For loopvar = 0 To num - 1
        Dim Image As Bitmap = New Bitmap(texar(loopvar))
        Image.RotateFlip(RotateFlipType.Rotate90FlipNone)

        Dim bitmapdata As System.Drawing.Imaging.BitmapData
        Dim rect As Rectangle = New Rectangle(0, 0, Image.Width, Image.Height)

        bitmapdata = Image.LockBits(rect, System.Drawing.Imaging.ImageLockMode.ReadOnly, Drawing.Imaging.PixelFormat.Format24bppRgb)

        GL.glBindTexture(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D), texture(loopvar))
        ' 32992 = &H80E0 = GL_BGR_EXT (GDI+ stores 24bpp pixels as BGR)
        GL.glTexImage2D(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D), 0, Convert.ToInt32(GLFlags.GL_RGB8), _
            Image.Width, Image.Height, 0, Convert.ToUInt32(32992), _
            Convert.ToUInt32(GLFlags.GL_UNSIGNED_BYTE), bitmapdata.Scan0)
        ' 9729 = GL_LINEAR
        GL.glTexParameteri(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D), Convert.ToUInt32(GLFlags.GL_TEXTURE_MIN_FILTER), Convert.ToUInt32(9729)) ' linear filtering
        GL.glTexParameteri(Convert.ToUInt32(GLFlags.GL_TEXTURE_2D), Convert.ToUInt32(GLFlags.GL_TEXTURE_MAG_FILTER), Convert.ToUInt32(9729)) ' linear filtering

        Image.UnlockBits(bitmapdata)
        Image.Dispose()
    Next
    Return True
End Function
I was speaking of the color buffer's pixel format :)
This one is mine ;)

static PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(PIXELFORMATDESCRIPTOR),   // size of this descriptor
    1,                               // version number
    PFD_DRAW_TO_WINDOW |             // format must support a window,
    PFD_SUPPORT_OPENGL |             // support OpenGL,
    PFD_DOUBLEBUFFER,                // and double buffering
    PFD_TYPE_RGBA,                   // RGBA pixel type
    bits,                            // color depth
    0, 0, 0, 0, 0, 0,                // color bits ignored
    0,                               // no alpha buffer
    0,                               // shift bit ignored
    0,                               // no accumulation buffer
    0, 0, 0, 0,                      // accumulation bits ignored
    16,                              // 16-bit z-buffer (depth buffer)
    0,                               // no stencil buffer
    0,                               // no auxiliary buffer
    PFD_MAIN_PLANE,                  // main drawing layer
    0,                               // reserved
    0, 0, 0                          // layer masks ignored
};

I believe it's identical to NeHe's tutorial 1 code ;)
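For completeness, this is roughly how that descriptor gets used (a sketch along the lines of NeHe's tutorial 1; CreateGLContext is just an illustrative name and error handling is trimmed):

#include <windows.h>

static HGLRC CreateGLContext(HDC hDC, PIXELFORMATDESCRIPTOR *pfd)
{
    int format = ChoosePixelFormat(hDC, pfd);   /* closest matching format */
    if (format == 0 || !SetPixelFormat(hDC, format, pfd))
        return NULL;                            /* no usable pixel format  */

    HGLRC hRC = wglCreateContext(hDC);          /* create the GL context   */
    if (hRC != NULL)
        wglMakeCurrent(hDC, hRC);               /* and make it current     */
    return hRC;
}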
(2B)||(!2B)
On which systems, with which video cards, do you get these "slowdowns"? Does it happen in both 16- and 32-bit color modes?

Use glGetString(GL_RENDERER) to make sure you don't get GDI Generic - this is Microsoft's software implementation of OpenGL.
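Something along these lines (plain C sketch; PrintRenderer is just an illustrative name, and it needs an active GL context):

#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

void PrintRenderer(void)
{
    const char *renderer = (const char *)glGetString(GL_RENDERER);

    printf("Renderer: %s\n", renderer ? renderer : "(no context)");
    if (renderer && strcmp(renderer, "GDI Generic") == 0)
        printf("Software OpenGL - check your drivers and pixel format!\n");
}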

Y.

