
A strange error when running my program using OpenGL v1.1

Started by August 03, 2004 08:44 AM
2 comments, last by Steve-B 20 years, 4 months ago
Hi peeps. I have written a picture viewer program that (funnily enough) loads pictures into a window. I added OpenGL error-checking code to the main RenderScene() function to catch any errors from glDrawPixels() (this is the only OpenGL function I really use in this program). The program runs fine on my machine, but if I run it on a machine with Microsoft's generic OpenGL 1.1 implementation I get some sort of OpenGL "enum" error. Has anyone encountered this before? I think the program works fine if I remove the error checking. Maybe. Any ideas?? Steve B
First make sure it's glDrawPixels that's actually generating the error. If it is, check the documentation to see what can cause that error and make sure you aren't doing any of those things.
Perhaps you've specified an invalid datatype or an invalid pixel format?
Quote: Original post by AndyL
Perhaps you've specified an invalid datatype or an invalid pixel format?


Hi Andy

Hmmm. I'm really not sure. glDrawPixels is the only OpenGL function I use in my main drawing function. Here is what it looks like:

glDrawPixels(width, height, GL_RGB, GL_UNSIGNED_BYTE, deepILBM);

width = unsigned int
height = unsigned int
deepILBM = unsigned char *

I think this is the error that was generated:

GL_INVALID_ENUM is generated if format or type is not one of the accepted values.
GL_INVALID_ENUM is generated if type is GL_BITMAP and format is not either GL_COLOR_INDEX or GL_STENCIL_INDEX.

I can't really see what the problem is, though. I'm also fairly sure it works fine when the error checking is removed (I haven't tested that for a while, though). Maybe I should use a C++ cast to force width and height to GLsizei? I'm not really sure.

Steve-B

This topic is closed to new replies.
