
about 16-bit textures

Started by January 17, 2001 08:49 PM
0 comments, last by Nil_z 24 years ago
I want to use some 16-bit-per-pixel data to generate a texture under OpenGL, but what values should I use in glTexImage2D()? The TARGET is GL_TEXTURE_2D, LEVEL is set to 0 since I do not use mipmaps, I do not care about the pixel format in video memory so I set INTERNALFORMAT to GL_RGB to let OpenGL select a proper internal format, WIDTH and HEIGHT are powers of 2, BORDER is 0, and I give DATA a pointer to my data. What should I do with FORMAT and TYPE? I set FORMAT to GL_RGB and TYPE to GL_UNSIGNED_SHORT or GL_UNSIGNED_BYTE, and both cause a crash. I think that is because OpenGL assumes I am using a separate short or byte for each RGB component, so the pointer runs out of range. To test that, I set FORMAT to GL_RED so OpenGL thinks there is only one component, and TYPE to GL_UNSIGNED_SHORT; that does not crash, but the texture is all red. The question is: I need OpenGL to read all three RGB components from one unsigned short, so what should I do?
return 0 or return 1, that is a question :)
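
For reference, the crash described above follows from how OpenGL sizes the source buffer: with FORMAT = GL_RGB and TYPE = GL_UNSIGNED_SHORT it reads three 16-bit values (6 bytes) per texel, while a packed 16-bit image supplies only 2 bytes per texel, so glTexImage2D reads three times past the end of the buffer. A minimal sketch of that size mismatch (the 256x256 dimensions are placeholders, not from the original post):

/* Sketch: bytes glTexImage2D will read per texel for each combination. */
#include <stdio.h>

int main(void)
{
    const int w = 256, h = 256;

    /* Packed 16-bit source data: one unsigned short per pixel. */
    const long have = (long)w * h * 2;           /* 131072 bytes */

    /* FORMAT = GL_RGB, TYPE = GL_UNSIGNED_SHORT: 3 shorts per texel. */
    const long rgb_ushort = (long)w * h * 3 * 2; /* 393216 bytes, past the end */

    /* FORMAT = GL_RED, TYPE = GL_UNSIGNED_SHORT: 1 short per texel,
       which matches the buffer size but only yields the red channel. */
    const long red_ushort = (long)w * h * 1 * 2; /* 131072 bytes */

    printf("buffer holds %ld, GL_RGB/USHORT reads %ld, GL_RED/USHORT reads %ld\n",
           have, rgb_ushort, red_ushort);
    return 0;
}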
Maybe you want to have a look at packed pixels; an example is on my page.
Also, choosing an internal format of GL_RGB5_A1 or GL_R5_G6_B5 will result in your texture's texels only taking up 16 bits on the card.

http://members.xoom.com/myBollux
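
Here is a minimal sketch of the packed-pixels approach the reply points to, assuming the source data is 5-6-5 packed with one unsigned short per pixel and an OpenGL 1.2 driver, where GL_UNSIGNED_SHORT_5_6_5 is defined (older Windows headers may need glext.h). The texture dimensions are placeholders, and GL_RGB5 is used as the standard sized internal-format token for a 16-bit RGB texture:

/* Sketch: uploading packed 16-bit 5-6-5 data, one GLushort per pixel.
   Assumes OpenGL 1.2+ so that GL_UNSIGNED_SHORT_5_6_5 is available. */
#include <GL/gl.h>

void upload_565(const unsigned short *pixels, int width, int height)
{
    /* TYPE = GL_UNSIGNED_SHORT_5_6_5 tells OpenGL that all three RGB
       components are packed into each 16-bit word, so it reads exactly
       2 bytes per texel instead of 3 shorts per texel. */
    /* INTERNALFORMAT = GL_RGB5 requests a 16-bit texture in video memory,
       in the spirit of the suggestion above. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, width, height, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);
}

If the data instead carries a 1-bit alpha, the matching combination is FORMAT = GL_RGBA, TYPE = GL_UNSIGNED_SHORT_5_5_5_1, and INTERNALFORMAT = GL_RGB5_A1.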

This topic is closed to new replies.
