About 16-bit textures
I want to use some 16-bit-per-pixel data to generate a texture under OpenGL, but what values should I use in glTexImage2D()?
Well, the TARGET is GL_TEXTURE_2D. LEVEL is set to 0 since I do not use mipmaps. I do not care about the pixel format in video memory, so I set INTERNALFORMAT to GL_RGB and let OpenGL select a proper internal format. The width and height are powers of 2, BORDER is 0, and I give DATA a pointer to my data. What should I do with FORMAT and TYPE?
I set FORMAT to GL_RGB and TYPE to GL_UNSIGNED_SHORT or GL_UNSIGNED_BYTE; both cause a crash. I think that is because OpenGL assumes a separate short or byte for each RGB component, so it reads past the end of my buffer. To test that, I set FORMAT to GL_RED so OpenGL thinks there is only one component, and TYPE to GL_UNSIGNED_SHORT. That does not crash, but the texture is all red.
The question is: I need OpenGL to read all three RGB components out of a single unsigned short. What should I do?
return 0 or return 1, that is a question :)