
OpenGL + bit depths

Started by tschirke, March 18, 2000 03:14 PM
2 comments, last by tschirke 24 years, 11 months ago
Hi all! During the development of my GL-based engine I've run into a serious problem. I want to change the display mode dynamically: when the user wants to run my stuff in 32-bit mode, I just call my setmode routine, reload the textures, and everything should be fine. But it is not working.

Basic state: I have 640x480x16bpp and it works well. When I change the resolution but leave it at 16bpp everything is okay, but when I try to set a 32bpp mode there is no display. I've debugged the code: the pixel format is set (colorbits = 32 ... ), wglCreateContext is returning a value, so I have a rendering context... but I can see nothing (the engine is running well). Could someone help me? Here is the code I use:

void setmode(int32 width, int32 height, int32 bpp)
{
    // if the thread has a current rendering context ...
    hRC = wglGetCurrentContext();
    hDC = wglGetCurrentDC();
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(hRC);
    hRC = 0;

    SetWindowPos(hwndApp,       // handle to window
                 HWND_TOP,      // insert after
                 0,             // horizontal position
                 0,             // vertical position
                 width,         // width
                 height,        // height
                 SWP_NOZORDER); // no z change

    ChangeResolution(width, height, bpp);
    SetupPixelFormat(hDC, bpp);

    // do OpenGL!
    hRC = wglCreateContext(hDC);
    wglMakeCurrent(hDC, hRC);
    initGL(); // shading, projection etc...
}

Csaba Berényi - Programmer - CLEVERS TEAM

Edited by - tschirke on 3/18/00 3:35:30 PM
Does it work if you start out in 32bpp mode? I know some cards have a problem rendering to a different bit depth from the desktop.

If nobody else gives you a better answer, my suggestion is to destroy your window and recreate it. Bit depth changes work in my apps, and that's the only real difference I see between our code.

Edited by - Ampere on 3/19/00 2:42:20 AM
SetPixelFormat can only be called on a window once. (Microsoft's fault -- yet another WinOpenGL/wgl limitation that will likely never be resolved, for political reasons.)

Any subsequent call will, at best, have no effect, and at worst may cause all manner of problems.

Unfortunately, destroying the window altogether and recreating it is the best way to deal with this situation.
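
A rough sketch of what that looks like, reusing the helper names from your post (ChangeResolution, SetupPixelFormat, initGL); CreateGLWindow here is a hypothetical helper that would wrap your RegisterClass/CreateWindow code. Not drop-in code, just the shape of it:

void setmode(int32 width, int32 height, int32 bpp)
{
    // tear down the old context, DC and window completely
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(hRC);
    hRC = 0;
    ReleaseDC(hwndApp, hDC);
    DestroyWindow(hwndApp);

    ChangeResolution(width, height, bpp);

    // a brand-new window gets a brand-new shot at SetPixelFormat
    hwndApp = CreateGLWindow(width, height); // hypothetical: RegisterClass/CreateWindow wrapper
    hDC = GetDC(hwndApp);
    SetupPixelFormat(hDC, bpp);

    hRC = wglCreateContext(hDC);
    wglMakeCurrent(hDC, hRC);
    initGL(); // reload textures, shading, projection etc.
}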

Thank you for the answers.

I have a new question; maybe you can answer this one too:
I've created a little program to display a textured triangle.

I tried different ways to render it:
The glVertex3f (immediate mode) method works well everywhere, but
when I use vertex arrays on an S3 Savage, no matter what I do, the program hangs in glDrawArrays. My own card is a TNT2, and I also tried a Voodoo2; on both of them my code works fine.
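
For clarity, here is roughly the vertex array path I mean (a minimal sketch for one textured triangle; texture binding is assumed to have happened already, and the real data in my engine is of course bigger):

static const GLfloat verts[] = {
    -1.0f, -1.0f, 0.0f,  // bottom left
     1.0f, -1.0f, 0.0f,  // bottom right
     0.0f,  1.0f, 0.0f   // top
};
static const GLfloat texcoords[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    0.5f, 1.0f
};

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, verts);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
glDrawArrays(GL_TRIANGLES, 0, 3); // <-- this is where it hangs on the Savage
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);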

By the way, Quake runs well on that S3 card.

Do you have any ideas why that is?

Csaba Berényi - Programmer - CLEVERS TEAM

