
Problems running in 16-bit mode when desktop is 24- or 32-bit

Started by January 13, 2005 11:02 AM
2 comments, last by Krohm 19 years, 10 months ago
On some display cards, I've got problems running the code provided with the NeHe OpenGL lessons. When I tell it to run in 16-bit mode and the desktop is running in 24- or 32-bit mode, it seems like the red channel of the picture is ignored, so white becomes bright blue. This can be seen in lesson 2, lesson 44, etc. It happens on two older computers, one with an ATI 3D Rage Pro and one with a Matrox card of some kind. It doesn't happen on a new computer with an Nvidia 6800. Is there any way to fix this? Regards, Alexander Toresson
Simple fix: don't.
Don't run it in 16-bit mode.
The problem probably lies in your pfd.
You tell the individual colors to be 8 bits each, but the total number of bits is 16, so you only get two of them.
Newer cards tend to be better than older ones; they also sometimes seem to ignore your request for 16 bit and use 24/32 either way.

So unless you plan on running it on a Voodoo2 card, I suggest you stick to 32 bit.
Are you running in windowed or fullscreen? If fullscreen, something funky is up.

If windowed mode, well... how can you have a window that uses 24/32 bits when the current desktop mode only allows 16? :)
I also had a similar problem years ago, and I realized the channel wasn't missing but swapped.
Are you sure this is not happening?
By the way, I also suggest going 32-bit; no one likes 16-bit anymore, and unless you have a 16-bit-only video accelerator it doesn't pay to fix it.

Previously "Krohm"

This topic is closed to new replies.
