
32bit depth buffer ... how ?

Started by October 31, 2003 06:26 PM
2 comments, last by penetrator 21 years, 4 months ago
I see a lot of cards can run with a 32-bit depth buffer, but it seems I'm not able to set it up in OpenGL. I tried changing the value in the pixel format descriptor of the CreateGLWindow() function from 24 to 32, but it makes no difference: I have a small app that checks the depth buffer and it keeps reporting 24 bits. My video card is a GeForce FX 5800, so I should be able to get a 32-bit depth buffer, right?
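For reference, the descriptor I mean is the NeHe-style one. Roughly (a sketch using the standard Win32 PIXELFORMATDESCRIPTOR fields, not my exact code; assumes <windows.h> is included):

static PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(PIXELFORMATDESCRIPTOR),   // size of this descriptor
    1,                               // version number
    PFD_DRAW_TO_WINDOW |             // format must support a window
    PFD_SUPPORT_OPENGL |             // format must support OpenGL
    PFD_DOUBLEBUFFER,                // double buffering
    PFD_TYPE_RGBA,                   // RGBA pixel type
    32,                              // color depth (cColorBits)
    0, 0, 0, 0, 0, 0,                // color bits ignored
    0,                               // no alpha buffer
    0,                               // shift bit ignored
    0,                               // no accumulation buffer
    0, 0, 0, 0,                      // accumulation bits ignored
    32,                              // cDepthBits -- this is the value I changed from 24 to 32
    0,                               // cStencilBits -- no stencil buffer
    0,                               // no auxiliary buffer
    PFD_MAIN_PLANE,                  // main drawing layer
    0,                               // reserved
    0, 0, 0                          // layer masks ignored
};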
Are you sure you're changing the right parameter?
"Plant a tree. Remove a Bush" - A bumper sticker I saw.
I think 24-bit is the max, at least on NVIDIA cards; the remaining 8 bits are used for the stencil buffer...
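You can see the split if you query what the context actually gives you (a quick sketch; assumes a current GL context and <GL/gl.h>):

// Legacy queries, fine for a fixed-function-era context.
GLint depthBits = 0, stencilBits = 0;
glGetIntegerv(GL_DEPTH_BITS,   &depthBits);    // typically 24 on GeForce cards
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);  // the remaining 8 bits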


T2k
T2K is right

GTK Radiant had the same problem on NVIDIA cards: it requested 32 bits at first, then I told them to change it back to 24 and everything worked fine.

If you request a 32-bit depth buffer on an NVIDIA card, the driver falls back to 16 bits, which causes artifacts.
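Roughly, the fix looks like this (a sketch assuming the NeHe-style setup with hDC and pfd already declared; not the actual GTK Radiant code):

// Ask for 24 depth bits + 8 stencil bits instead of 32 depth bits.
pfd.cDepthBits   = 24;   // 24-bit depth buffer
pfd.cStencilBits = 8;    // 8-bit stencil shares the same 32-bit block

int format = ChoosePixelFormat(hDC, &pfd);   // driver picks the closest match
SetPixelFormat(hDC, format, &pfd);
// ...create the context with wglCreateContext()/wglMakeCurrent() as usual...
// Requesting 32 depth bits here is what makes the driver silently fall
// back to a 16-bit depth buffer and produce the artifacts.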
http://www.8ung.at/basiror/theironcross.html

