
Help! Totwpg's loading bitmaps

Started by An Irritable Gent, August 26, 2000 08:15 PM
10 comments, last by An Irritable Gent 24 years, 4 months ago
I've found my problem.

Somebody lied somewhere. My video card is an S3 ViRGE DX PCI with 2MB and the latest 95/98 drivers installed. GetPixelFormat() tells me that the bit count (dwRGBBitCount) is 16. According to LaMothe's book (page 292), a result of 16 "must be 5.6.5 mode" and a result of 15 "must be 5.5.5 mode". Since the value was 16, I assumed I had to work with 565.

HOWEVER, after fighting this for two days, I finally checked out ddpixel.dwRGBAlphaBitMask, ddpixel.dwRBitMask, ddpixel.dwGBitMask and ddpixel.dwBBitMask to verify I wasn't going insane. The values were, respectively, 0, 0x7c00, 0x3e0 and 0x1f. This is 555!

Digging through the DirectX docs, I found there is NO SUCH THING as a dwRGBBitCount of 15, despite what LaMothe's book says. There are cards that run in 555 (as mine apparently does), but they still report a bit count of 16.

Bottom line: you have to check the bit masks, not dwRGBBitCount, to see if your 16-bit mode is 555 or 565.
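
Here's roughly the check I ended up with (a minimal sketch only; the function and surface pointer names are mine, not from the book or the SDK):

#include <windows.h>
#include <ddraw.h>

// Returns 565 or 555 for a 16-bit DirectDraw surface, 0 otherwise.
int Get16BitMode(LPDIRECTDRAWSURFACE7 lpdds)
{
    DDPIXELFORMAT ddpixel;
    ZeroMemory(&ddpixel, sizeof(ddpixel));
    ddpixel.dwSize = sizeof(ddpixel);

    if (FAILED(lpdds->GetPixelFormat(&ddpixel)))
        return 0;

    if (ddpixel.dwRGBBitCount != 16)
        return 0;                        // not a 16-bit surface at all

    if (ddpixel.dwGBitMask == 0x07E0)    // green has 6 bits -> 5.6.5
        return 565;

    if (ddpixel.dwGBitMask == 0x03E0)    // green has 5 bits -> 5.5.5, but dwRGBBitCount still says 16
        return 555;

    return 0;                            // some other exotic layout
}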

Arghh but whoohoo!

The Gent

Edited by - An Irritable Gent on August 27, 2000 9:03:22 PM
aig
Unfortunately 5:5:5 is a 16-bit mode, because there is an alpha channel (I think), and so you have been going around in circles. It was prolly his publisher trying to act smart by pretending to know that adding up 5+5+5 gives 15, not 16. Stupid publisher stuffed up sooo much more...

For future reference though: 5:5:5 is 16-bit colour
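
i.e. the three 5-bit channels only fill 15 bits, and the leftover top bit is just unused (or a 1-bit alpha flag), so the pixel still occupies a whole 16-bit WORD. Something like this (macro names made up by me, not from the book):

// 5.6.5: red in bits 11-15, green in bits 5-10, blue in bits 0-4
#define RGB16_565(r, g, b) ((((r) & 0x1F) << 11) | (((g) & 0x3F) << 5) | ((b) & 0x1F))

// 5.5.5: red in bits 10-14, green in bits 5-9, blue in bits 0-4; bit 15 unused/alpha
#define RGB16_555(r, g, b) ((((r) & 0x1F) << 10) | (((g) & 0x1F) << 5) | ((b) & 0x1F))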

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
-Chris Bennett of Dwarfsoft
"The Philosophers' Stone of Programming Alchemy"
IOL (The list formerly known as NPCAI) - A GDNet production
Our Doc - The future of RPGs
Thanks to all the goblins over in our little Game Design Corner niche

