
32-bit pixel format

Started by February 25, 2000 11:51 AM
7 comments, last by Sphet 25 years ago
What is the pixel format for 32-bit screens?
My guess is that it is ARGB8888; it's just that the alpha component isn't used.
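For illustration, here is a minimal sketch in C of packing and unpacking a 32-bit ARGB8888 pixel, assuming the common layout with alpha in the top byte (the actual mask layout can vary by card, so the surface's pixel format should be checked at runtime):

#include <stdint.h>

/* Pack four 8-bit components into one 32-bit ARGB8888 pixel. */
uint32_t pack_argb8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16)
         | ((uint32_t)g << 8)  |  (uint32_t)b;
}

/* Pull the components back out by shifting and masking. */
void unpack_argb8888(uint32_t p,
                     uint8_t *a, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *a = (uint8_t)(p >> 24);
    *r = (uint8_t)(p >> 16);
    *g = (uint8_t)(p >> 8);
    *b = (uint8_t)(p);
}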
Just like 16-bit, there are several 32-bit formats. I think the most popular ones are RGB 10-12-10 (no alpha) and ARGB 1-10-11-10. The green element typically gets the extra bits because the human eye is most sensitive to that color.
But Windows uses the standard RGBA8888 format, doesn't it? I always thought 32-bit would be like 24-bit, only with an extra 8-bit component.
To be honest, I never even figured out the 24-bit format. I never found example code or an explanation of the format or how to use it.

"Remember, I'm the monkey, and you're the cheese grater. So no fooling around."
-Grand Theft Auto, London
D:
Although I have never bothered with writing algorithms for 24-bit, from what I have heard it is just an RGB888 format. Fairly easy to use, then, compared to the RGB565 format.
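For comparison, a quick sketch (plain C, assuming the usual 5-6-5 bit layout) of packing both formats; note how 565 has to cut the components down to 5 and 6 bits first:

#include <stdint.h>

/* RGB888: each component keeps its full 8 bits. */
uint32_t pack_rgb888(uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
}

/* RGB565: red and blue drop 3 bits each, green drops 2
   (green keeps the extra bit since the eye is most sensitive to it). */
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}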
According to Tricks of the Windows Game Programming Gurus, the two 32-bit modes are indeed 8.8.8.8 (in one the extra byte is alpha, in the other it is X, but both use the same bit pattern). No more available colors than in 24-bit mode. The reason given for why they're used: many video cards can't address on three-byte boundaries, so the fourth byte is just for alignment. There, I just corrected myself.
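That alignment point is easy to see in code. A sketch in C of plotting a pixel in each mode, assuming a linear framebuffer (the surface pointer, pitch, and function names here are hypothetical): in 24-bit you make three separate byte writes at an unaligned offset, while in 32-bit one aligned 32-bit write does it.

#include <stdint.h>

/* 24-bit: three byte writes at offset x * 3 -- not DWORD-aligned.
   color is assumed to be 0x00RRGGBB. */
void plot24(uint8_t *surface, int pitch, int x, int y, uint32_t color)
{
    uint8_t *p = surface + y * pitch + x * 3;
    p[0] = (uint8_t)(color);        /* blue  */
    p[1] = (uint8_t)(color >> 8);   /* green */
    p[2] = (uint8_t)(color >> 16);  /* red   */
}

/* 32-bit: a single aligned write; the top (X or alpha) byte rides along. */
void plot32(uint8_t *surface, int pitch, int x, int y, uint32_t color)
{
    *(uint32_t *)(surface + y * pitch + x * 4) = color;
}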
Personally, I think the 16,777,216 colors of the 24-bit format are enough.

Anyway, that would mean that in 32-bit you have an extra byte per pixel that is unused... can one use this byte for user-defined data storage? I can't imagine what I could store in there, but it sounds interesting.
You can access the fourth byte through the alpha buffer if the card supports that buffer.
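In principle, if the surface is XRGB8888 and nothing else touches the top byte, you could stash a byte of per-pixel data there. A hypothetical sketch in C tagging each pixel with an ID (the function names are made up, and be warned that a blit or the driver may not preserve those bits):

#include <stdint.h>

/* Store a per-pixel tag in the otherwise-unused X byte of an XRGB8888 pixel. */
uint32_t set_tag(uint32_t pixel, uint8_t tag)
{
    return (pixel & 0x00FFFFFFu) | ((uint32_t)tag << 24);
}

uint8_t get_tag(uint32_t pixel)
{
    return (uint8_t)(pixel >> 24);
}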

