
"Deep color"

Started by Prune
6 comments, last by Prune 16 years, 4 months ago
HDMI 1.3 supports "deep color", basically hardware HDR (more than 8 bits per color channel per pixel). So my question is: do any current or soon-to-be-released graphics cards support HDMI or DVI output at more than 8 bits per color channel?
"But who prays for Satan? Who, in eighteen centuries, has had the common humanity to pray for the one sinner that needed it most?" --Mark Twain

~~~~~~~~~~~~~~~Looking for a high-performance, easy to use, and lightweight math library? http://www.cmldev.net/ (note: I'm not associated with that project; just a user)
Surely someone has a comment...
From here.

"The GeForce 8 series supports 10-bit per channel display output, up from 8-bit on previous NVIDIA cards."
Which raises the question: how do I use it? I will shortly have access to such displays, and I want to convert my OpenGL HDR stuff to take advantage of this.
Quote: Original post by Prune
Which raises the question: how do I use it? I will shortly have access to such displays, and I want to convert my OpenGL HDR stuff to take advantage of this.


Haven't tried it myself, but I'm guessing that using A2R10G10B10 format framebuffers would do the trick?
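
In OpenGL terms I suppose that would be a 10/10/10/2 render target, something like this (untested sketch using GL_EXT_framebuffer_object; the sizes are just placeholders):

int width = 1280, height = 720;  // whatever your render size is
GLuint tex, fbo;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
// 10 bits per color channel, 2 bits of alpha
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
             GL_RGBA, GL_UNSIGNED_INT_10_10_10_2, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);
// then verify glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) == GL_FRAMEBUFFER_COMPLETE_EXT

Of course that only gets you 10 bits offscreen -- whether the window's framebuffer (and the DVI/HDMI link) actually carries 10 bits to the display is a separate question.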

I tried:

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 10);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 10);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 10);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 2);

Then SDL_SetVideoMode(SW, SH, 32, SDL_OPENGL | SDL_FULLSCREEN) runs without error, but querying those attributes back with SDL_GL_GetAttribute gives me 8 for each of them...

Now, would the card reject 10-bit output if it thinks the display doesn't support it? I'm using a DVI-to-HDMI connector into an HDTV, from an 8800 Ultra card. The article mentioned above only says that 10-bit output works over DVI, so I don't know whether HDMI is treated differently, or whether the video signal format within HDMI differs from DVI...

I guess I don't know how to tell whether I'm running into a hardware issue or a programming issue here. I don't even know if all 8800 Ultras are supposed to support 10-bit output, or whether it depends on the manufacturer of the card (mine is an EVGA).
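
One thing I can think of trying is to enumerate the pixel formats the driver exposes and see whether any of them even claim 10 bits per channel. Rough, untested sketch (the HDC would come from the window SDL creates; a negative result might not be conclusive either, since extended formats may only show up through wglChoosePixelFormatARB):

#include <windows.h>
#include <stdio.h>

void ListDeepColorFormats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    // DescribePixelFormat returns the number of available pixel formats
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    for (int i = 1; i <= count; ++i)
    {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if ((pfd.dwFlags & PFD_SUPPORT_OPENGL) && pfd.cRedBits >= 10)
            printf("format %d: R%d G%d B%d A%d\n", i,
                   pfd.cRedBits, pfd.cGreenBits, pfd.cBlueBits, pfd.cAlphaBits);
    }
}
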
Hey Prune,

Did you manage to get anywhere with this?

I tried it using DirectX, and although I can create an A2R10G10B10 backbuffer (under DirectX 9), I could find no way of creating a 10-bit final display buffer, so it seems like it's not possible (at least with my 8800 GTX).
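
Roughly the kind of check I mean (simplified, untested sketch, not my exact code): it asks whether A2R10G10B10 is accepted as the actual display/adapter format, not just as a backbuffer or render-target format.

#include <d3d9.h>

bool SupportsTenBitDisplay(IDirect3D9* d3d)
{
    // Are there any fullscreen display modes with a 10-bit format at all?
    UINT modeCount = d3d->GetAdapterModeCount(D3DADAPTER_DEFAULT, D3DFMT_A2R10G10B10);

    // Is a 10-bit display format plus 10-bit backbuffer legal on the HAL device (fullscreen)?
    HRESULT hr = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      D3DFMT_A2R10G10B10,  // adapter/display format
                                      D3DFMT_A2R10G10B10,  // backbuffer format
                                      FALSE);              // windowed = FALSE, i.e. fullscreen
    return modeCount > 0 && SUCCEEDED(hr);
}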

I also notice that NVIDIA refers to it as '10-bit display processing', which suggests it may just be internal precision rather than output capability.

Anyone else know any more about this?
No :(
