29 minutes ago, Hodgman said:
Ugh, a headache. Still 0-1, but either 8, 10 or 12 bits per channel (and using different RGB wavelengths than normal displays... Requiring a colour rotation if you've generated normal content... And there's also a curve that needs to be applied, similar to sRGB but different).
However, each individual HDR display will map 1.0 to somewhere from 1k nits to 10k nits, which is painfully bright.
You want "traditional white" (aka "paper white") -- e.g. the intensity of a white unlit UI element/background -- to be displayed at about 300 nits, which might be ~0.3 on one HDR display, or 0.03 on another HDR display...
So, you need to query the TV to ask what its max nits level is, and then configure your tonemapper accordingly, so that it will result in "white" coming out at 300 nits and everything "brighter than white" going into the (300, 1000] to (300, 10000] range...
But... HDR displays are allowed to respond to the max-nits query with a value of 0, meaning "don't know", in which case you need to display a calibration UI to the user to empirically discover a suitable gamma adjustment (usually around 1 to 1.1), a suitable value for "paper white", and also what the saturation point / max nits is, so you can put an appropriate soft shoulder in your tonemapper...
I honestly expect a lot of HDR-TV compatible games that ship this year to do a pretty bad job of supporting all this.
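(For reference, on PC the max-nits query mentioned above is exposed through DXGI's output description. A minimal sketch, assuming Windows 10 with the DXGI 1.6 headers, taking the first adapter/output rather than the one the swap chain actually lives on:)

```cpp
#include <dxgi1_6.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory6* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory6),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    IDXGIOutput*   output  = nullptr;
    IDXGIOutput6*  output6 = nullptr;
    if (SUCCEEDED(factory->EnumAdapters1(0, &adapter)) &&
        SUCCEEDED(adapter->EnumOutputs(0, &output)) &&
        SUCCEEDED(output->QueryInterface(__uuidof(IDXGIOutput6),
                                         reinterpret_cast<void**>(&output6))))
    {
        DXGI_OUTPUT_DESC1 desc = {};
        if (SUCCEEDED(output6->GetDesc1(&desc)))
        {
            // These are what the display *claims*; MaxLuminance can be 0
            // ("don't know"), which is when the calibration UI comes in.
            std::printf("min %.2f nits, max %.2f nits, max full-frame %.2f nits\n",
                        desc.MinLuminance, desc.MaxLuminance,
                        desc.MaxFullFrameLuminance);
        }
    }
    if (output6) output6->Release();
    if (output)  output->Release();
    if (adapter) adapter->Release();
    factory->Release();
    return 0;
}
```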
This is not exactly true, and the 0.3 vs 0.03 part is exactly wrong: there is some guarantee here!
HDR-capable monitors and TVs do so according to the HDR10 standard. Under HDR10, the display buffer is at least 10 bits, encoded with PQ ST 2084. That encoding maps an absolute range of 0-10,000 nits. If you want your paper white, just write the equivalent of 100 nits and all TVs should come out pretty close to it. The same image kept in the 0-300 nit range would look pretty close on all hardware; that is the whole point of HDR10. Then, as you reach more saturated colours and higher brightness, you enter the black-box enigma of the tone-mapping each vendor implements.
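To make the guarantee concrete: the PQ (SMPTE ST 2084) inverse EOTF is fixed by the standard, constants and all, so a given luminance always lands on the same code value. A minimal, display-independent sketch:

```cpp
#include <cmath>
#include <cstdio>

// SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0,1] signal.
// All constants are fixed by the standard, so a given nit level encodes to
// the same code value on every HDR10 display.
double PqEncode(double nits)
{
    const double m1 = 2610.0 / 16384.0;          // 0.1593017578125
    const double m2 = 2523.0 / 4096.0 * 128.0;   // 78.84375
    const double c1 = 3424.0 / 4096.0;           // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;    // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;    // 18.6875

    double y  = nits / 10000.0;                  // normalise to the 10,000-nit PQ range
    double yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main()
{
    // 100-nit "paper white" lands at ~0.508, i.e. around code 520 of 1023
    // in a 10-bit buffer, regardless of which display it is sent to.
    std::printf("100 nits -> PQ %.3f\n", PqEncode(100.0));
    return 0;
}
```

That ~0.508 is the whole contract: the display promises to render that code value at (about) 100 nits whether its peak is 1,000 or 10,000 nits.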
While it is true that the max brightness is unknown (at least on consoles we are denied the value; DXGI can report it on PC, but that is to be taken with a pinch of salt), the low-brightness range should be close to what you ask for, unless a large bright area pulls the rest of the panel down (so it doesn't blow a fuse). What we do in our game is stay linear up to 800-900 nits and add a soft shoulder, to retain colours/intent ahead of the TV's own tonemap/clamp in the range it does not support well; see the sketch below.
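The post doesn't give the exact curve, so here is a minimal sketch of that shape; the exponential shoulder and the 800/1000-nit numbers are illustrative assumptions, not the game's actual values:

```cpp
#include <cmath>

// Identity up to `knee`, then an exponential shoulder that approaches (but
// never exceeds) `maxNits`. The slope at the knee is exactly 1, so the join
// is C1-continuous; above it, values compress progressively harder so the
// TV's own clamp never sees anything past its saturation point.
float RolloffNits(float nits, float knee = 800.0f, float maxNits = 1000.0f)
{
    if (nits <= knee)
        return nits;                         // trust the display in this range
    const float range = maxNits - knee;      // headroom reserved for the shoulder
    return knee + range * (1.0f - std::exp(-(nits - knee) / range));
}
```

Doing this rolloff yourself pins the highlights below the display's saturation point before the vendor's black-box tonemapper gets a chance to clip or distort them.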
The problem with pre-HDR-era monitors and TVs is that they already push well past paper white, around 300-400 nits, and people are used to it, so Windows in HDR with its 100-nit white feels very dim (it is a stupid mistake by Microsoft not to have a brightness slider for people working in bright environments). But in a game you do not want paper white at 300-400 nits: you would lose about two stops of dynamic range just to start with (log2(400/100) = 2), which would be quite stupid, and your picture would no longer match what art designed.
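The two-stop figure is just the luminance ratio: headroom above paper white is log2(maxNits / paperWhite). A quick check against a hypothetical 1000-nit display:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double maxNits  = 1000.0;                   // hypothetical display peak
    const double whites[] = { 100.0, 300.0, 400.0 };  // candidate paper-white levels
    for (double paperWhite : whites)
        std::printf("paper white %4.0f nits -> %.2f stops of highlight headroom\n",
                    paperWhite, std::log2(maxNits / paperWhite));
    // 100 -> 3.32 stops, 300 -> 1.74, 400 -> 1.32: moving paper white from
    // 100 to 400 nits costs exactly log2(400/100) = 2 stops.
    return 0;
}
```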