8 hours ago, turanszkij said:
For example, how does it know which gamma space was the image authored in
If you're using this format, you're declaring that it was authored with the sRGB curve, which actually isn't a gamma curve at all.
Gamma 2.2 is approximately equal to sRGB, but they're actually quite different down close to black (sRGB is superior). sRGB is the standard for PC and web graphics/displays, so if you're trying to standardise all your artists' displays to a common standard, sRGB is the obvious choice.
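To make the near-black difference concrete, here's a minimal sketch (my own illustration, using the standard sRGB decode formula) comparing the sRGB transfer function, which has a linear segment below its knee, against a plain 2.2 power curve:

```c
#include <math.h>
#include <stdio.h>

/* sRGB decode: linear segment near black, power curve above the knee. */
static double srgb_to_linear(double c)
{
    return (c <= 0.04045) ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
}

/* Plain gamma 2.2 decode for comparison. */
static double gamma22_to_linear(double c)
{
    return pow(c, 2.2);
}

int main(void)
{
    /* Near black the two curves diverge noticeably; in the mid-tones
       and highlights they track each other much more closely. */
    double samples[] = { 0.01, 0.02, 0.05, 0.5 };
    for (int i = 0; i < 4; ++i)
        printf("encoded %.2f -> sRGB %.6f, gamma 2.2 %.6f\n",
               samples[i], srgb_to_linear(samples[i]),
               gamma22_to_linear(samples[i]));
    return 0;
}
```

At an encoded value of 0.02 these decode to roughly 0.0015 vs 0.0002, nearly an order of magnitude apart, while at 0.5 they land around 0.214 vs 0.218.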
I would probably only recommend storing colours in a different space if you know that you're not going to use the full black-to-white range and have source data with more than 8-bit precision. For example, Crytek have a workflow that stores the per-channel minimum and range in a constant, and they do a MAD after fetching to retrieve the original data. I can't remember, but they might've used a gamma curve too.
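As a rough sketch of that kind of range-compression workflow (the names and layout here are my own guess at the idea, not Crytek's actual code), the decode after the texture fetch is just one multiply-add per channel:

```c
/* Illustrative decode of a range-compressed texel: the texture stores each
   channel normalised to [0,1], and the per-channel minimum and range are
   passed in a shader constant. One MAD per channel recovers the original. */
typedef struct { float r, g, b; } float3;

static float3 decode_range_compressed(float3 texel, float3 chan_min, float3 chan_range)
{
    float3 result;
    result.r = texel.r * chan_range.r + chan_min.r;
    result.g = texel.g * chan_range.g + chan_min.g;
    result.b = texel.b * chan_range.b + chan_min.b;
    return result;
}
```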
On a modern PC you probably won't notice. The per-pixel pow is probably under a dozen MADs, times the resolution, times overdraw. Maybe 20 or 30 million FLOPs, which a modern GPU can eat up.
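Back-of-envelope, with purely assumed numbers on my part (roughly 1080p, a dozen ops per pow, minimal overdraw), the estimate works out like this:

```c
#include <stdio.h>

int main(void)
{
    /* Rough cost estimate for a per-pixel pow: assumed figures, not measured. */
    const double pixels      = 1920.0 * 1080.0; /* ~2.07 million at 1080p  */
    const double ops_per_pow = 12.0;            /* "under a dozen MADs"    */
    const double overdraw    = 1.0;             /* assume minimal overdraw */

    printf("~%.0f million FLOPs per frame\n",
           pixels * ops_per_pow * overdraw / 1e6);
    return 0;
}
```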
On the PS3's terrible GPU we had this on our "must do before we can ship" list, and really did notice the change. On some older games we actually used Gamma 2.0 instead of Gamma 2.2 because the math is way cheaper! ![:D :D](https://uploads.gamedev.net/emoticons/medium.biggrin.webp)
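The reason Gamma 2.0 is so much cheaper is that decoding collapses to a single multiply and encoding to a square root, instead of a full transcendental pow; a minimal sketch of the two pairs:

```c
#include <math.h>

/* Gamma 2.2: needs pow() in both directions (typically exp/log under the hood). */
static float decode_gamma22(float c) { return powf(c, 2.2f); }
static float encode_gamma22(float c) { return powf(c, 1.0f / 2.2f); }

/* Gamma 2.0: decode is one multiply, encode is one sqrt, which is far cheaper. */
static float decode_gamma20(float c) { return c * c; }
static float encode_gamma20(float c) { return sqrtf(c); }
```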
However, the difference between current low end and current high end is still around 10x in FLOPs, so if you're optimizing for the low end you'll still be trying to claw back every clock cycle you can, and saving 20 million FLOPs would be very welcome ![;) ;)](https://uploads.gamedev.net/emoticons/medium.wink.webp)