Re: [Algorithms] Gamma correction an HDR images?
From: Jonathan B. <jo...@nu...> - 2005-06-07 19:08:31
>Sure, except for 1.0 you'd need more than 8 bits per component to get
>reasonable image quality at the expected dynamic range of the monitor.
>Gamma is there to fix this problem. An intuitive way to think of it is
>like perceptual compression/encoding. The reason monitors are not at
>gamma 1.0 is because the human visual system isn't either. It's really
>hard to tell the difference between incrementally brighter pixels at the
>bright end of the spectrum, but pretty easy at the dark end, so gamma
>2.2 lets you use more bits at the dark end.

Which is also a good point. What I was saying, that monitors would ideally output at gamma 1.0 if they actually had the power to produce enough dynamic range physically, sort of comes from a worldview of non-scarcity where it doesn't matter how many bits per pixel you are storing. That doesn't seem to be true yet, or for a while, since we keep raising resolutions and don't even have enough video RAM for full HDR rendering on one of the next-generation consoles without doing icky tiling, ahem ahem.
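The "more bits at the dark end" point is easy to check numerically. Here's a small sketch (my own illustration, not from the thread) counting how many of the 256 codes in an 8-bit channel land in the darkest 10% of linear intensity, under a plain linear encoding versus a gamma-2.2 encoding:

```python
# Sketch: compare code allocation at the dark end for linear vs gamma-2.2
# 8-bit encodings. Threshold and gamma value are illustrative assumptions.

GAMMA = 2.2
THRESHOLD = 0.1  # "dark end" = darkest 10% of linear light

# Linear encoding: code i directly stores linear intensity i/255.
linear_codes = sum(1 for i in range(256) if i / 255 <= THRESHOLD)

# Gamma encoding: code i stores (linear intensity)^(1/GAMMA),
# so decoding code i back to linear light gives (i/255)^GAMMA.
gamma_codes = sum(1 for i in range(256) if (i / 255) ** GAMMA <= THRESHOLD)

print(linear_codes, gamma_codes)
```

The gamma encoding devotes roughly three and a half times as many codes to the darkest tenth of the intensity range, which is exactly the perceptual-compression effect described above.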