[Algorithms] GPU HDR ToneMapping

 [Algorithms] GPU HDR ToneMapping From: JSeb - 2009-11-12 10:18:18
```
Hello,

I've been searching for quite a long time for a good HDR tone-mapping formula. For the time being, we're using the following:

a) A classic global operator defined by:

LinearLDR.rgb = pow(1 - exp(-UserScale * HDR.rgb * AutoScale), UserGamma)
SrgbLDR.rgb = LinearToSrgb(LinearLDR.rgb)   (display gamma conversion)

with AutoScale = clamp(MiddleGray / AverageLuminance, MinScale, MaxScale).

This formula works quite well with dynamic adaptation of AverageLuminance.

b) Yet we have some situations where the dynamic range of the HDR input is too high and the tone-mapped output burns out too much (for instance, when coming out of a dark tunnel, the outside locally burns). I strongly believe a local operator could behave better in case (b). I made an earlier attempt with a self-made multi-scale Gaussian blur of intensity (with downsamples/upsamples to reach large scales) driving another "AutoScale" factor, but the results were not very good.

c) I've recently implemented GPU histograms, first to visualize the HDR input; I then tried a naive histogram equalization in log space, whose result is very poor. I'm drawn to histogram mappings such as:

c.1) The "Black & White 2" mapping: minIntens (<1%) => 0, maxIntens (>99%) => 1, midIntens (50%) controlling a gamma. Yet mapping minIntens to black is quite odd.

c.2) A smarter histogram equalization (Ward's histogram adjustment).

Still, I have the feeling histograms won't really solve case (b). (pdf: 2007, "Efficient Histogram Generation Using Scattering on GPUs")

d) More recently I implemented an approximation of the Reinhard local operator (with box filters computed via summed-area tables). The results are beginning to be interesting, yet I have some artifacts, the biggest being banding & aliasing; I fear these are inherent to the operator. And even though the default values of the Alpha & Theta parameters (0.05 & 8) work, other values (like the 0.025 proposed in the paper) don't work so well. In addition, very-high-luminance colors don't go to white and instead produce "pure" saturated colors (though this defect may be tweakable). (pdf: 2008, "Real-Time Photographic Local Tone Reproduction Using Summed-Area Tables", Slomp/Oliveira, CGI 2008)

Do you have any advice, or do you know of local operators that really work on images with a very large HDR range?

Many thanks,
JSeb
```
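The global operator in (a) can be sketched on the CPU as follows. This is a minimal Python sketch, assuming illustrative default parameter values (not from the post) and implementing LinearToSrgb as the standard sRGB encoding:

```python
import math

def linear_to_srgb(c):
    # Standard sRGB encoding (IEC 61966-2-1).
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def tone_map_global(hdr_rgb, average_luminance,
                    user_scale=1.0, user_gamma=1.0,
                    middle_gray=0.18, min_scale=0.1, max_scale=10.0):
    """Global operator from (a): exponential exposure curve, then sRGB.

    Default values here are illustrative placeholders."""
    auto_scale = min(max(middle_gray / average_luminance, min_scale), max_scale)
    linear_ldr = [(1.0 - math.exp(-user_scale * c * auto_scale)) ** user_gamma
                  for c in hdr_rgb]
    return [linear_to_srgb(c) for c in linear_ldr]
```

For example, `tone_map_global([4.0, 2.0, 1.0], average_luminance=0.18)` maps brighter channels closer to 1.0 while never exceeding it; the `1 - exp(-x)` curve is what gives this operator its soft shoulder.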
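Ward's histogram adjustment (c.2) can be sketched naively as: equalize the log-luminance histogram, but clamp each bin to a "linear ceiling" so local contrast is never expanded beyond that of a linear mapping. The sketch below is a rough, simplified Python version, assuming a hypothetical `display_ratio` parameter for the display's dynamic range and a simple iterative trim (the real 1997 algorithm refines this):

```python
import math

def ward_histogram_adjust(lums, bins=100, display_ratio=100.0, tolerance=0.025):
    """Naive sketch of Ward-style histogram adjustment: equalization on
    log luminance, with bin counts clamped so contrast never exceeds
    that of a linear mapping to the display range."""
    logs = [math.log(max(l, 1e-6)) for l in lums]
    lo, hi = min(logs), max(logs)
    if hi - lo < 1e-9:
        return [0.5] * len(lums)
    dlog = (hi - lo) / bins
    hist = [0.0] * bins
    for v in logs:
        hist[min(int((v - lo) / dlog), bins - 1)] += 1
    for _ in range(50):  # iteratively trim bins above the linear ceiling
        total = sum(hist)
        ceiling = total * dlog / math.log(display_ratio)
        trimmed = 0.0
        for i in range(bins):
            if hist[i] > ceiling:
                trimmed += hist[i] - ceiling
                hist[i] = ceiling
        if trimmed <= tolerance * total:
            break
    # The cumulative distribution of the trimmed histogram is the
    # log-luminance -> display mapping.
    total = sum(hist)
    cum, acc = [], 0.0
    for c in hist:
        acc += c
        cum.append(acc / total)
    return [cum[min(int((v - lo) / dlog), bins - 1)] for v in logs]
```

The ceiling is what distinguishes this from the naive log-space equalization the post found poor: empty regions of the histogram can no longer stretch contrast arbitrarily.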
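The SAT-based approximation of the Reinhard local operator described in (d) can be sketched as below. This is a simplified CPU sketch, assuming the center/surround scale-selection test of Reinhard et al. 2002 with box filters standing in for Gaussians; `phi` and `eps` play the roles of the post's Theta and Alpha, with defaults mirroring the values the post says work (8 and 0.05):

```python
def summed_area_table(img):
    # sat[y+1][x+1] = sum of img[0..y][0..x]
    h, w = len(img), len(img[0])
    sat = [[0.0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            sat[y + 1][x + 1] = (img[y][x] + sat[y][x + 1]
                                 + sat[y + 1][x] - sat[y][x])
    return sat

def box_mean(sat, x, y, r, w, h):
    # Mean over a (2r+1)^2 box, clamped to the image borders.
    x0, y0 = max(x - r, 0), max(y - r, 0)
    x1, y1 = min(x + r + 1, w), min(y + r + 1, h)
    s = sat[y1][x1] - sat[y0][x1] - sat[y1][x0] + sat[y0][x0]
    return s / ((x1 - x0) * (y1 - y0))

def reinhard_local(lum, key=0.18, phi=8.0, eps=0.05, scales=(1, 2, 4, 8, 16)):
    """Reinhard-style local operator, box filters in place of Gaussians."""
    h, w = len(lum), len(lum[0])
    avg = sum(map(sum, lum)) / (w * h)
    scale = key / max(avg, 1e-6)
    scaled = [[scale * v for v in row] for row in lum]
    sat = summed_area_table(scaled)
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            v1 = box_mean(sat, x, y, scales[0], w, h)
            v_adapt = v1
            for i, s in enumerate(scales[:-1]):
                v2 = box_mean(sat, x, y, scales[i + 1], w, h)
                # Center/surround activity test: stop at the largest
                # scale that is still locally uniform.
                if abs((v1 - v2) / (2.0 ** phi * key / s ** 2 + v1)) > eps:
                    break
                v_adapt = v1
                v1 = v2
            out[y][x] = scaled[y][x] / (1.0 + v_adapt)
    return out
```

Note the hard `break` when the activity test fails: adjacent pixels can select different scales, which is one plausible source of the banding the post describes; blending between adjacent scales instead of switching abruptly is a possible mitigation.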
