RE: [Algorithms] Has linear space HDR rendering doomed hardware AA as we know it?
From: <c.s...@ph...> - 2006-01-31 09:17:18
gregory, alan,

when I said MSAA is "useless" I meant this. With MSAA, before you can apply gamma correction or even saturate, you need to downsample first. So downsampling always comes before the nonlinear transfer, while it should be the other way 'round. (I'm assuming that saturation/gamma can't be done as last steps in the shader, for the sake of alpha blending or additive multipassing.)

I wouldn't use sRGB writes with float render targets. Why have a float target in the first place? sRGB reads are a different thing, though. Most albedo-like textures are perfectly fine at 32-bit resolution, assuming "sRGB-law" encoding.

robert,

in the light of such supersampling, you should even be able to bias the mip level by +1 or so.

-----Original Message-----
From: gda...@li... [mailto:gda...@li...] On Behalf Of Gregory Massal
Sent: Tuesday, January 31, 2006 9:46 AM
To: gda...@li...
Subject: Re: [Algorithms] Has linear space HDR rendering doomed hardware AA as we know it?

Hi Jonathan,

First, excuse my English, since it is obviously not my first language.

First I wrote:

> > texture mapping. Because of your non linear transmission, your
> > bilinear filtering is not linear anymore (anisotropic, mipmapping)

Then you wrote:

> Doesn't the SRGBTEXTURE sampler flag take care of this on modern hardware?

As far as I can tell, the SRGBTEXTURE sampler state is only there to address the issue of textures in sRGB space. So no, I don't think it was intended to address texturing with HDR rendering. It doesn't address other non-linearity problems, such as per-pixel lighting and shadow mapping, either. sRGB is a totally different problem (but maybe it ought to be discussed in a separate thread, if it hasn't been already).

> And of course it's not an issue if the textures themselves are authored
> in HDR (though that may not really be a good idea for most textures).

Hm. No. Textures in HDR space (as used for HDR environment cube maps, etc.)
will still suffer from the same problem, even if the hardware is capable of filtering them. It is not a problem with the authoring, but with the rendering itself.

> But I also don't understand why you're talking about linear filtering as
> something great that ought to be staunchly preserved. Linear filtering
> itself introduces a lot of artifacts of just the kind you mention.

I agree that texture filtering in hardware is not always so great. But we're talking worse than "not so great" here; it can totally defeat the purpose of it.

To illustrate this, I replaced one of the textures in the HDR lighting sample with a regular checkerboard texture. Here's what regular bilinear filtering is supposed to look like in LDR conditions with a simple linear transmission (no saturation):

http://www.massal.net/article/hdr/texturefiltering_ldr_linear.png

But then if I bring in some HDR lighting and some heavy compression in the bright areas:

http://www.massal.net/article/hdr/texturefiltering_hdrbright_regular.png

my soft edges don't look soft anymore.

Here is the result with the 16x post-supersampling I described in my previous post. That helps a little:

http://www.massal.net/article/hdr/texturefiltering_hdrbright_postsupersampling.png

Hope this is a bit clearer.

Grégory
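[Editor's note: the two problems discussed in this thread — the MSAA resolve averaging linear HDR samples before the nonlinear transfer, and Grégory's checkerboard losing its soft filtered edges under bright lighting — both come down to the fact that averaging does not commute with a nonlinear transfer. A short sketch with made-up numbers; the saturate-plus-gamma `encode` stands in for whatever tone-mapping curve is actually used:]

```python
# Sketch (hypothetical numbers, not from the original posts): why averaging
# in linear HDR space does not commute with the nonlinear display transfer.

def encode(linear, gamma=2.2):
    """Saturate, then gamma-encode: the nonlinear transfer to display space."""
    return min(max(linear, 0.0), 1.0) ** (1.0 / gamma)

# 1) MSAA resolve order. Two subsamples straddle an edge between a very
# bright surface (linear 4.0) and a dark one (0.0).
samples = [4.0, 0.0]
resolve_first = encode(sum(samples) / len(samples))            # average, then encode
encode_first = sum(encode(s) for s in samples) / len(samples)  # encode, then average
print(resolve_first)  # 1.0 -- the resolved edge pixel clips to full white
print(encode_first)   # 0.5 -- the mid-tone the edge antialiasing actually wants

# 2) Bilinear filtering of a checkerboard edge under bright HDR lighting.
# The filter produces a soft ramp from the black texel to the white one,
# but multiplying by a bright light (10x) clips almost all of it.
light = 10.0
ramp = [encode((i / 8) * light) for i in range(9)]
print(ramp)  # only the first step survives; every weight past 0.1 clips
             # to 1.0, so the "soft" edge looks hard again
```

This is the same reason Grégory's 16x post-supersampling only "helps a little": averaging many encoded samples approximates the right-hand ordering, but the underlying filtered ramp is still clipped before it is sampled.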