Re: [Celestia-developers] More about GLSL Error.
From: Pat S. <pa...@su...> - 2005-12-20 03:07:17
NAILED it!

Chris Laurel wrote:
> In the default solarsys.ssc, the specular mask for Earth is in the alpha
> channel of the base texture, whereas Pluto has a specular mask in a
> separate texture. So, different shaders will be used for Earth and Pluto.

Well, uncommenting the default line in solarsys.ssc that uses a separate
specular texture makes Earth work now too.

> I'm baffled. First of all, I think that texture2D should set all
> components of color even if the texture doesn't have an alpha channel. In
> older versions of OpenGL, there is a well-defined result when sampling
> from a texture with 'missing' channels. For example, when sampling from an
> RGB texture (no alpha channel), you'll always see an alpha value of 1.0.
> I've seen nothing in the spec suggesting that GLSL's texture2D()
> instruction behaves any differently.

After I did some very rudimentary debugging by setting Earth's colour to
either red, green, or blue based on if statements (how lame!), your e-mail
sparked an idea...

> Second, even if some strange value was read out of the alpha channel, it
> should not cause a crash. An alpha greater than 1.0 should present no
> problems at all. Have you tried substituting a constant > 1.0 for color.a?
> Something like this:
>
> gl_FragColor = color * diff + 2.0 * spec;

I thought that was the case, but trying this proved me wrong. The above ran
fine; the highlight just got really bright.

> gl_FragColor.rgb = color.rgb * diff.rgb + color.a * spec.rgb;
> gl_FragColor.a = diff.a;

I had actually tried something really similar to this on my own. First, I
split the original:

    gl_FragColor = color * diff + color.a * spec;

into two lines thusly:

    gl_FragColor = color * diff;
    gl_FragColor += color.a * spec;

That crashed. But running each of those lines separately (changing '+=' to
'=') ran fine! WTF?!

Your two lines worked *perfectly*. I even duplicated the exact original
meaning by changing the second line to:

    gl_FragColor.a = diff.a + spec.a;

That worked too!

Then I had a truly crazy idea. Remember how we had trouble in the past
setting a vector with int values? I thought maybe it wasn't being typecast
properly again. So, I went back to the original line and made a very
trivial addition:

    gl_FragColor = color * diff + float(color.a) * spec;

BINGO. I tested with various colours and intensities in the .ssc file. It
works.

> It sounds to me like something is very broken in the ATI driver.

Yes it does! I'm gonna give them shit over this one! The output from
texture2D() is very clearly a vec4 made of floats!

I'll check in this one-line change. In theory, it shouldn't affect anything
else, not even the binary output. Except on ATI cards.

I have meself a perfectly functional OpenGL 2.0 Render Path. And all it
took was one line, or 10 hours.

--Pat
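For readers following along, the solarsys.ssc change mentioned at the top
might look roughly like the sketch below. This is a hypothetical excerpt,
not the actual file contents: the texture file names and values are
invented and most properties (radius, orbit, and so on) are omitted, though
Texture, SpecularTexture, SpecularColor and SpecularPower are real .ssc
properties.

    "Earth" "Sol"
    {
        Texture         "earth.*"
        # The specular mask normally sits in the base texture's alpha
        # channel. Supplying a separate SpecularTexture (the previously
        # commented-out line) selects the shader variant that already
        # worked for Pluto:
        SpecularTexture "earth-spec.*"
        SpecularColor   [ 0.5 0.5 0.55 ]
        SpecularPower   25.0
    }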
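On the shader side, here is a minimal fragment-shader sketch of the failing
case and the one-line workaround; the uniform and varying names are
illustrative, not the identifiers from Celestia's generated shaders.

    uniform sampler2D diffTex;   // base texture, specular mask in alpha
    varying vec2 diffTexCoord;
    varying vec4 diff;           // diffuse illumination from the vertex shader
    varying vec4 spec;           // specular illumination from the vertex shader

    void main()
    {
        vec4 color = texture2D(diffTex, diffTexCoord);
        // This form crashed on the ATI driver discussed above:
        //     gl_FragColor = color * diff + color.a * spec;
        // Workaround: a redundant float() cast (color.a is already a float):
        gl_FragColor = color * diff + float(color.a) * spec;
    }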