I was experimenting with OpenGL recently, and
discovered a very strange behavior. For the longest
time I thought it was my code, but I now think it is a
bug in the graphics driver I'm using. (it happens to
I was working with code that did the following
(extraneous operations stripped):
(a) enable blending
(b) load a texture with an alpha channel
(c) main loop:
- glClear() to erase the buffer.
- draw a quad with the above texture.
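The steps above can be sketched roughly as follows. This is only an illustration, assuming a valid GL context already exists, that `texture_id` names an RGBA texture already uploaded with glTexImage2D, and that the usual glBlendFunc() call (not mentioned above) is also made:

```c
/* Illustrative sketch only; texture_id and the blend-func
 * choice are assumptions, not the author's exact code. */
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);                                 /* (a) enable blending   */
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, texture_id);           /* (b) the RGBA texture  */

for (;;) {                                          /* (c) main loop         */
    glClear(GL_COLOR_BUFFER_BIT);                   /* erase the buffer      */
    glBegin(GL_QUADS);                              /* draw a textured quad  */
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
    SDL_GL_SwapBuffers();                           /* assuming SDL 1.2      */
}
```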
Simple, right? Well, the texture was appearing
perfectly, but without alpha blending -- the background
color was not showing through the transparent portions.
Doing a glColor(1,1,1,0.5) before drawing the quad
worked as expected: the texture became half-transparent.
I wasn't loading the texture in a straightforward
manner (it involved parsing SDL surfaces and other
ugliness), and so I checked many times that my alpha
channel was intact when calling glTexImage2D. It was.
Finally, I tried disabling hardware acceleration --
and the image was blended perfectly onto the
background. At this point, I quit looking for a bug in
my code and started looking for a workaround.
After much experimentation, I discovered that placing
a call to glEnable(GL_BLEND) *within* the drawing loop
alleviated the problem. Further experimentation
revealed that it had a very predictable behavior: if I
placed glEnable() *before* the call to glClear(), I saw
the incorrect behavior described above. If I placed it
*after* the call to glClear(), the texture's alpha was
blended correctly.
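In other words, the only change between the broken and working versions is where glEnable(GL_BLEND) sits relative to glClear(). A sketch of the workaround loop, with a hypothetical draw_textured_quad() helper standing in for the quad-drawing code:

```c
for (;;) {
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_BLEND);       /* workaround: re-enable *after* glClear();
                                 placing this before glClear() reproduces
                                 the broken behavior on this driver        */
    draw_textured_quad();     /* hypothetical helper for the quad above    */
    SDL_GL_SwapBuffers();
}
```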
I cannot find any documentation which states that a
call to glClear() should reset blending. Furthermore,
I can't find any documentation of a state where texture
alphas are ignored, but alphas specified by glColor()
are handled correctly.
This may be some sort of card bug, but I think the
driver should be able to work around it. (of course, I
could also be misunderstanding the GL
specification/documentation :) )