From: Philip B. <ph...@bo...> - 2002-09-24 00:00:19
When a Riva TNT card is put into 24-bit depth with XFree86, it turns out it is actually put into 32-bits-per-pixel mode. Existing utah-glx code does not seem to do anything with that extra byte's worth of video memory; it treats the card as simple RGB (truecolor) rather than RGBA, or, as mesa/utah-glx defines 'pixel formats', PF_8R8G8B rather than PF_8A8B8G8R.

Can people suggest what we 'should' be doing with that extra byte? Is it being used already, I wonder...?

I think there is already hardware depth buffer support: we allocate video RAM for that. I'm not sure of the state of alpha blending, and I'm not sure whether triangle drawing has alpha support. On a pixel-blasting level, it seems to be done at the mesa level (and I don't fully understand the details there).

Anyway, can people suggest one way or the other: "yes, you should use the extra byte for alpha information", or possibly something else entirely? (TNT cards allegedly have support for hardware-accelerated alpha blending. Too bad we don't have specs on how it does it...)