From: Brian P. <bri...@tu...> - 2006-04-07 15:31:34
Lebsack, Eliot wrote:
> Good morning.
>
> I've been studying this issue off and on for 2 years now,
> and have not been able to resolve it, so I figured I'd ask here.
>
> I've been using Mesa within XFree86 and now xorg from RedHat
> Enterprise 3 and 4. By default, if I'm not using any special
> 3D hardware drivers, the glxinfo command will generally
> only provide contexts with 16-bit depth buffer support. I've tried
> to recompile XFree86 and Mesa so that it would use, say, 24 bits
> by changing the "DEFAULT_SOFTWARE_DEPTH_BITS" to 24 in the
> config.h file. However, this change does not propagate into the
> compiled and installed software.
>
> There is a reference in the FAQ to "DEPTH_BITS" being changed from
> 16 to 24 bits, but there is no further information. Furthermore,
> when I search the code base (XFree86 and Mesa), I see no reference
> to DEPTH_BITS actually affecting the compiled depth buffer precision.
>
> Has anyone else been able to increase their depth buffer
> precision from 16 to 24 bits? If so, what is the proper procedure for
> doing this at compile time?

Unfortunately, fixing this involves changing some server-side GLX code.
The file in question is in the X.org tree at

  xserver/xorg/GL/mesa/X/xf86glx.c

There's a list of default GLX visuals to use when there's no hardware
driver. We'd need to add a new visual or two with deeper Z buffers.

I don't have time for this right now. You could take a stab at it
though. I'm cc'ing Ian since he may be the last person who's worked
on that code.

-Brian
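A rough sketch of the kind of change described above, assuming a
simplified stand-in for the X server's __GLXvisualConfig structure:
the real field layout and default values in xf86glx.c should be
checked against the source before editing, and the struct, names,
and numbers below are illustrative assumptions only. The added third
entry shows the hypothetical deeper-Z visual.

    /* Illustrative sketch only -- not the actual xf86glx.c code.
     * The real table uses the X server's __GLXvisualConfig type;
     * this reduced struct mirrors just the fields relevant to the
     * thread so the idea compiles standalone. */
    #include <stdio.h>

    typedef struct {
        int rgba;          /* TrueColor-style RGBA visual      */
        int doubleBuffer;  /* back buffer present              */
        int bufferSize;    /* color depth in bits              */
        int depthSize;     /* Z buffer precision in bits       */
        int stencilSize;   /* stencil bits                     */
    } SoftwareVisual;      /* stand-in for __GLXvisualConfig   */

    /* Hypothetical software-rendering visual list: the first two
     * entries represent the existing 16-bit-Z defaults; the last
     * entry is the kind of addition suggested above, a visual
     * advertising a 24-bit depth buffer. */
    static SoftwareVisual softwareVisuals[] = {
        { 1, 0, 24, 16, 0 },   /* single-buffered, 16-bit Z     */
        { 1, 1, 24, 16, 0 },   /* double-buffered, 16-bit Z     */
        { 1, 1, 24, 24, 8 },   /* added: double-buffered, 24-bit Z */
    };

    int main(void)
    {
        size_t n = sizeof(softwareVisuals) / sizeof(softwareVisuals[0]);
        for (size_t i = 0; i < n; i++)
            printf("visual %zu: %d-bit depth, %d-bit stencil\n",
                   i, softwareVisuals[i].depthSize,
                   softwareVisuals[i].stencilSize);
        return 0;
    }

The sketch pairs the 24-bit depth buffer with an 8-bit stencil, since
the two are commonly packed together into a 32-bit depth/stencil
buffer, but that pairing is a choice, not something mandated by the
code being discussed.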