From: tom f. <tf...@al...> - 2009-11-18 23:24:20
Hi John, sorry this took so long... kind of fell off my radar.

John Wythe <bit...@gm...> writes:
> On Sat, Nov 7, 2009 at 2:46 PM, tom fogal <tf...@al...> wrote:
> > Hi John,
> >
> > John Wythe <bit...@gm...> writes:
> >> I am encountering different rendering behavior between two
> >> seemingly compatible Linux environments. [. . .] Below are links to
> >> screen-shots and troubleshooting information:
> >>
> >> Screenshots of the issue:
> >> http://lh6.ggpht.com/_mTZwuLfG_iE/SvTnUfC0eWI/AAAAAAAAAB0/SUeL9K7CPcU/s800/screenshots.jpeg
> >
> > These look (to me) like they might be Z-fighting issues.
> >
> > Is there any chance of requesting more resolution from the depth
> > buffer? You would normally do this when choosing your glX visual.
>
> I've never heard of Z-fighting, but I can guess what it is. Probably
> the only way I can get more depth from the buffer is to hack at the
> wine opengl.dll implementation, since all the GL code is in the
> legacy app.

Right.

> However, I would think that this would not be necessary, as it was
> not on my desktop environment. I suppose it is possible something
> else is increasing the depth buffer resolution on my desktop.

The spec is worded in such a way that it allows different
implementations to return any among a set of `compatible' buffers. As
an example, you might request a 16-bit depth buffer and get a 32-bit
depth buffer. Another implementation might actually give you the
16-bit depth buffer.

This can mask subtle bugs; an application might require a 24-bit depth
buffer, request a 16-bit buffer, and, through `luck', only ever be
tested on systems that give 32-bit depth buffers.

See the man page for `glXChooseVisual' for more information. This
information should be in the glX spec too, of course.

> >> Server environment information:
> >> http://docs.google.com/View?id=ddkkm9rx_2fvwmsdpt
> >>
> >> Desktop environment information:
> >> http://docs.google.com/View?id=ddkkm9rx_3dgj28nf4
> >
> > Unsurprisingly, your desktop X configuration is using XCB, probably
> > with its libX11 `emulation' of sorts, while your server
> > configuration does not have XCB.
>
> I did some reading about XCB before my initial message and figured it
> was a non-issue, since it seems to me like just a binding interface
> and an app would have to be written for it to use it; which wine must
> not be, since it does not require it.

This is not true; XCB has an emulation layer of sorts that translates
libX11 APIs to libXCB APIs.

> >> Instead I compiled Mesa using the xlib software driver. When using
> >> this libGL version the application continues to work just fine on
> >> my desktop.
> >
> > Are you absolutely certain you're using Mesa?
>
> I did not change my xorg.conf, only the LD_LIBRARY_PATH. The output
> of ldd glxinfo shows that the linker is using the Mesa build of libGL,
> and glxinfo says the renderer is the Mesa X11 OpenGL renderer. From
> what I understand so far, that means it's using Mesa.
>
> Without the LD_LIBRARY_PATH override, glxinfo instead says Nvidia is
> the renderer.

Sound logic, I think. To be absolutely certain, of course, it'd be
good to check how this works when you've got a `Driver' of "nv" in
your xorg.conf instead of "nvidia". `rmmod nvidia' if you can, too (it
seems to load itself automagically when needed anyway).

Cheers,

-tom
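
For reference, a minimal sketch of the two checks discussed above:
asking glXChooseVisual for at least 16 depth bits, querying how many
bits the returned visual actually provides, and printing which
renderer (Mesa, NVIDIA, ...) ends up answering GL calls. The attribute
list, the throwaway window, and the error handling are purely
illustrative and are not taken from the application in this thread.

/* depth_check.c -- compile with:  cc depth_check.c -lGL -lX11 */
#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (dpy == NULL) {
        fprintf(stderr, "cannot open display\n");
        return EXIT_FAILURE;
    }

    /* GLX_RGBA and GLX_DOUBLEBUFFER are boolean attributes;
     * GLX_DEPTH_SIZE is a *minimum*, not an exact request, so the
     * implementation may legally hand back 24 or 32 bits here. */
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 16, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (vi == NULL) {
        fprintf(stderr, "no matching visual\n");
        return EXIT_FAILURE;
    }

    int depth_bits = 0;
    glXGetConfig(dpy, vi, GLX_DEPTH_SIZE, &depth_bits);
    printf("asked for >= 16 depth bits, visual provides %d\n", depth_bits);

    /* An (unmapped) window and a current context are needed before
     * glGetString() returns anything meaningful. */
    XSetWindowAttributes swa;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                   vi->visual, AllocNone);
    swa.border_pixel = 0;
    Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0,
                               16, 16, 0, vi->depth, InputOutput,
                               vi->visual, CWBorderPixel | CWColormap, &swa);
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    glXMakeCurrent(dpy, win, ctx);

    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    XDestroyWindow(dpy, win);
    XFree(vi);
    XCloseDisplay(dpy);
    return EXIT_SUCCESS;
}

Run it once against the NVIDIA libGL and once with LD_LIBRARY_PATH
pointing at the Mesa build; the GL_RENDERER line shows which
implementation is actually being picked up, which is the same check
glxinfo performs.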