From: John W. <bit...@gm...> - 2009-11-08 01:55:39
On Sat, Nov 7, 2009 at 2:46 PM, tom fogal <tf...@al...> wrote:
> Hi John,
>
> John Wythe <bit...@gm...> writes:
>> I am encountering different rendering behavior between two
>> seemingly compatible Linux environments. [. . .] Below are links to
>> screenshots and troubleshooting information:
>>
>> Screenshots of the issue:
>> http://lh6.ggpht.com/_mTZwuLfG_iE/SvTnUfC0eWI/AAAAAAAAAB0/SUeL9K7CPcU/s800/screenshots.jpeg
>
> These look (to me) like they might be Z-fighting issues.
>
> Is there any chance of requesting more resolution from the depth
> buffer? You would normally do this when choosing your glX visual.

I've never heard of Z-fighting, but I can guess what it is. Probably
the only way I can get more depth buffer resolution is to hack at
Wine's opengl32.dll implementation, since all the GL code is in the
legacy app. However, I would think this should not be necessary, as it
was not on my desktop environment. I suppose it is possible something
else is increasing the depth buffer resolution on my desktop.

>> Server environment information:
>> http://docs.google.com/View?id=ddkkm9rx_2fvwmsdpt
>>
>> Desktop environment information:
>> http://docs.google.com/View?id=ddkkm9rx_3dgj28nf4
>
> Unsurprisingly, your desktop X configuration is using XCB, probably
> with its libX11 `emulation' of sorts, while your server
> configuration does not have XCB.

I did some reading about XCB before my initial message and figured it
was a non-issue, since it seems to be just a binding interface: an app
would have to be written against it to use it, and Wine must not be,
since it does not require it.

>> On my desktop I have an NVIDIA 8800GTS. To try and isolate the
>> problem, I wanted to force my desktop to use the software
>> renderer. For some unknown reason, setting LIBGL_ALWAYS_SOFTWARE=1
>> has no effect.
>
> You're probably using NVIDIA's driver.
> Actually, you almost definitely
> are, because the only other options are `nv' and `nouveau', and of
> course the Mesa `swrast' driver. The former can't do 3D, and
> `nouveau' will crash when used for 3D -- if you're lucky -- AFAICT
> (never tried it myself).
>
> If you're using NVIDIA's driver, none of Mesa's environment variables
> matter.

This makes complete sense now. It did not strike me initially that
using the NVIDIA driver removes Mesa from the rendering pipeline. But
that oversight is just me still learning the X architecture.

>> Instead I compiled Mesa using the xlib software driver. When using
>> this libGL version, the application continues to work just fine on
>> my desktop.
>
> Are you absolutely certain you're using Mesa?

I did not change my xorg.conf, only LD_LIBRARY_PATH. The output of
`ldd glxinfo` shows that the linker is using the Mesa build of libGL,
and glxinfo reports the Mesa X11 OpenGL renderer. From what I
understand so far, that means it is using Mesa. Without the
LD_LIBRARY_PATH override, glxinfo instead reports NVIDIA as the
renderer.

> I would recommend you remove any drivers your package manager
> supplies, as much as possible at least. This won't be fully possible
> on Ubuntu because the removal of all GL impls will make the package
> manager want to remove X, but at least remove all nvidia packages.

I'll try something like that if I get truly desperate, but I don't
wish to mess up my development environment. I wanted to get a third
machine involved to test on. I was going to use an Ubuntu image on
Amazon EC2, but for some reason, as soon as I call winetricks, the
server locks up hard and I have to shut it down from EC2. When I get a
chance, I might pursue something like this again.

>> The server, on the other hand, is a managed environment without root
>> access. The default version of libGL caused the application to
>> crash, which initially I thought was due to an older version of
>> Xvfb.
>> After learning much more about xorg, I came to realize that it was
>> not the version of Xvfb that made things marginally work, but rather
>> the libGL version that was built as a result of building Xvfb/Mesa.
>> Now I am only building Mesa and libXmu on the server and using the
>> older Xvfb.
>
> CentOS, IMHO, is trash. Everything's too damn old on it; for software
> I work on, we're always hitting things like old compilers not
> accepting valid templates or similar. If you can update the
> toolchain, I would recommend as much. Or better yet, put a Debian
> stable / Ubuntu LTS / hell even openSUSE on the machine and save
> yourself the pain.

Yeah, but unfortunately there is nothing I can do; CentOS it has to be
for now, since it's a managed server. I'll try building my own private
toolchain on the server, but I did not notice any errors during the
Mesa build. I might save the build log and look closer.

>> On the server experiencing the problem, I have set MESA_DEBUG=FP to
>> try and get some debug information. I also tried to set
>> LIBGL_DEBUG=verbose, but that seems to have no effect on either
>> machine. Two messages were encountered at various times -only- on
>> the server:
>>
>> "Mesa warning: couldn't open libtxc_dxtn.so, software DXTn
>> compression/decompression unavailable"
>> and
>> "Mesa warning: XGetGeometry failed!"
>>
>> I downloaded the libtxc_dxtn from
> [snip]
>
> I would not worry about it. Mesa will give that warning regardless of
> whether or not compressed textures are actually used. I encounter
> very few apps that actually use them (I suppose games would use them
> frequently, though?), and in any case the image you sent makes me
> think there's no texturing at all in your app, anyway.

I assumed as much, but just wanted to make sure.
>> Overall these two machines are:
>> * Using the same version of Mesa
>> * Both using software rendering
>> * Both using the same version of Wine
>>
>> Which leads me to believe this must be a subtle dependency problem,
>> either at runtime or build time. At this point, though, I would have
>> no idea what could affect the rendering in such a way.
>
> My best guess is XCB/X11. Try configuring Mesa with --enable-xcb. If
> the app is threaded, --enable-glx-tls is probably a good idea as
> well.
>
> Beyond that, my guess is issues with the ancient toolchain provided
> by CentOS.
>
> You might consider OSMesa for this use case as well. Though, I guess
> without source to the application, your only option would be to hack
> OSMesa into Wine.

Thanks Tom. I'll take a look into these things. I guess I will have to
understand XCB better, and how Wine might use it implicitly or
explicitly. These are definitely good ideas to try that I would not
have thought of.

Cheers,
John