I have a VirtualGL setup (VirtualGL v2.2.2-20110421) configured by bumblebee (https://github.com/MrMEEE/bumblebee) so I can use the nvidia card on my nvidia-optimus setup, and generally it works just great.
However, there are two particular wine apps that crash shortly after startup with an X Error (BadMatch - invalid parameter attributes - X_GLXMakeContextCurrent) when I use the VirtualGL server.
The error doesn't occur when I don't use VirtualGL, i.e. either when I just use the Intel card or when I use another PC with a normal nvidia setup.
The error happens when wine calls glXMakeCurrent:
trace:wgl:X11DRV_wglMakeCurrent make current for dis 0x7d24cb18, drawable 0x6000002, ctx 0xf3e4bce0
[VGL] glXMakeCurrent (dpy=0x7d24cb18(:0) drawable=0x06000002 ctx=0xf3e4bce0
and the call is coming from wine's context_set_pixel_format in dlls/wined3d/context.c (partial wine source attached).
The failure happens when the app tries to change the pixel format while there is already a current context, so commenting out the call to GL_EXTCALL(wglSetPixelFormatWINE(dc, format, NULL)) in wine avoids the problem.
Is this a known problem? Is it because OpenGL doesn't allow the pixel format adjustment, as the comment says in context.c? Could it be handled in VirtualGL?
wine code for context_set_pixel_format
Yeah, I think VirtualGL needs to handle it somehow, but I need to figure out exactly what's going on first. Can you attach an example of an app that causes the failure? (Just the Windows binary is fine.) You can also e-mail it to me if it's not something you want posted (my e-mail address is on VirtualGL.org).
Can you please attach an example of an app that demonstrates this failure?
The only apps I know of that experience this are Crysis and Crysis 2. Sorry, I can't attach those, but I can probably generate debug information from them if you let me know what might be of interest.
There are more complete logs at http://bugs.winehq.org/show_bug.cgi?id=27169 in case they help.
PS. Sorry for the delay in responding: I thought I would get emails from the tracker when comments were added, but I didn't.
Is there any way you could attach a log containing both the WINE trace information interleaved with VirtualGL trace information (from running vglrun +v +tr)? That would be helpful in seeing which calls in VirtualGL correspond to the trace entries from WINE.
As far as the comment on winehq.org that VirtualGL doesn't handle the recent GL extensions correctly, that's a red herring. The design of VirtualGL is such that it doesn't have to explicitly handle most GL extensions. It uses direct rendering, so once the OpenGL context is established in the Pbuffer, the GL extensions "just work" without any effort on our part.
GLX extensions are another matter. Those indeed do have to be specially handled by VirtualGL. However, the only one that WINE uses that we don't support is GLX_EXT_texture_from_pixmap, and WINE seems to do a reasonable job of falling back if that extension isn't available.
The BadMatch error is being generated whenever VirtualGL calls the "real" glXMakeContextCurrent() function from within the body of its "fake" glXMakeCurrent() function. It's unclear why the error is being generated. I can make some progress on that if I can track down the drawable ID and context ID and see what parameters were used to create those.
Log with wine +wgl,d3d and vgl trace
Here's one I prepared earlier :)
It has wgl and d3d tracing from wine interspersed with VirtualGL tracing. I think I enabled the VirtualGL tracing with an environment variable rather than vglrun +v +tr, so it may not be verbose enough. If not, I can probably generate another (it's just that the computer using VirtualGL is currently several thousand miles away).
I believe I may have worked around it in the 2.2 stable branch (CVS tag stablebranch_2_2). The basic issue is that VirtualGL wasn't handling cases in which the application would map a context to a drawable, then map a new context with different visual properties to the same drawable. Personally, I think this is rather weird behavior-- theoretically, at least, the visual of the context should match the visual of the window it's being applied to, and I'm surprised that WINE is able to get away with this without encountering a BadMatch error in other GLX implementations as well. It may be that it's behaving this way only when it is running in VGL, because VGL's visual matching system is returning the same visual ID for two different FB configs. <sigh> I really wish I could invent a time machine so I could go back and convince the people who designed GLX to skip 1.1 and go straight to 1.3. While I'm at it, I think I'll buy some Google stock.
The workaround was to make VirtualGL check the FB config of the context in glXMake[Context]Current() and, if it doesn't match the FB config of the Pbuffer corresponding to the drawable that is passed to glXMake[Context]Current(), then VGL destroys the Pbuffer, creates a new one with the new FB config, and maps the new Pbuffer to the X11 drawable. If WINE tries to change the FB config in this manner once the application has started, you may get brief visual errors as VGL changes the Pbuffer. I'm hoping that they only do this during initialization, though, in which case it should work OK.
Hey, that's great. I tried to test it but I get a bunch of errors trying to build vgl, like:
/usr/bin/ld: skipping incompatible ../linux/lib/libstdc++.a when searching for -lstdc++
/usr/bin/ld: skipping incompatible /usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/libstdc++.so when searching for -lstdc++
and:
make[3]: *** No rule to make target `../linux/lib/libfbxv.a', needed by `../linux/lib/librrfaker.so'. Stop.
I've installed the build dependencies for 64-bit Ubuntu according to BUILDING.txt, including the 32-bit libraries via getlibs, but clearly I'm doing something wrong.
You're missing the 32-bit libstdc++ development package, or possibly the 32-bit GCC package (some distros package libstdc++.a with GCC.)
You're also probably missing the 32-bit X Video development package.
I tried getlibs -p for the packages libstdc++6-4.4-dev, libstdc++6-4.5-dev, and libxv-dev, but I get the same error, even after a make clean. Synaptic doesn't show any dev packages for gcc. "make M32=yes" generates the linux/lib/libstdc++.a file but it is later rejected as incompatible.
Just in case, I also tried installing all the gcc packages I could see in Synaptic with getlibs -p, but no luck.
Actually, I went through this same discussion with the guy who generates the bumblebee packages, and I forget now what the resolution was. For now, I've generated a new pre-release build for you:
http://www.virtualgl.org/DeveloperInfo/PreReleases
Thanks, the new release does fix the problem! I wonder what the build issue is with ubuntu.
Just to check, did the change make it into subversion? I tried the latest version of virtualgl from bumblebee ( 2.2.2svn-8~natty, dated 21 July) and it didn't have the fix.
Urrr... We do not use subversion. We use CVS (at least for the moment-- investigating moving to git.) The change is in our repository, but apparently whoever is packaging bumblebee uses their own repository, and it hasn't been synced.
Also, I think the 32-bit package you were looking for earlier that had the correct libstdc++.a is g++-{version}-multilib.
"Urrr... We do not use subversion." Oops, I knew that!
I tried getlibs -p for both g++-4.5-multilib and g++-4.4-multilib but I still get the incompatible libstdc++ error.
I don't think you use getlibs for those packages. I think you just install them.
I tried just installing them too, but without any luck.
Not sure, then. What version of Ubuntu are you based on? I've only tested 9.04 and 10.04.
I'm using 11.04.