see http://stackoverflow.com/questions/10065849/memory-leak-using-glxcreatecontext for more information :)
We don't really support running VirtualGL with Mesa as the underlying OpenGL stack. Not sure why that would be useful, since VirtualGL's raison d'etre is to provide remote hardware-accelerated OpenGL.
That being said, however, I did examine the VirtualGL code and can't determine why valgrind would be reporting a memory leak. As soon as VirtualGL saves the first GLX FB config value returned from glXChooseFBConfig(), it destroys the table using XFree(), so if there's a leak there, I can only assume that it is a bug in Mesa. No one else has reported this when using OpenGL stacks that we support (such as those provided by nVidia and ATI.)
Further, I'm not sure why you would post a VirtualGL-specific question on StackOverflow. There is no one on that site who understands the system better than its developers do.
I happen to have both the Mesa and nVidia implementations of OpenGL running here, so I will give it a try with the nVidia drivers to confirm that the bug is on Mesa's side, and I will let you know the result. I understand that Mesa support makes little sense.
Concerning Stack Overflow, I posted my question there before reporting a bug, as I initially suspected that my own code was wrong.
Any word on this?
Closed due to lack of user response. Assumed to be an issue with Mesa, not VGL. If this leak can be reproduced using a hardware-accelerated OpenGL stack (nVidia or ATI), then please re-open the issue.