VirtualGL with X11 proxy

Help
2011-12-14
2013-11-26
  • Dan Kulinski
    2011-12-14

    I have begun testing VirtualGL with our NX machine.  VirtualGL appears to be a tool that will be very handy as we move more and more into visualization.  Currently, however, I am failing to get it to work in a certain scenario.

    Our work flow is generally this:
    Log into NX
    SSH to an interactive workstation
    Set display to NX display
    Start application
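    In shell terms, that workflow looks roughly like this (hostnames, the display number, and the application name are all hypothetical placeholders):

    ```shell
    # From inside the NX session: hop to an interactive workstation.
    ssh user@workstation

    # Point X11 output back at the NX session's proxy display
    # ("nxhost:1001" is illustrative; the display number varies per session).
    export DISPLAY=nxhost:1001

    # Start the application; it renders to the NX proxy over X11.
    some_app
    ```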

    Under this scenario, a couple of tools don't work because NX does not provide the OpenGL functions they need.  So we decided to turn to VirtualGL.

    Here is the work flow that I have tried:
    Log into NX server
    Set VGL_DISPLAY to nxserver:0 (it has an ATI FireGL card installed with ATI drivers)
    vglconnect -x user@workstation
    vglrun -d <nx display> -c proxy application (vglrun doesn't automatically use the DISPLAY environment variable)

    At this point I get the error message:
    c001:~> /opt/VirtualGL/bin/vglrun +v -d bear:1001 -c proxy /opt/VirtualGL/bin/glxspheres
    Polygons in scene: 62464
    Shared memory segment ID for vglconfig: 6422530
    VirtualGL v2.3 32-bit (Build 20111213)
    Opening local display bear:1001
    WARNING: VirtualGL attempted and failed to obtain a Pbuffer-enabled
        24-bit visual on the 3D X server bear:1001.  This is normal if
        the 3D application is probing for visuals with certain capabilities,
        but if the app fails to start, then make sure that the 3D X server is
        configured for 24-bit color and has accelerated 3D drivers installed.
    ERROR (596): Could not obtain RGB visual with requested properties

    It seems as if it is trying to open a context using the OpenGL capabilities of the NX proxy?

    If I set it to bear:0, the root display, I get access-denied errors.

    I'd appreciate any help figuring out either what I am misunderstanding or whether I have an error in the setup.  I have run the vglserver_config script on the NX server and restarted the display manager.  I can't find any more troubleshooting information and now need help.

    Thanks,
      Dan Kulinski

  • Dan Kulinski
    2011-12-14

    I have found my error: the display value for vglrun needed to be bear:0.  After making that change, everything works correctly.
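    For reference, the corrected form of the command from the transcript above would be something like:

    ```shell
    # -d now points at bear:0, the workstation's local 3D X server (where the
    # FireGL card lives); DISPLAY still carries the NX session's 2D display.
    /opt/VirtualGL/bin/vglrun +v -d bear:0 -c proxy /opt/VirtualGL/bin/glxspheres
    ```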

    Dan Kulinski

  • DRC
    2011-12-14

    The default value of VGL_DISPLAY is :0, so it probably would have worked had you not tried to change VGL_DISPLAY.  Not sure why you tried to override that value.  The documentation explains the difference between the "2D X server" and the "3D X server."  VGL_DISPLAY is used to specify the 3D X server, and the only reason why you would ever need to override the default value of :0 is if you are running multiple 3D X servers on the same machine (presumably because you have multiple 3D graphics cards).
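    A minimal sketch of the distinction, with illustrative display names (the 2D display number and the second 3D X server are assumptions for the example):

    ```shell
    # 2D X server: where the application's windows appear (the NX proxy display).
    export DISPLAY=bear:1001

    # 3D X server: where VirtualGL performs the OpenGL rendering.  It defaults
    # to :0, so setting it is only needed when the machine runs multiple 3D X
    # servers (e.g. one per GPU); ":1" here is purely hypothetical.
    export VGL_DISPLAY=:1

    vglrun -c proxy glxspheres
    ```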

  • Dan Kulinski
    2011-12-14

    I have used so many different technologies for remote display that I made a fatal backwards assumption about how the vglclient worked.  I have since worked it out.  It was one of those moments where I was looking at your very well-made documentation and was struck with the clue bat.  For some reason I had switched the roles of the VGL server and the client in my head.  Since most systems send data to the X proxy by default, I had assumed that the VGL server was going to be on the proxy.

    Also, in our situation, our compute nodes have no OpenGL hardware acceleration, and that is where my testing was being done.  My expectation was that I could offload the OpenGL over the network to a machine with hardware acceleration.  That isn't the case either, which further colored my interpretation of the software and documentation.

    I have now set this up correctly and am seeing glxspheres performance increases of nearly 600%.  Now I just need to pass this on to an actual user and have them test the software.

    Thanks for the feedback, it is appreciated.

    Dan Kulinski