virtualgl-users Mailing List for VirtualGL (Page 3)
3D Without Boundaries
Brought to you by:
dcommander
| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2007 | 1 | 5 | 7 | 7 | 1 | 10 | 5 | 4 | 16 | 2 | 8 | 3 |
| 2008 | 6 | 18 | 6 | 5 | 15 | 6 | 23 | 5 | 9 | 15 | 7 | 3 |
| 2009 | 22 | 13 | 15 | 3 | 19 | 1 | 44 | 16 | 13 | 32 | 34 | 6 |
| 2010 | 5 | 27 | 28 | 29 | 19 | 30 | 14 | 5 | 17 | 10 | 13 | 13 |
| 2011 | 18 | 34 | 57 | 39 | 28 | 2 | 7 | 17 | 28 | 25 | 17 | 15 |
| 2012 | 15 | 47 | 40 | 15 | 15 | 34 | 44 | 66 | 34 | 8 | 37 | 23 |
| 2013 | 14 | 26 | 38 | 27 | 33 | 67 | 14 | 39 | 24 | 59 | 29 | 16 |
| 2014 | 21 | 17 | 21 | 11 | 10 | 2 | 10 | | 23 | 16 | 7 | 2 |
| 2015 | 7 | | 26 | | 2 | 16 | 1 | 5 | 6 | 10 | 5 | 6 |
| 2016 | | 6 | | 2 | | 6 | 5 | | 17 | 6 | 2 | 4 |
| 2017 | 3 | 25 | 4 | 3 | 4 | 10 | 1 | 8 | | | | |
From: Jason K. <tek...@gm...> - 2017-02-15 18:02:16
|
> > Message: 1
> > Date: Tue, 14 Feb 2017 23:03:15 -0500
> > From: Wei Liu <wei...@gm...>
> > Subject: [VirtualGL-Users] server have multi GPUs not connected to
> > physical screen
> >
> > [question about configuring VirtualGL to use headless Nvidia GPUs on
> > an Ubuntu 14.04 server; the full message appears later in this page]

First off, I would recommend the following setup:

Ubuntu machine: install VirtualGL, TurboJPEG, Synergy, Xpra
Windows machine: install Synergy, Xpra

This setup will allow you to run a full 3D desktop or just the 3D
applications you need. Here are the two scripts I use. To start the server
it's just one command:

#startserver
xpra start --max-size=1280x720 --start-child=start3dprogram \
  --speaker=disabled --bind-tcp=192.168.2.1:1000

#start3dprogram
vglclient -detach
vglrun -q 50 -c rgb /opt/bin/3dprogram

Then use Synergy to connect the keyboard. Then you can just connect from
the Windows client. With this setup you have clipboard, audio, printing,
and 3D GL support, and you save resources by running just the 3D app. The
configuration is also all GUI front end through Xpra and Synergy.
|
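[Editor's note: for completeness, the Windows side of this setup would connect with something like the following. This is a sketch; the address and port are the ones from the hypothetical script above, and the `tcp:host:port` form is the one accepted by xpra releases of that era.]

```shell
# On the Windows client, attach to the xpra session exported by the server
# (the address and port are assumptions carried over from the example above):
xpra attach tcp:192.168.2.1:1000
```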
|
From: DRC <dco...@us...> - 2017-02-15 17:44:06
|
That is all normal. The rmmod error is telling you that the nVidia
module is in use, which is why the subsequent message tells you to run
'rmmod nvidia' with the DM stopped in order to remove the module. That
is only necessary if you want to activate VGL without rebooting. Since
you rebooted, you're good.
The TurboVNC display will not have a GLX extension. That's why
VirtualGL exists. VirtualGL will redirect GLX calls from an OpenGL
application to display :0, which has a hardware-accelerated GLX extension.
Connect to the TurboVNC session and run
vglrun {your_application}
and it should work.
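[Editor's note: a quick sanity check of that redirection, run inside the TurboVNC session. This is a sketch; the exact renderer string depends on the installed driver.]

```shell
# Without VirtualGL: the proxy's software GLX, or an error as seen here.
glxinfo | grep -i "opengl renderer"
# With VirtualGL: GLX calls are redirected to display :0, so the
# hardware renderer string should appear.
vglrun glxinfo | grep -i "opengl renderer"
```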
On 2/15/17 11:38 AM, Wei Liu wrote:
> Darrel,
>
> Thanks for the information. I followed your suggestion and created an
> xorg.conf file. Then I stopped the display manager and ran
> 'vglserver_config'. After answering the questions, I got the following error:
>
> ====== vglserver_config output =======
>
> .. Creating /etc/modprobe.d/virtualgl.conf to set requested permissions for
> /dev/nvidia* ...
>
> ... Attempting to remove nvidia module from memory so device permissions
> will be reloaded ...
>
> rmmod: ERROR: Module nvidia is in use by: nvidia_modeset nvidia_uvm
>
> ... Granting write permission to /dev/nvidia-uvm /dev/nvidia-uvm-tools
> /dev/nvidia0 /dev/nvidia1 /dev/nvidia2 /dev/nvidia3 /dev/nvidia4
> /dev/nvidia5 /dev/nvidia6 /dev/nvidia7 /dev/nvidiactl for all
> users ...
>
> ... Granting write permission to /dev/dri/card0 for all users ...
>
> ... /etc/X11/xorg.conf has been saved as /etc/X11/xorg.conf.orig.vgl ...
>
> ... Modifying /etc/X11/xorg.conf to enable DRI permissions
> for all users ...
>
> ... /etc/lightdm/lightdm.conf has been saved as
> /etc/lightdm/lightdm.conf.orig.vgl ...
>
> ... Adding display-setup-script=xhost +LOCAL: to
> /etc/lightdm/lightdm.conf ...
>
> Done. You must restart the display manager for the changes to take
> effect.
>
> IMPORTANT NOTE: Your system uses modprobe.d to set device permissions.
> You must execute rmmod nvidia with the display manager stopped in order
> for the new device permission settings to become effective.
>
> ===== end of vglserver_config =======
>
>
> I chose 'no' to all the questions, since I don't worry about security in
> our environment.
>
>
> Then I installed TurboVNC and ran vncserver to get a new VNC session on
> display :3.
>
>
> If I run DISPLAY=:0 glxinfo, I can see direct rendering is yes and the
> GLX vendor string is Nvidia, but if I run DISPLAY=:3 glxinfo, I get the
> error "could not find RGB GLX visual or fbconfig".
>
>
> xdpyinfo -display :3 shows the vendor string is TurboVNC...
>
>
> I also rebooted the system, but got the same results. I attached the
> output logs of running xdpyinfo and glxinfo on displays :0 and :3
> respectively (the file names contain 0 and 3 for the display number).
>
>
> Did I miss anything? Where should I start troubleshooting?
>
>
> Thanks for the help!
>
>
>
>
>
> On Tue, Feb 14, 2017 at 11:53 PM, DRC <dco...@us...
> <mailto:dco...@us...>> wrote:
>
> You can use headless GPUs with VirtualGL. Install the nVidia
> proprietary drivers (I believe Ubuntu provides a package for these),
> then run:
>
> nvidia-xconfig -a --use-display-device=None --virtual=1920x1200
>
> (I don't think the resolution specified for --virtual matters, since
> it's headless.)
>
> Run vglserver_config to grant access to the X server while the display
> manager is sitting at the login prompt, then restart the display
> manager, and ideally you should be able to do
>
> xauth merge /etc/opt/VirtualGL/vgl_xauth_key
> DISPLAY=:0 glxinfo
>
> and glxinfo should report that it is using the nVidia OpenGL renderer.
>
> On 2/14/17 10:03 PM, Wei Liu wrote:
> > Hi VirtualGL users,
> >
> > I need your help on a set up of virtualGL, or decide if I can use
> > virtualGL to solve my problem.
> >
> > My setup: an Ubuntu 14.04 server called 'roundvalley' has multiple Nvidia
> > GPUs for parallel computing, but they do not have any
> > VGA/DVI/DisplayPort and not connected to monitor. The only onboard-card
> > connected to a monitor does not have 3D capability, and Ubuntu
> > automatically load a kernel driver module (I forgot the module name) for
> > this onboard card.
> >
> > Client side is Windows. My goal is to use VNC to connect to the server
> > and run 3D application (need OpenGL) in my VNC session.
> >
> > My question is, can I configure virtualGL to use any of Nvidia GPU even
> > they are not used for display currently?
> >
> > I read the official document and found:
> > ==== user doc =====
> > 6.2 Using VirtualGL with Multiple GPUs
> >
> > VirtualGL can redirect the OpenGL commands from a 3D application to any
> > GPU in the server machine. In order for this to work, however, all of
> > the GPUs must be attached to different screens on the same X server or
> > to different X servers.
> > ====== end user doc ======
> >
> > Does it mean the GPU need to connect to physical screens, or some
> > definitions in X.org file?
> >
> > I appreciate your help.
> >
> > Thanks,
> > Wei
>
>
> ------------------------------------------------------------------------------
> Check out the vibrant tech community on one of the world's most
> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
> _______________________________________________
> VirtualGL-Users mailing list
> Vir...@li...
> <mailto:Vir...@li...>
> https://lists.sourceforge.net/lists/listinfo/virtualgl-users
> <https://lists.sourceforge.net/lists/listinfo/virtualgl-users>
|
|
From: Wei L. <wei...@gm...> - 2017-02-15 17:38:30
|
Darrel,
Thanks for the information. I followed your suggestion and created an
xorg.conf file. Then I stopped the display manager and ran
'vglserver_config'. After answering the questions, I got the following error:
====== vglserver_config output =======
.. Creating /etc/modprobe.d/virtualgl.conf to set requested permissions for
/dev/nvidia* ...
... Attempting to remove nvidia module from memory so device permissions
will be reloaded ...
rmmod: ERROR: Module nvidia is in use by: nvidia_modeset nvidia_uvm
... Granting write permission to /dev/nvidia-uvm /dev/nvidia-uvm-tools
/dev/nvidia0 /dev/nvidia1 /dev/nvidia2 /dev/nvidia3 /dev/nvidia4
/dev/nvidia5 /dev/nvidia6 /dev/nvidia7 /dev/nvidiactl for all users ...
... Granting write permission to /dev/dri/card0 for all users ...
... /etc/X11/xorg.conf has been saved as /etc/X11/xorg.conf.orig.vgl ...
... Modifying /etc/X11/xorg.conf to enable DRI permissions
for all users ...
... /etc/lightdm/lightdm.conf has been saved as
/etc/lightdm/lightdm.conf.orig.vgl ...
... Adding display-setup-script=xhost +LOCAL: to /etc/lightdm/lightdm.conf ...
Done. You must restart the display manager for the changes to take effect.
IMPORTANT NOTE: Your system uses modprobe.d to set device permissions. You
must execute rmmod nvidia with the display manager stopped in order for the
new device permission settings to become effective.
===== end of vglserver_config =======
I chose 'no' to all the questions, since I don't worry about security in
our environment.
Then I installed TurboVNC and ran vncserver to get a new VNC session on
display :3.
If I run DISPLAY=:0 glxinfo, I can see direct rendering is yes and the GLX
vendor string is Nvidia, but if I run DISPLAY=:3 glxinfo, I get the error
"could not find RGB GLX visual or fbconfig".
xdpyinfo -display :3 shows the vendor string is TurboVNC...
I also rebooted the system, but got the same results. I attached the output
logs of running xdpyinfo and glxinfo on displays :0 and :3 respectively
(the file names contain 0 and 3 for the display number).
Did I miss anything? Where should I start troubleshooting?
Thanks for the help!
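[Editor's note on terminology: vncserver created X display :3 here, not "port 3"; by RFB convention the VNC TCP port is 5900 plus the display number. A tiny sketch of the mapping:]

```shell
# VNC display :N listens on TCP port 5900 + N,
# so the session above (display :3) is reachable on TCP port 5903.
vnc_port() { echo $((5900 + $1)); }
vnc_port 3   # prints 5903
```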
On Tue, Feb 14, 2017 at 11:53 PM, DRC <dco...@us...>
wrote:
> You can use headless GPUs with VirtualGL. Install the nVidia
> proprietary drivers (I believe Ubuntu provides a package for these),
> then run:
>
> nvidia-xconfig -a --use-display-device=None --virtual=1920x1200
>
> (I don't think the resolution specified for --virtual matters, since
> it's headless.)
>
> Run vglserver_config to grant access to the X server while the display
> manager is sitting at the login prompt, then restart the display
> manager, and ideally you should be able to do
>
> xauth merge /etc/opt/VirtualGL/vgl_xauth_key
> DISPLAY=:0 glxinfo
>
> and glxinfo should report that it is using the nVidia OpenGL renderer.
>
> On 2/14/17 10:03 PM, Wei Liu wrote:
> > Hi VirtualGL users,
> >
> > I need your help on a set up of virtualGL, or decide if I can use
> > virtualGL to solve my problem.
> >
> > My setup: an Ubuntu 14.04 server called 'roundvalley' has multiple Nvidia
> > GPUs for parallel computing, but they do not have any
> > VGA/DVI/DisplayPort and not connected to monitor. The only onboard-card
> > connected to a monitor does not have 3D capability, and Ubuntu
> > automatically load a kernel driver module (I forgot the module name) for
> > this onboard card.
> >
> > Client side is Windows. My goal is to use VNC to connect to the server
> > and run 3D application (need OpenGL) in my VNC session.
> >
> > My question is, can I configure virtualGL to use any of Nvidia GPU even
> > they are not used for display currently?
> >
> > I read the official document and found:
> > ==== user doc =====
> > 6.2 Using VirtualGL with Multiple GPUs
> >
> > VirtualGL can redirect the OpenGL commands from a 3D application to any
> > GPU in the server machine. In order for this to work, however, all of
> > the GPUs must be attached to different screens on the same X server or
> > to different X servers.
> > ====== end user doc ======
> >
> > Does it mean the GPU need to connect to physical screens, or some
> > definitions in X.org file?
> >
> > I appreciate your help.
> >
> > Thanks,
> > Wei
>
>
>
|
|
From: DRC <dco...@us...> - 2017-02-15 04:53:21
|
You can use headless GPUs with VirtualGL. Install the nVidia
proprietary drivers (I believe Ubuntu provides a package for these),
then run:
nvidia-xconfig -a --use-display-device=None --virtual=1920x1200
(I don't think the resolution specified for --virtual matters, since
it's headless.)
Run vglserver_config to grant access to the X server while the display
manager is sitting at the login prompt, then restart the display
manager, and ideally you should be able to do
xauth merge /etc/opt/VirtualGL/vgl_xauth_key
DISPLAY=:0 glxinfo
and glxinfo should report that it is using the nVidia OpenGL renderer.
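[Editor's note: since the original question involved multiple GPUs, it may help to spell out that `nvidia-xconfig -a` configures every GPU, each as its own X screen on display :0, and VirtualGL selects the GPU by screen. A sketch; the screen numbers are illustrative:]

```shell
# Pick the GPU by naming the X screen that nvidia-xconfig created for it:
vglrun -d :0.0 glxinfo | grep "OpenGL renderer"   # GPU behind screen 0
vglrun -d :0.1 glxinfo | grep "OpenGL renderer"   # GPU behind screen 1
# Equivalent environment-variable form:
VGL_DISPLAY=:0.1 vglrun glxinfo
```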
On 2/14/17 10:03 PM, Wei Liu wrote:
> Hi VirtualGL users,
>
> I need your help on a set up of virtualGL, or decide if I can use
> virtualGL to solve my problem.
>
> My setup: an Ubuntu 14.04 server called 'roundvalley' has multiple Nvidia
> GPUs for parallel computing, but they do not have any
> VGA/DVI/DisplayPort and not connected to monitor. The only onboard-card
> connected to a monitor does not have 3D capability, and Ubuntu
> automatically load a kernel driver module (I forgot the module name) for
> this onboard card.
>
> Client side is Windows. My goal is to use VNC to connect to the server
> and run 3D application (need OpenGL) in my VNC session.
>
> My question is, can I configure virtualGL to use any of Nvidia GPU even
> they are not used for display currently?
>
> I read the official document and found:
> ==== user doc =====
> 6.2 Using VirtualGL with Multiple GPUs
>
> VirtualGL can redirect the OpenGL commands from a 3D application to any
> GPU in the server machine. In order for this to work, however, all of
> the GPUs must be attached to different screens on the same X server or
> to different X servers.
> ====== end user doc ======
>
> Does it mean the GPU need to connect to physical screens, or some
> definitions in X.org file?
>
> I appreciate your help.
>
> Thanks,
> Wei
|
|
From: Wei L. <wei...@gm...> - 2017-02-15 04:03:22
|
Hi VirtualGL users,

I need your help with a setup of VirtualGL, or to decide whether I can use
VirtualGL to solve my problem.

My setup: an Ubuntu 14.04 server called 'roundvalley' has multiple Nvidia
GPUs for parallel computing, but they do not have any VGA/DVI/DisplayPort
outputs and are not connected to a monitor. The only onboard card connected
to a monitor does not have 3D capability, and Ubuntu automatically loads a
kernel driver module (I forgot the module name) for this onboard card.

The client side is Windows. My goal is to use VNC to connect to the server
and run 3D applications (which need OpenGL) in my VNC session.

My question is: can I configure VirtualGL to use any of the Nvidia GPUs
even though they are not currently used for display?

I read the official documentation and found:

==== user doc =====
6.2 Using VirtualGL with Multiple GPUs

VirtualGL can redirect the OpenGL commands from a 3D application to any GPU
in the server machine. In order for this to work, however, all of the GPUs
must be attached to different screens on the same X server or to different
X servers.
====== end user doc ======

Does this mean the GPUs need to be connected to physical screens, or is it
a matter of definitions in the X.org config file?

I appreciate your help.

Thanks,
Wei
|
|
From: Faiz A. <fab...@vt...> - 2017-01-26 14:59:05
|
Ah, yeah, that makes sense. Thanks!

On Wed, Jan 25, 2017 at 2:49 PM, DRC <dco...@us...> wrote:
> Without a display manager? Yes-- simply use startx to start a local
> (server-side) X server on the GPU, then use `xhost +LOCAL:` to grant
> access for that X server to the local machine (or share the XAuth token
> with other users who need to access it.) vglserver_config essentially
> automates the latter, but it requires the use of a display manager.
>
> Without a "3D X server"? Not currently, but I am seeking funding to
> make that happen:
> https://github.com/VirtualGL/virtualgl/issues/10
>
> The technology is already in place, thanks to nVidia, but extensive work
> will be required to utilize it. One of the things that will be required
> is a complete overhaul of VirtualGL so that it uses FBOs rather than
> Pbuffers for off-screen rendering, and this is likely to be very painful.
>
> On 1/25/17 12:22 PM, Faiz Abidi wrote:
> > Hello VirtualGL Community,
> >
> > Quick question - anyone knows if I can get VirtualGL to work without a
> > display manager like gdm installed?
> >
> > I did try, and though I can get xlogo to work, glxgears fails with
> > "Could not open display :0".
> >
> > If anyone has any pointers on this, it would be helpful.

--
Faiz Abidi | Master's Student at Virginia Tech | www.faizabidi.com |
+1-540-998-6636
|
|
From: DRC <dco...@us...> - 2017-01-25 19:49:29
|
Without a display manager? Yes-- simply use startx to start a local
(server-side) X server on the GPU, then use `xhost +LOCAL:` to grant access
for that X server to the local machine (or share the XAuth token with other
users who need to access it.) vglserver_config essentially automates the
latter, but it requires the use of a display manager.

Without a "3D X server"? Not currently, but I am seeking funding to make
that happen:
https://github.com/VirtualGL/virtualgl/issues/10

The technology is already in place, thanks to nVidia, but extensive work
will be required to utilize it. One of the things that will be required is
a complete overhaul of VirtualGL so that it uses FBOs rather than Pbuffers
for off-screen rendering, and this is likely to be very painful.

On 1/25/17 12:22 PM, Faiz Abidi wrote:
> Hello VirtualGL Community,
>
> Quick question - anyone knows if I can get VirtualGL to work without a
> display manager like gdm installed?
>
> I did try, and though I can get xlogo to work, glxgears fails with
> "Could not open display :0".
>
> If anyone has any pointers on this, it would be helpful.
|
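[Editor's note: a minimal sketch of the startx-based flow described above; the display number and required privileges are system-dependent assumptions.]

```shell
# Start a bare X server on the GPU (no display manager involved):
startx &
# Grant all local connections access to that X server:
DISPLAY=:0 xhost +LOCAL:
# Then, from a TurboVNC or other X proxy session on the same machine:
vglrun glxgears
```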
|
From: Faiz A. <fab...@vt...> - 2017-01-25 18:22:26
|
Hello VirtualGL Community,

Quick question - does anyone know if I can get VirtualGL to work without a
display manager like gdm installed?

I did try, and though I can get xlogo to work, glxgears fails with "Could
not open display :0".

If anyone has any pointers on this, it would be helpful.

Best,
--
Faiz Abidi | Master's Student at Virginia Tech | www.faizabidi.com |
+1-540-998-6636
|
|
From: DRC <dco...@us...> - 2016-12-09 18:50:11
|
Not sure what you mean by "statically", but you can control the vglclient
port using environment variables:
https://cdn.rawgit.com/VirtualGL/virtualgl/2.5.1/doc/index.html#hd0019002

On 12/9/16 12:30 PM, Anthony Moon wrote:
> Hi all,
>
> Quick question, is it possible to statically set the TCP port that VGL
> will use for the image transport back to the client display server?
|
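[Editor's note: for reference, the control described in the linked documentation looks roughly like this; port 4242 is a made-up example.]

```shell
# On the client machine, pin the image-transport listener to a fixed port:
#   vglclient -port 4242 -detach
# On the server, point VirtualGL at that port before launching the app:
export VGL_PORT=4242
echo "$VGL_PORT"   # prints 4242
```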
|
From: Anthony M. <ant...@mo...> - 2016-12-09 18:31:04
|
Hi all,

Quick question: is it possible to statically set the TCP port that VGL will
use for the image transport back to the client display server?

Thanks
|
|
From: DRC <dco...@us...> - 2016-12-01 18:40:11
|
I'm not very familiar with Steam in-home streaming, but my guess, based on
the advertising materials for it, is that it's an application-specific
screen scraping solution. In other words, a copy of Steam runs on one PC
and sends a copy of the graphics it's rendering on that PC's physical
display to another copy of Steam running on another PC. I don't know the
technical details of how they do that, but it is almost certainly a
single-user solution.

VirtualGL, on the other hand, is designed to be a multi-user solution, and
it is not application-specific. With TurboVNC or another X proxy, you can
create multiple virtual desktop instances on a single Linux/Unix machine.
Then you use VirtualGL to launch any 3D application in that virtual desktop
session. VirtualGL modifies the GLX and OpenGL commands from any arbitrary
application such that 3D rendering is redirected into off-screen buffers on
the GPU, thus allowing multiple applications and multiple users to share
the same GPU, and thus allowing OpenGL applications to run with 3D hardware
acceleration in the X proxy session (which ordinarily wouldn't be
possible-- virtual desktops normally can only support software OpenGL.)

On 11/29/16 5:35 AM, Marco Müller wrote:
> Hello,
> I tried both virtualgl and steam in-home streaming, and for me the steam
> solution works very well:
> - in KWin all compiz effects are activated, incl. wobbly windows, and it
>   is very smooth
> - in Kodi I can activate nvidia hardware-accelerated video
> - in Steam I can stream games like BioShock Infinite over WLAN without
>   problems
> What's the difference between virtualgl and steam streaming? I am not a
> pro, can you explain it? Is there a multiuser terminal server for Ubuntu
> that uses this steam technique? I can't find any on the net.
>
> regards marco
>
> my hardware:
> server: i5 2600, Nvidia GeForce 550 Ti
> client: Aspire One Cloudbook, Celeron N3050
|
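[Editor's note: the multi-user workflow described above amounts to two commands per user. A sketch; the path is the TurboVNC default install location.]

```shell
# Each user starts a private virtual desktop on the shared machine:
/opt/TurboVNC/bin/vncserver
# ...then runs any OpenGL application inside it with GPU acceleration:
vglrun glxgears
```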
|
From: Marco M. <use...@ya...> - 2016-12-01 08:09:00
|
Hello,

I tried both virtualgl and steam in-home streaming, and for me the steam
solution works very well:
- in KWin all compiz effects are activated, incl. wobbly windows, and it is
  very smooth
- in Kodi I can activate nvidia hardware-accelerated video
- in Steam I can stream games like BioShock Infinite over WLAN without
  problems

What's the difference between virtualgl and steam streaming? I am not a
pro, can you explain it? Is there a multiuser terminal server for Ubuntu
that uses this steam technique? I can't find any on the net.

regards marco

my hardware:
server: i5 2600, Nvidia GeForce 550 Ti
client: Aspire One Cloudbook, Celeron N3050
|
|
From: Marco M. <use...@ya...> - 2016-11-29 11:35:51
|
Hello,

I tried both virtualgl and steam in-home streaming, and for me the steam
solution works very well:
- in KWin all compiz effects are activated, incl. wobbly windows, and it is
  very smooth
- in Kodi I can activate nvidia hardware-accelerated video
- in Steam I can stream games like BioShock Infinite over WLAN without
  problems

What's the difference between virtualgl and steam streaming? I am not a
pro, can you explain it? Is there a multiuser terminal server for Ubuntu
that uses this steam technique? I can't find any on the net.

regards marco

my hardware:
server: i5 2600, Nvidia GeForce 550 Ti
client: Aspire One Cloudbook, Celeron N3050
|
|
From: Murat M. <mur...@ou...> - 2016-11-12 18:08:03
|
Hi,

We have a bunch of CentOS 7.2 workstations with GeForce cards in them. I
configured VNC + VGL on them and they work fine (both through VNC sessions
and via vglconnect), so all is good there. These are not headless, but
used interactively.

I have a different workstation with two GPUs in it: one GPU is an NVS card,
to which the monitor is attached, and there is a headless GeForce. There
is no X running on the NVS. I installed the nvidia drivers and VNC, and
configured VGL. But when I connect through VNC and run

vglrun +v glxinfo

I get

Xlib: extension GLX missing on display :0
error Couldn't find RBG GLX visual or fbconfig.

Can someone provide some pointers on how to do the configuration
appropriately? This is a test bed for a server that will have an onboard
graphics card and a headless GPU (or two) to serve users 3D applications.

cat /etc/X11/xorg.conf:

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig: version 370.23 (buildmeister@swio-display-x86-rhel47-02)
#   Mon Aug 8 18:33:47 PDT 2016

Section "DRI"
    Mode 0666
EndSection

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/input/mice"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GT 740"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "ConnectedMonitor" "DFP"
    SubSection     "Display"
        Virtual     1280 1024
        Depth       24
    EndSubSection
EndSection

This is the VGA output of lspci:

03:00.0 VGA compatible controller: NVIDIA Corporation GK107 [GeForce GT 740] (rev a1)
04:00.0 VGA compatible controller: NVIDIA Corporation GT218 [NVS 300] (rev a2)

This is the VNC log:

Xvnc TigerVNC 1.3.1 - built Mar 31 2016 16:34:53
Copyright (C) 1999-2011 TigerVNC Team and many others (see README.txt)
See http://www.tigervnc.org for information on TigerVNC.
Underlying X server release 11702000, The X.Org Foundation

Fri Nov 11 23:13:55 2016
 vncext: VNC extension running!
 vncext: Listening for VNC connections on all interface(s), port 5900
 vncext: created VNC server for screen 0
(xfce4-session:2262): xfce4-session-WARNING **: xfsm_manager_load_session:
Something wrong with
/root/.cache/sessions/xfce4-session-EWRGSRF27.childrens.sea.kids:0,
Does it exist? Permissions issue?
(xfsettingsd:2288): xfsettingsd-WARNING **: Failed to get the
_NET_NUMBER_OF_DESKTOPS property.

Fri Nov 11 23:14:08 2016
 Connections: accepted: 10.40.84.60::48391
 SConnection: Client needs protocol version 3.8
 SConnection: Client requests security type VncAuth(2)

Fri Nov 11 23:14:11 2016
 SConnection: AuthFailureException: Authentication failure
 Connections: closed: 10.40.84.60::48391 (Authentication failure)

Fri Nov 11 23:14:18 2016
 Connections: accepted: 10.40.84.60::48469
 SConnection: Client needs protocol version 3.8
 SConnection: Client requests security type VncAuth(2)

Fri Nov 11 23:14:21 2016
 VNCSConnST: Server default pixel format depth 24 (32bpp) little-endian rgb888
 VNCSConnST: Client pixel format depth 24 (32bpp) little-endian rgb888
|
|
From: DRC <dco...@us...> - 2016-10-29 15:33:36
|
We are now spinning continuous "official" pre-release builds from the
VirtualGL master (2.5.x) branch and the TurboVNC master (2.1.x) and dev
(2.2.x) branches, using Travis and AppVeyor. Refer to:

http://www.virtualgl.org/DeveloperInfo/PreReleases

and

http://www.turbovnc.org/DeveloperInfo/PreReleases

for more information.

There are not many notable changes in the TurboVNC dev branch yet, but the
TurboVNC dev pre-release builds incorporate the evolving version of
libjpeg-turbo 1.6, so they include the AVX2 SIMD work that has been done
thus far (which should make those builds significantly faster on recent
CPUs.)

DRC
|
|
From: DRC <dco...@us...> - 2016-10-06 12:48:57
|
I'm not sure which version of GDM introduced this bug, but it was
definitely recent, so that makes sense. Glad we were able to resolve it.

> On Oct 6, 2016, at 5:10 AM, Wolfgang Verestek <wol...@im...> wrote:
>
> That was the hint. Switching to lightdm resolved my issues. Btw. in
> 14.04 with gdm3 it was working properly.
>
> Thanks a lot
>
> -----Original Message-----
> From: DRC [mailto:dco...@us...]
> Sent: Wednesday, 5 October 2016 17:30
> To: vir...@li...
> Subject: Re: [VirtualGL-Users] Ubuntu 16.04 + VNC + VirtualGL: No
> protocol specified / unable to open display :0
>
> In vglserver_config, there is an option to grant 3D X server access to
> members of the vglusers group or to all users of the system. If you
> elect to grant access only to members of vglusers, then it will modify
> the display manager startup script to run vglgenkey, which generates
> vgl_xauth_key.
>
> However, I'm noticing that you're using GDM rather than LightDM. There
> is a known issue that prevents /etc/gdm/Init/Default from being executed
> by certain versions of GDM 3:
>
> https://bugzilla.redhat.com/show_bug.cgi?id=851769
>
> That issue is known to affect Fedora 22 and later, and it might be
> affecting the version of GDM shipped with Ubuntu 16.04. I've never
> actually tested GDM 3 on Ubuntu 16, because I can't even get it to start
> properly. LightDM is the only version that works on my system.
>
> Try modifying /etc/gdm3/Init/Default and add:
>
> echo test >/tmp/test
>
> to the top. Restart GDM. If /tmp/test doesn't get created, then you
> are likely running into the aforementioned issue, and the only known
> workaround is to use LightDM.
>
>> On 10/5/16 2:50 AM, Wolfgang Verestek wrote:
>> Thanks for the reply. The link solved some issues with the nvidia driver.
>> But the original problem remains: I'm not able to connect to display :0.
>>
>> [VGL] Shared memory segment ID for vglconfig: 5734445
>> [VGL] VirtualGL v2.5.1 64-bit (Build 20161001)
>> [VGL] Opening connection to 3D X server 0
>> [VGL] ERROR: Could not open display 0.
>>
>> On the other hand, if I start VNC and use "startx" in a terminal to
>> start a new X server, I can open a 3D application with "vglrun -d :1
>> application".
>>
>> wolfgang@WS-Mikro:~$ startx
>> X.Org X Server 1.18.3
>> Release Date: 2016-04-04
>> X Protocol Version 11, Revision 0
>> Build Operating System: Linux 3.13.0-92-generic x86_64 Ubuntu
>> Current Operating System: Linux WS-Mikro 4.4.0-38-generic #57-Ubuntu
>> SMP Tue Sep 6 15:42:33 UTC 2016 x86_64
>> Kernel command line: BOOT_IMAGE=/boot/vmlinuz-4.4.0-38-generic
>> root=UUID=0aed3666-5f62-49e1-b57c-7a64d6318dd1 ro splash quiet nomodeset
>> Build Date: 22 July 2016 07:50:34AM
>> xorg-server 2:1.18.3-1ubuntu2.3 (For technical support please see
>> http://www.ubuntu.com/support)
>> Current version of pixman: 0.33.6
>> Before reporting problems, check http://wiki.x.org
>> to make sure that you have the latest version.
>> Markers: (--) probed, (**) from config file, (==) default setting,
>> (++) from command line, (!!) notice, (II) informational,
>> (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
>> (==) Log file: "/var/log/Xorg.1.log", Time: Wed Oct 5 09:24:17 2016
>> (==) Using system config directory "/usr/share/X11/xorg.conf.d"
>> The XKEYBOARD keymap compiler (xkbcomp) reports:
>>> Warning: Type "ONE_LEVEL" has 1 levels, but <RALT> has 2 symbols
>>> Ignoring extra symbols
>> Errors from xkbcomp are not fatal to the X server
>>
>> wolfgang@WS-Mikro:~$ vglrun -d :1 /opt/ovito-2.7.0-x86_64/bin/ovito
>> QXcbConnection: Failed to initialize XRandr
>> Qt: XKEYBOARD extension not present on the X server.
>>
>> So to me this seems to be more a problem of permissions and rights.
>> For this I found that /etc/opt/VirtualGL/vgl_xauth_key is not present.
>> Runnning vglgenkey results in >> >> wolfgang@WS-Mikro:~$ /opt/VirtualGL/bin/vglgenkey >> xauth: timeout in locking authority file >> /etc/opt/VirtualGL/vgl_xauth_key >> xauth: timeout in locking authority file >> /etc/opt/VirtualGL/vgl_xauth_key >> chmod: cannot access '/etc/opt/VirtualGL/vgl_xauth_key': No such file >> or directory >> >> Resp. >> >> wolfgang@WS-Mikro:~$ sudo /opt/VirtualGL/bin/vglgenkey >> xauth: file /etc/opt/VirtualGL/vgl_xauth_key does not exist >> xauth: (argv):1: couldn't query Security extension on display ":4" >> xauth: file /etc/opt/VirtualGL/vgl_xauth_key does not exist >> xauth: (argv):1: bad "add" command line >> chmod: cannot access '/etc/opt/VirtualGL/vgl_xauth_key': No such file >> or directory >> >> In /etc/gdm3/Init/Default there is a line with "xhost +local" but >> vglgenkey is not called. So the question I'm asking myself right now >> is where and when should vglgenkey be called? >> >> -----Ursprüngliche Nachricht----- >> Von: DRC [mailto:dco...@us...] >> Gesendet: Freitag, 30. September 2016 18:46 >> An: vir...@li... >> Betreff: Re: [VirtualGL-Users] Ubuntu 16.04 + VNC + VirtualGL: No >> protocol specified /unable to open display :0 >> >> I've successfully used Ubuntu 16.04 with the distribution-supplied >> nVidia driver package and VirtualGL 2.5, so I know that that >> configuration works properly. It seems that the root of the problem >> is that your 3D X server (display :0) isn't starting, which has >> nothing to do with VirtualGL. I assume that you are unable to log into the machine locally? >> >> My best advice would be to try re-installing the nVidia driver package >> and rebooting. Also double-check that the nouveau driver isn't >> installed, as that can cause conflicts with the proprietary nVidia >> driver. 
It seems like you might be running into the same problem described here: >> https://devtalk.nvidia.com/default/topic/951741/linux/locked-out-of-xw >> indows -login-loop-after-driver-upgrade-ubuntu-16-04/, >> in which case you should try the troubleshooting advice from that thread. >> >> >>> On 9/30/16 6:58 AM, Wolfgang Verestek wrote: >>> >>> >>> Dear VirtualGL users, >>> >>> I need some help with 3D visualization. I'm using a workstation for >>> visualization of some simulation data. For this we were using Ubuntu >>> 14.04 + VNC and VirtualGL for 3D applications (gnome, xfce4 as >>> Desktop >> environment). >>> Everything was working, but after upgrading to 16.04 I just cannot >>> get VirtualGL running properly utilizing the Nvidia GPU (Titan Z). >>> >>> wolfgang@WS-Mikro:~$ lspci | grep VGA >>> 07:00.0 VGA compatible controller: ASPEED Technology, Inc. ASPEED >>> Graphics Family (rev 30) >>> 83:00.0 VGA compatible controller: NVIDIA Corporation GK110B [GeForce >>> GTX TITAN Z] (rev a1) >>> >>> wolfgang@WS-Mikro:~$ lspci | grep NVIDIA >>> 83:00.0 VGA compatible controller: NVIDIA Corporation GK110B [GeForce >>> GTX TITAN Z] (rev a1) >>> 83:00.1 Audio device: NVIDIA Corporation GK110 HDMI Audio (rev a1) >>> 84:00.0 3D controller: NVIDIA Corporation GK110B [GeForce GTX TITAN >>> Z] (rev >>> a1) >>> >>> wolfgang@WS-Mikro:~$ vglrun +v glxgears [VGL] Shared memory segment >>> ID for vglconfig: 950279 [VGL] VirtualGL v2.5 64-bit (Build 20160215) >>> [VGL] Opening connection to 3D X server :0 No protocol specified >>> [VGL] >>> ERROR: Could not open display :0. >>> >>> >>> in /etc/log/Xorg.0.log i get some error messages, but to be honest i >>> have no idea how to resolve this issue... 
>>> wolfgang@WS-Mikro:~$ tail /var/log/Xorg.0.log >>> [ 251.081] (WW) Disabling Mouse0 >>> [ 251.081] (WW) Disabling Keyboard0 >>> [ 251.081] (EE) [drm] Failed to open DRM device for (null): -22 >>> [ 251.081] (EE) [drm] Failed to open DRM device for (null): -22 >>> [ 251.081] (EE) [drm] Failed to open DRM device for (null): -22 >>> [ 251.082] (EE) [drm] Failed to open DRM device for (null): -22 >>> [ 251.082] (EE) [drm] Failed to open DRM device for pci:0000:83:00.0: >> -22 >>> [ 251.082] (EE) [drm] Failed to open DRM device for pci:0000:84:00.0: >> -22 >>> [ 251.082] Number of created screens does not match number of detected >>> devices. >>> Configuration failed. >>> VNC-Screen: >>> wolfgang@WS-Mikro:~$ tail /var/log/Xorg.4.log >>> [ 10.046] xorg-server 2:1.15.1-0ubuntu2.6 (For technical support please >>> see http://www.ubuntu.com/support) >>> [ 10.046] Current version of pixman: 0.30.2 >>> [ 10.046] Before reporting problems, check http://wiki.x.org >>> to make sure that you have the latest version. >>> [ 10.046] Markers: (--) probed, (**) from config file, (==) default >>> setting, >>> (++) from command line, (!!) notice, (II) informational, >>> (WW) warning, (EE) error, (NI) not implemented, (??) unknown. >>> [ 10.046] (==) Log file: "/var/log/Xorg.4.log", Time: Mon Mar 2 >> 10:46:31 >>> 2015 >>> [ 10.046] (==) Using config file: "/etc/X11/xorg.conf" >>> [ 10.046] (==) Using system config directory >> "/usr/share/X11/xorg.conf.d" >>> >>> >>> X server is running, also e.g. xclock works on the VNC screen. 
>>> wolfgang@WS-Mikro:~$ w >>> 16:02:51 up 30 min, 2 users, load average: 2,16, 2,35, 1,67 >>> USER TTY FROM LOGIN@ IDLE JCPU PCPU WHAT >>> dummy_us tty7 :0 15:32 30:42 8:57 0.09s >>> update-notifier >>> wolfgang pts/0 10.132.2.119 15:36 5:27 7.77s 0.23s -bash >>> >>> wolfgang@WS-Mikro:~$ nvidia-smi >>> Wed Sep 28 16:02:54 2016 >>> +------------------------------------------------------+ >>> | NVIDIA-SMI 361.42 Driver Version: 361.42 | >>> |-------------------------------+----------------------+------------- >>> |-------------------------------+----------------------+- >>> |-------------------------------+----------------------+------ >>> --+ >>> | GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. >>> ECC | >>> | Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute >>> M. | >>> |===============================+======================+============= >>> |= >>> |====== >>> ==| >>> | 0 GeForce GTX TIT... Off | 0000:83:00.0 On | >>> N/A | >>> | 36% 50C P8 28W / 189W | 95MiB / 6143MiB | 0% >>> Default | >>> +-------------------------------+----------------------+------------- >>> +-------------------------------+----------------------+- >>> +-------------------------------+----------------------+------ >>> --+ >>> | 1 GeForce GTX TIT... 
Off | 0000:84:00.0 Off | >>> N/A | >>> | 33% 47C P8 33W / 189W | 15MiB / 6143MiB | 0% >>> Default | >>> +-------------------------------+----------------------+------------- >>> +-------------------------------+----------------------+- >>> +-------------------------------+----------------------+------ >>> --+ >>> >>> +-------------------------------------------------------------------- >>> +- >>> +------ >>> --+ >>> | Processes: GPU >>> Memory | >>> | GPU PID Type Process name Usage >>> | >>> |==================================================================== >>> |= >>> |====== >>> ==| >>> | 0 2968 G /usr/lib/xorg/Xorg >>> 33MiB | >>> | 0 3262 G /usr/bin/gnome-shell >>> 43MiB | >>> +-------------------------------------------------------------------- >>> +- >>> +------ >>> --+ >>> >>> xorg.conf (which was working on 14.04 LTS): >>> Code: >>> >>> Section "DRI" >>> Mode 0666 >>> EndSection >>> >>> Section "ServerLayout" >>> Identifier "Layout0" >>> Screen 0 "Screen0" 0 0 >>> Screen 1 "Screen1" RightOf "Screen0" >>> InputDevice "Keyboard0" "CoreKeyboard" >>> InputDevice "Mouse0" "CorePointer" >>> Option "Xinerama" "0" >>> EndSection >>> >>> Section "Files" >>> EndSection >>> >>> Section "InputDevice" >>> >>> # generated from default >>> Identifier "Mouse0" >>> Driver "mouse" >>> Option "Protocol" "auto" >>> Option "Device" "/dev/psaux" >>> Option "Emulate3Buttons" "no" >>> Option "ZAxisMapping" "4 5" >>> EndSection >>> >>> Section "InputDevice" >>> >>> # generated from default >>> Identifier "Keyboard0" >>> Driver "kbd" >>> EndSection >>> >>> Section "Monitor" >>> Identifier "Monitor0" >>> VendorName "Unknown" >>> ModelName "BenQ G2200W" >>> HorizSync 31.0 - 83.0 >>> VertRefresh 55.0 - 76.0 >>> Option "DPMS" >>> EndSection >>> >>> Section "Monitor" >>> Identifier "Monitor1" >>> VendorName "Unknown" >>> ModelName "Dell DEL 1908FPBLK" >>> HorizSync 30.0 - 81.0 >>> VertRefresh 56.0 - 76.0 >>> EndSection >>> >>> Section "Device" >>> Identifier "Device0" >>> Driver 
"nvidia" >>> VendorName "NVIDIA Corporation" >>> BoardName "GeForce GTX TITAN Z" >>> BusID "PCI:83:0:0" >>> Screen 0 >>> EndSection >>> >>> Section "Device" >>> Identifier "Device1" >>> Driver "nvidia" >>> VendorName "NVIDIA Corporation" >>> BoardName "GeForce GTX TITAN Z" >>> BusID "PCI:84:0:0" >>> Screen 1 >>> EndSection >>> >>> Section "Screen" >>> Identifier "Screen0" >>> Device "Device0" >>> Monitor "Monitor0" >>> DefaultDepth 24 >>> Option "Stereo" "0" >>> Option "nvidiaXineramaInfoOrder" "DFP-0" >>> Option "metamodes" "DVI-I-1: nvidia-auto-select +0+0" >>> Option "SLI" "Off" >>> Option "MultiGPU" "On" >>> Option "BaseMosaic" "off" >>> Option "AllowGLXWithComposite" "true" >>> SubSection "Display" >>> Depth 24 >>> EndSubSection >>> EndSection >>> >>> Section "Screen" >>> Identifier "Screen1" >>> Device "Device1" >>> Monitor "Monitor1" >>> DefaultDepth 24 >>> Option "Stereo" "0" >>> Option "metamodes" "DVI-D-0: nvidia-auto-select +0+0" >>> Option "SLI" "Off" >>> Option "MultiGPU" "On" >>> Option "BaseMosaic" "off" >>> Option "AllowGLXWithComposite" "true" >>> SubSection "Display" >>> Depth 24 >>> EndSubSection >>> EndSection >>> >>> I hope somebody can give me a hint how to resolve that issue. If some >>> additional info is needed let me know. >>> >>> >>> Best regards >>> Wolfgang >> >> ---------------------------------------------------------------------- >> ------ >> -- >> _______________________________________________ >> VirtualGL-Users mailing list >> Vir...@li... >> https://lists.sourceforge.net/lists/listinfo/virtualgl-users >> >> >> ---------------------------------------------------------------------- >> -------- Check out the vibrant tech community on one of the world's >> most engaging tech sites, SlashDot.org! http://sdm.link/slashdot >> _______________________________________________ >> VirtualGL-Users mailing list >> Vir...@li... 
>> https://lists.sourceforge.net/lists/listinfo/virtualgl-users |
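DRC's GDM 3 check from the message above can be wrapped in a small script. This is a sketch, not part of VirtualGL: the marker path `/tmp/gdm_init_test` is an example, and it assumes (per the thread) that `/etc/gdm3/Init/Default` is the init script in question.

```shell
#!/bin/sh
# Sketch of DRC's test: after adding "touch /tmp/gdm_init_test" to the top
# of /etc/gdm3/Init/Default and restarting GDM, run this to see whether the
# Init script was actually executed.  The marker path is an example.
gdm_init_ran() {
    # Returns success if the marker file exists (script ran), failure if not.
    [ -e "${1:-/tmp/gdm_init_test}" ]
}

if gdm_init_ran; then
    echo "Init script ran, so vglgenkey should have run too"
else
    echo "Init script did NOT run: likely the GDM 3 bug; switch to LightDM"
fi
```

If the marker never appears after a GDM restart, the thread's conclusion applies: the display manager is not executing its Init script, so vglgenkey never runs, and switching to LightDM is the known workaround.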
|
From: Wolfgang V. <wol...@im...> - 2016-10-06 10:11:03
|
That was the hint. Switching to lightdm resolved my issues. Btw. in 14.04 with gdm3 it was working properly.

Thanks a lot |
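Wolfgang's earlier messages showed that `/etc/opt/VirtualGL/vgl_xauth_key` was missing, which is the root cause of the "No protocol specified" error. A quick precondition check before launching `vglrun` can be scripted like this (a sketch; the path is the one vglserver_config uses according to the thread):

```shell
#!/bin/sh
# Check for the VirtualGL xauth key file that vglgenkey is supposed to create.
# Pass an alternate path as $1 for testing; the default is the standard
# location used by vglserver_config.
check_vgl_key() {
    KEYFILE=${1:-/etc/opt/VirtualGL/vgl_xauth_key}
    if [ -r "$KEYFILE" ]; then
        echo "OK: $KEYFILE exists and is readable"
    else
        echo "MISSING: $KEYFILE -- the display manager did not run vglgenkey"
    fi
}

check_vgl_key "$@"
```

A "MISSING" result points at the display-manager side (the GDM 3 bug discussed above) rather than at VirtualGL itself.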
|
From: DRC <dco...@us...> - 2016-10-05 15:29:57
|
In vglserver_config, there is an option to grant 3D X server access to members of the vglusers group or to all users of the system. If you elect to grant access only to members of vglusers, then it will modify the display manager startup script to run vglgenkey, which generates vgl_xauth_key.

However, I'm noticing that you're using GDM rather than LightDM. There is a known issue that prevents /etc/gdm/Init/Default from being executed by certain versions of GDM 3:

https://bugzilla.redhat.com/show_bug.cgi?id=851769

That issue is known to affect Fedora 22 and later, and it might be affecting the version of GDM shipped with Ubuntu 16.04. I've never actually tested GDM 3 on Ubuntu 16, because I can't even get it to start properly. LightDM is the only display manager that works on my system.

Try modifying /etc/gdm3/Init/Default and adding:

echo test >/tmp/test

to the top. Restart GDM. If /tmp/test doesn't get created, then you are likely running into the aforementioned issue, and the only known workaround is to use LightDM.
"Device1" >> Driver "nvidia" >> VendorName "NVIDIA Corporation" >> BoardName "GeForce GTX TITAN Z" >> BusID "PCI:84:0:0" >> Screen 1 >> EndSection >> >> Section "Screen" >> Identifier "Screen0" >> Device "Device0" >> Monitor "Monitor0" >> DefaultDepth 24 >> Option "Stereo" "0" >> Option "nvidiaXineramaInfoOrder" "DFP-0" >> Option "metamodes" "DVI-I-1: nvidia-auto-select +0+0" >> Option "SLI" "Off" >> Option "MultiGPU" "On" >> Option "BaseMosaic" "off" >> Option "AllowGLXWithComposite" "true" >> SubSection "Display" >> Depth 24 >> EndSubSection >> EndSection >> >> Section "Screen" >> Identifier "Screen1" >> Device "Device1" >> Monitor "Monitor1" >> DefaultDepth 24 >> Option "Stereo" "0" >> Option "metamodes" "DVI-D-0: nvidia-auto-select +0+0" >> Option "SLI" "Off" >> Option "MultiGPU" "On" >> Option "BaseMosaic" "off" >> Option "AllowGLXWithComposite" "true" >> SubSection "Display" >> Depth 24 >> EndSubSection >> EndSection >> >> I hope somebody can give me a hint how to resolve that issue. If some >> additional info is needed let me know. >> >> >> Best regards >> Wolfgang > > ---------------------------------------------------------------------------- > -- > _______________________________________________ > VirtualGL-Users mailing list > Vir...@li... > https://lists.sourceforge.net/lists/listinfo/virtualgl-users > > > ------------------------------------------------------------------------------ > Check out the vibrant tech community on one of the world's most > engaging tech sites, SlashDot.org! http://sdm.link/slashdot > _______________________________________________ > VirtualGL-Users mailing list > Vir...@li... > https://lists.sourceforge.net/lists/listinfo/virtualgl-users > |
|
From: Wolfgang V. <wol...@im...> - 2016-10-05 07:50:32
|
Thanks for the reply. The link solved some issues with the nvidia driver, but the original problem remains: I'm not able to connect to display :0.

[VGL] Shared memory segment ID for vglconfig: 5734445
[VGL] VirtualGL v2.5.1 64-bit (Build 20161001)
[VGL] Opening connection to 3D X server 0
[VGL] ERROR: Could not open display 0.

On the other hand, if I start VNC and use "startx" in a terminal to start a new X server, I can open a 3D application with "vglrun -d :1 application".

wolfgang@WS-Mikro:~$ startx
X.Org X Server 1.18.3
Release Date: 2016-04-04
X Protocol Version 11, Revision 0
Build Operating System: Linux 3.13.0-92-generic x86_64 Ubuntu
Current Operating System: Linux WS-Mikro 4.4.0-38-generic #57-Ubuntu SMP Tue Sep 6 15:42:33 UTC 2016 x86_64
Kernel command line: BOOT_IMAGE=/boot/vmlinuz-4.4.0-38-generic root=UUID=0aed3666-5f62-49e1-b57c-7a64d6318dd1 ro splash quiet nomodeset
Build Date: 22 July 2016 07:50:34AM
xorg-server 2:1.18.3-1ubuntu2.3 (For technical support please see http://www.ubuntu.com/support)
Current version of pixman: 0.33.6
Before reporting problems, check http://wiki.x.org to make sure that you have the latest version.
Markers: (--) probed, (**) from config file, (==) default setting, (++) from command line, (!!) notice, (II) informational, (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(==) Log file: "/var/log/Xorg.1.log", Time: Wed Oct 5 09:24:17 2016
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
The XKEYBOARD keymap compiler (xkbcomp) reports:
> Warning: Type "ONE_LEVEL" has 1 levels, but <RALT> has 2 symbols
> Ignoring extra symbols
Errors from xkbcomp are not fatal to the X server

wolfgang@WS-Mikro:~$ vglrun -d :1 /opt/ovito-2.7.0-x86_64/bin/ovito
QXcbConnection: Failed to initialize XRandr
Qt: XKEYBOARD extension not present on the X server.

So to me this seems to be more a problem of permissions and rights. I also found that /etc/opt/VirtualGL/vgl_xauth_key is not present.
Running vglgenkey results in:

wolfgang@WS-Mikro:~$ /opt/VirtualGL/bin/vglgenkey
xauth: timeout in locking authority file /etc/opt/VirtualGL/vgl_xauth_key
xauth: timeout in locking authority file /etc/opt/VirtualGL/vgl_xauth_key
chmod: cannot access '/etc/opt/VirtualGL/vgl_xauth_key': No such file or directory

and with sudo:

wolfgang@WS-Mikro:~$ sudo /opt/VirtualGL/bin/vglgenkey
xauth: file /etc/opt/VirtualGL/vgl_xauth_key does not exist
xauth: (argv):1: couldn't query Security extension on display ":4"
xauth: file /etc/opt/VirtualGL/vgl_xauth_key does not exist
xauth: (argv):1: bad "add" command line
chmod: cannot access '/etc/opt/VirtualGL/vgl_xauth_key': No such file or directory

In /etc/gdm3/Init/Default there is a line with "xhost +local", but vglgenkey is not called. So the question I'm asking myself right now is: where and when should vglgenkey be called?

-----Original Message-----
From: DRC [mailto:dco...@us...]
Sent: Friday, 30 September 2016 18:46
To: vir...@li...
Subject: Re: [VirtualGL-Users] Ubuntu 16.04 + VNC + VirtualGL: No protocol specified / unable to open display :0

I've successfully used Ubuntu 16.04 with the distribution-supplied nVidia driver package and VirtualGL 2.5, so I know that that configuration works properly. It seems that the root of the problem is that your 3D X server (display :0) isn't starting, which has nothing to do with VirtualGL. I assume that you are unable to log into the machine locally?

My best advice would be to try re-installing the nVidia driver package and rebooting. Also double-check that the nouveau driver isn't installed, as that can cause conflicts with the proprietary nVidia driver. It seems like you might be running into the same problem described here: https://devtalk.nvidia.com/default/topic/951741/linux/locked-out-of-xwindows-login-loop-after-driver-upgrade-ubuntu-16-04/, in which case you should try the troubleshooting advice from that thread.
On 9/30/16 6:58 AM, Wolfgang Verestek wrote:
> [original message trimmed; it is quoted in full below]

------------------------------------------------------------------------------
_______________________________________________
VirtualGL-Users mailing list
Vir...@li...
https://lists.sourceforge.net/lists/listinfo/virtualgl-users
|
|
From: DRC <dco...@us...> - 2016-10-01 18:25:41
|
Official binaries and source tarball are here: https://sourceforge.net/projects/virtualgl/files/2.5.1/ Change log is here: https://github.com/VirtualGL/virtualgl/releases/tag/2.5.1 |
|
From: DRC <dco...@us...> - 2016-09-30 18:53:29
|
Unfortunately, it appears that AMD's implementation of OpenCL relies upon a special X extension (ATIFGLEXTENSION) in order to access the GPU. This is similar to the NV-CONTROL extension provided by the nVidia drivers. TurboVNC 2.1 includes limited support for redirecting NV-CONTROL requests from the X proxy to the 3D X server, so assuming the spec for ATIFGLEXTENSION is open, it should be possible to provide a similar sort of redirector for that extension. Please contact me off-list to discuss specifics, if you are interested in pursuing that project. Otherwise, without the ability to access the GPU, there isn't much point to running OpenCL applications in VirtualGL (unless the application in question also needs OpenGL.)

Apparently someone else encountered the same issue: https://software.intel.com/en-us/forums/opencl/topic/328091

I've confirmed that adding the Intel OpenCL path to LD_LIBRARY_PATH:

vglrun -ld /opt/intel/opencl/lib64 clinfo

works around the issue. If you are running an application that links directly with libGL (as opposed to indirectly loading libGL with dlopen()/dlsym()), then you can also work around the issue by running:

vglrun -nodl clinfo

And, of course, if your application doesn't use OpenGL, then there is no need to use vglrun at all.

I'd be happy to fix the segfault if I had a better understanding of what's causing it. The poster on the Intel forum seemed to think that VirtualGL's interposed version of dlopen() breaks rpath when it is set to $ORIGIN, but I noticed that the rpath within Intel's OpenCL libraries is set to "$ORIGIN::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::". That seems incorrect, so this may be a bug in their libraries that is, for whatever reason, only exposed when using VirtualGL.

On 6/27/16 10:23 AM, Jeff McWilliams wrote:
>
> Can one use OpenCL when running under VirtualGL?
> > > > I noticed that clinfo segfaults if I invoke it with: > > > > “vglrun clinfo” > > > > I’m using the latest VirtualGL 2.5, CentOS 6.7 64-bit, AMD FirePro > W8000 graphics. > > > > Core was generated by `clinfo'. > > Program terminated with signal SIGSEGV, Segmentation fault. > > #0 0x00007f6e9fd77fcd in ?? () from > /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so > > Missing separate debuginfos, use: debuginfo-install > VirtualGL-2.5-20160215.x86_64 glibc-2.12-1.166.el6_7.7.x86_64 > libX11-1.6.0-6.el6.x86_64 libXau-1.0.6-4.el6.x86_64 > libXext-1.3.2-2.1.el6.x86_64 libXv-1.0.9-2.1.el6.x86_64 > libgcc-4.4.7-16.el6.x86_64 libstdc++-4.4.7-16.el6.x86_64 > libxcb-1.9.1-3.el6.x86_64 mesa-libGL-10.4.3-1.el6.x86_64 > numactl-2.0.9-2.el6.x86_64 opencl-1.2-intel-cpu-4.4.0.117-1.x86_64 > > (gdb) bt > > #0 0x00007f6e9fd77fcd in ?? () from > /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so > > #1 0x00007f6e9fcf1e81 in clGetPlatformInfo () from > /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so > > #2 0x000000000040e8be in int > cl::detail::getInfoHelper<cl::detail::GetInfoFunctor0<int > (*)(_cl_platform_id*, unsigned int, unsigned long, void*, unsigned > long*), _cl_platform_id*> >(cl::detail::GetInfoFunctor0<int > (*)(_cl_platform_id*, unsigned int, unsigned long, void*, unsigned > long*), _cl_platform_id*>, unsigned int, std::__1::basic_string<char, > std::__1::char_traits<char>, std::__1::allocator<char> >*, long) > [clone .isra.35] [clone .constprop.213] () > > #3 0x000000000040fd57 in > cl::detail::param_traits<cl::detail::cl_platform_info, > 2308>::param_type cl::Platform::getInfo<2308>(int*) const () > > #4 0x0000000000407fc3 in main () > > > > > > If I rename /etc/OpenCL/vendors/intel64.icd I can then run “vglrun > clinfo”, and it does not crash. I get a list indicating AMD APP is > the provider, with one CPU device. The GPU device is not listed. If > I run clinfo without using vglrun, the AMD APP provider lists both a > CPU device and a GPU device. 
> > > > > > > > Thanks, > > > > Jeff McWilliams > > Software Development Manager – Altair HyperView > > > > > > > > > > ------------------------------------------------------------------------------ > Attend Shape: An AT&T Tech Expo July 15-16. Meet us at AT&T Park in San > Francisco, CA to explore cutting-edge tech and listen to tech luminaries > present their vision of the future. This family event has something for > everyone, including kids. Get more information and register today. > http://sdm.link/attshape > > > _______________________________________________ > VirtualGL-Users mailing list > Vir...@li... > https://lists.sourceforge.net/lists/listinfo/virtualgl-users |
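For anyone wanting to verify the malformed rpath DRC describes, the library's dynamic section can be inspected directly. A quick sketch; the library path is taken from the backtrace above and may differ per install:

```shell
# Print the RPATH/RUNPATH entries of Intel's OpenCL library to look for
# the suspicious "$ORIGIN:::..." value mentioned above.
readelf -d /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so \
    | grep -Ei 'rpath|runpath'
```

An rpath of `$ORIGIN` followed by dozens of empty elements is suspect because the dynamic linker treats each empty search-path element as the current working directory.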
|
From: DRC <dco...@us...> - 2016-09-30 16:46:38
|
I've successfully used Ubuntu 16.04 with the distribution-supplied nVidia driver package and VirtualGL 2.5, so I know that that configuration works properly. It seems that the root of the problem is that your 3D X server (display :0) isn't starting, which has nothing to do with VirtualGL. I assume that you are unable to log into the machine locally?

My best advice would be to try re-installing the nVidia driver package and rebooting. Also double-check that the nouveau driver isn't installed, as that can cause conflicts with the proprietary nVidia driver. It seems like you might be running into the same problem described here: https://devtalk.nvidia.com/default/topic/951741/linux/locked-out-of-xwindows-login-loop-after-driver-upgrade-ubuntu-16-04/, in which case you should try the troubleshooting advice from that thread.

On 9/30/16 6:58 AM, Wolfgang Verestek wrote:
> [original message trimmed; it is quoted in full below]
|
|
From: Wolfgang V. <wol...@im...> - 2016-09-30 11:58:27
|
Dear VirtualGL users, I need some help with 3D visualization. I'm using a workstation for visualization of some simulation data. For this we were using Ubuntu 14.04 + VNC and VirtualGL for 3D applications (gnome, xfce4 as Desktop environment). Everything was working, but after upgrading to 16.04 I just cannot get VirtualGL running properly utilizing the Nvidia GPU (Titan Z). wolfgang@WS-Mikro:~$ lspci | grep VGA 07:00.0 VGA compatible controller: ASPEED Technology, Inc. ASPEED Graphics Family (rev 30) 83:00.0 VGA compatible controller: NVIDIA Corporation GK110B [GeForce GTX TITAN Z] (rev a1) wolfgang@WS-Mikro:~$ lspci | grep NVIDIA 83:00.0 VGA compatible controller: NVIDIA Corporation GK110B [GeForce GTX TITAN Z] (rev a1) 83:00.1 Audio device: NVIDIA Corporation GK110 HDMI Audio (rev a1) 84:00.0 3D controller: NVIDIA Corporation GK110B [GeForce GTX TITAN Z] (rev a1) wolfgang@WS-Mikro:~$ vglrun +v glxgears [VGL] Shared memory segment ID for vglconfig: 950279 [VGL] VirtualGL v2.5 64-bit (Build 20160215) [VGL] Opening connection to 3D X server :0 No protocol specified [VGL] ERROR: Could not open display :0. in /etc/log/Xorg.0.log i get some error messages, but to be honest i have no idea how to resolve this issue... wolfgang@WS-Mikro:~$ tail /var/log/Xorg.0.log [ 251.081] (WW) Disabling Mouse0 [ 251.081] (WW) Disabling Keyboard0 [ 251.081] (EE) [drm] Failed to open DRM device for (null): -22 [ 251.081] (EE) [drm] Failed to open DRM device for (null): -22 [ 251.081] (EE) [drm] Failed to open DRM device for (null): -22 [ 251.082] (EE) [drm] Failed to open DRM device for (null): -22 [ 251.082] (EE) [drm] Failed to open DRM device for pci:0000:83:00.0: -22 [ 251.082] (EE) [drm] Failed to open DRM device for pci:0000:84:00.0: -22 [ 251.082] Number of created screens does not match number of detected devices. Configuration failed. 
The VNC screen:

wolfgang@WS-Mikro:~$ tail /var/log/Xorg.4.log
[    10.046] xorg-server 2:1.15.1-0ubuntu2.6 (For technical support please see http://www.ubuntu.com/support)
[    10.046] Current version of pixman: 0.30.2
[    10.046] 	Before reporting problems, check http://wiki.x.org
	to make sure that you have the latest version.
[    10.046] Markers: (--) probed, (**) from config file, (==) default setting,
	(++) from command line, (!!) notice, (II) informational,
	(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[    10.046] (==) Log file: "/var/log/Xorg.4.log", Time: Mon Mar  2 10:46:31 2015
[    10.046] (==) Using config file: "/etc/X11/xorg.conf"
[    10.046] (==) Using system config directory "/usr/share/X11/xorg.conf.d"

The X server is running, and e.g. xclock works on the VNC screen.

wolfgang@WS-Mikro:~$ w
 16:02:51 up 30 min,  2 users,  load average: 2.16, 2.35, 1.67
USER     TTY      FROM             LOGIN@   IDLE   JCPU   PCPU WHAT
dummy_us tty7     :0               15:32   30:42   8:57  0.09s update-notifier
wolfgang pts/0    10.132.2.119     15:36    5:27   7.77s  0.23s -bash

wolfgang@WS-Mikro:~$ nvidia-smi
Wed Sep 28 16:02:54 2016
+------------------------------------------------------+
| NVIDIA-SMI 361.42     Driver Version: 361.42         |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX TIT...  Off  | 0000:83:00.0      On |                  N/A |
| 36%   50C    P8    28W / 189W |     95MiB /  6143MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX TIT...  Off  | 0000:84:00.0     Off |                  N/A |
| 33%   47C    P8    33W / 189W |     15MiB /  6143MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0      2968    G   /usr/lib/xorg/Xorg                              33MiB |
|    0      3262    G   /usr/bin/gnome-shell                            43MiB |
+-----------------------------------------------------------------------------+

xorg.conf (which was working on 14.04 LTS):

Section "DRI"
    Mode 0666
EndSection

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    Screen      1  "Screen1" RightOf "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "BenQ G2200W"
    HorizSync       31.0 - 83.0
    VertRefresh     55.0 - 76.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Unknown"
    ModelName      "Dell DEL 1908FPBLK"
    HorizSync       30.0 - 81.0
    VertRefresh     56.0 - 76.0
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX TITAN Z"
    BusID          "PCI:83:0:0"
    Screen          0
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX TITAN Z"
    BusID          "PCI:84:0:0"
    Screen          1
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "DFP-0"
    Option         "metamodes" "DVI-I-1: nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "On"
    Option         "BaseMosaic" "off"
    Option         "AllowGLXWithComposite" "true"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "metamodes" "DVI-D-0: nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "On"
    Option         "BaseMosaic" "off"
    Option         "AllowGLXWithComposite" "true"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

I hope somebody can give me a hint how to resolve this issue. If additional
info is needed, let me know.

Best regards,
Wolfgang
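An editorial aside for readers with headless boards (as in the question that opened this page): an X screen can usually be started on a GPU with no monitor attached, which is the typical VirtualGL server configuration. The following Device section is a hedged sketch, not the poster's configuration: `UseDisplayDevice "None"` and `AllowEmptyInitialConfiguration` are documented NVIDIA X driver options, but which of the two applies depends on the driver version, and the BusID must be taken from your own `lspci` output (decimal `PCI:bus:device:function`).

```
Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    # BusID from lspci (here 83:00.0 -> PCI:83:0:0); verify on your system
    BusID          "PCI:83:0:0"
    # Older drivers: start without probing display outputs
    Option         "UseDisplayDevice" "None"
    # Newer drivers: allow the X screen to start with no display attached
    Option         "AllowEmptyInitialConfiguration" "true"
EndSection
```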
From: DRC <dco...@us...> - 2016-09-29 21:55:30
I can reproduce the issue using the AMD APP SDK v3.0 and v16.1.1 of the
Intel OpenCL runtime for Core and Xeon processors.  It's unclear,
however, what exactly VirtualGL is doing to contribute to the
application's delinquency.  I'll keep poking at it.

The problem seems to be two-fold:

(1) Why isn't the AMD GPU OpenCL device enabled when using VirtualGL?
(2) Why does it crash when the Intel provider is also installed?

These may be related issues, although it's also important to point out
that RHEL 6 and AMD CPUs are not a supported configuration for the Intel
OpenCL drivers.

On 6/27/16 10:23 AM, Jeff McWilliams wrote:
> Can one use OpenCL when running under VirtualGL?
>
> I noticed that clinfo segfaults if I invoke it with:
>
>     vglrun clinfo
>
> I'm using the latest VirtualGL 2.5, CentOS 6.7 64-bit, AMD FirePro
> W8000 graphics.
>
> Core was generated by `clinfo'.
> Program terminated with signal SIGSEGV, Segmentation fault.
> #0  0x00007f6e9fd77fcd in ?? () from
> /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so
> Missing separate debuginfos, use: debuginfo-install
> VirtualGL-2.5-20160215.x86_64 glibc-2.12-1.166.el6_7.7.x86_64
> libX11-1.6.0-6.el6.x86_64 libXau-1.0.6-4.el6.x86_64
> libXext-1.3.2-2.1.el6.x86_64 libXv-1.0.9-2.1.el6.x86_64
> libgcc-4.4.7-16.el6.x86_64 libstdc++-4.4.7-16.el6.x86_64
> libxcb-1.9.1-3.el6.x86_64 mesa-libGL-10.4.3-1.el6.x86_64
> numactl-2.0.9-2.el6.x86_64 opencl-1.2-intel-cpu-4.4.0.117-1.x86_64
>
> (gdb) bt
> #0  0x00007f6e9fd77fcd in ?? () from
> /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so
> #1  0x00007f6e9fcf1e81 in clGetPlatformInfo () from
> /opt/intel/opencl-1.2-4.4.0.117/lib64/libintelocl.so
> #2  0x000000000040e8be in int
> cl::detail::getInfoHelper<cl::detail::GetInfoFunctor0<int
> (*)(_cl_platform_id*, unsigned int, unsigned long, void*, unsigned
> long*), _cl_platform_id*> >(cl::detail::GetInfoFunctor0<int
> (*)(_cl_platform_id*, unsigned int, unsigned long, void*, unsigned
> long*), _cl_platform_id*>, unsigned int, std::__1::basic_string<char,
> std::__1::char_traits<char>, std::__1::allocator<char> >*, long)
> [clone .isra.35] [clone .constprop.213] ()
> #3  0x000000000040fd57 in
> cl::detail::param_traits<cl::detail::cl_platform_info,
> 2308>::param_type cl::Platform::getInfo<2308>(int*) const ()
> #4  0x0000000000407fc3 in main ()
>
> If I rename /etc/OpenCL/vendors/intel64.icd, I can then run "vglrun
> clinfo", and it does not crash.  I get a list indicating AMD APP is
> the provider, with one CPU device.  The GPU device is not listed.  If
> I run clinfo without vglrun, the AMD APP provider lists both a CPU
> device and a GPU device.
>
> Thanks,
>
> Jeff McWilliams
> Software Development Manager – Altair HyperView
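An editorial note on Jeff's workaround: the OpenCL ICD loader discovers providers by enumerating the `.icd` files under `/etc/OpenCL/vendors`, so renaming a file to anything not ending in `.icd` hides that provider reversibly. The sketch below demonstrates the pattern in a temporary directory so nothing system-wide is touched; on a real system the path would be `/etc/OpenCL/vendors` and the `mv` would need root.

```shell
set -e
# The ICD loader reads /etc/OpenCL/vendors/*.icd to discover providers.
# Moving a file aside (rather than deleting it) disables it reversibly.
vendors=$(mktemp -d)
touch "$vendors/intel64.icd" "$vendors/amdocl64.icd"

# Disable the Intel provider:
mv "$vendors/intel64.icd" "$vendors/intel64.icd.disabled"

ls "$vendors"
```

To restore the provider, move the file back to its original `.icd` name.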
From: DRC <dco...@us...> - 2016-09-29 21:05:09
It's impossible to say without knowing more.  It could be a bug in
TigerVNC, or the window manager, or VirtualGL, or the nVidia drivers.
Unfortunately, there's no way to know unless you can somehow narrow down
the possibilities for me:

-- Are you running the window manager using VirtualGL, or are you using
TigerVNC's built-in (software) OpenGL implementation to run the WM?  If
you're using the latter, then this isn't a VirtualGL bug, and you should
report it to Cendio.

-- Can you ascertain whether any particular application triggers the
bug, or does it seem to occur irrespective of the applications that have
been run?  I'm asking this to attempt to determine whether the window
manager itself is generating the corrupt pixels or whether they are
somehow triggered by running a 3D application.  If you are running the
WM in VirtualGL and then trying to run a 3D application within the WM,
then I could envision there being a possible conflict between the two.
Note that if you're running the WM in VirtualGL, you shouldn't use
vglrun to launch 3D applications (because VirtualGL is already preloaded
into everything launched from within the WM).

-- Can you ascertain whether the issue occurs with TurboVNC as well?
Use TurboVNC 2.1 and the -3dwm switch to load the window manager in
VirtualGL.  If that configuration also fails, then it would point to
this being a bug in either VirtualGL or the nVidia drivers.

Also, we generally recommend using MATE rather than GNOME 3.  It's much
more VNC-friendly.

On 9/15/16 1:48 PM, Carsten Rose wrote:
> Dear all,
>
> I'm new to VirtualGL and am trying to use it in a Cendio/ThinLinc
> environment.  The first tests (VirtualGL 2.5) have been _quite_
> impressive using GNOME 3 or Compiz/Unity (Ubuntu 16.04, ThinLinc 4.6).
> Thanks a lot for this superb software - in fact, I couldn't believe
> that a free solution exists for sharing GPU(s) among several desktop
> sessions.  Fantastic.
>
> For our tests, we use an NVIDIA Corporation GT218 [GeForce 210].  We
> installed two of them, but are using only one at the moment.
>
> After a few days, some desktop sessions start to show garbled
> desktops, typically for less than a second.  'Garbled' means a lot of
> pixels (not all) are wrong.  Moving the mouse or a window fixes the
> broken display - but only until it happens again (sometimes after 10
> seconds, sometimes after hours).  This appears neither regularly nor
> in every desktop session (one of three daily users never saw this
> effect).  Killing the session and logging back in doesn't change
> anything.  We see no error messages indicating a problem.  The NVIDIA
> monitoring shows:
>
> $ nvidia-smi
> Thu Sep 15 20:39:59 2016
> +------------------------------------------------------+
> | NVIDIA-SMI 340.96     Driver Version: 340.96         |
> |-------------------------------+----------------------+----------------------+
> | GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
> | Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
> |===============================+======================+======================|
> |   0  GeForce 210         Off  | 0000:81:00.0     N/A |                  N/A |
> | N/A   36C    P8    N/A /  N/A |    935MiB /  1023MiB |     N/A      Default |
> +-------------------------------+----------------------+----------------------+
> |   1  GeForce 210         Off  | 0000:82:00.0     N/A |                  N/A |
> | N/A   35C    P8    N/A /  N/A |      2MiB /  1023MiB |     N/A      Default |
> +-------------------------------+----------------------+----------------------+
>
> +-----------------------------------------------------------------------------+
> | Compute processes:                                               GPU Memory |
> |  GPU       PID  Process name                                     Usage      |
> |=============================================================================|
> |    0            Not Supported                                               |
> |    1            Not Supported                                               |
> +-----------------------------------------------------------------------------+
>
> Any ideas or hints about the cause, or how to investigate further?
>
> Thanks a lot for your time.
>
> CU,
> Carsten
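An editorial aside on the "don't use vglrun inside a VirtualGL-run WM" point: you can check from within a session whether VirtualGL's interposer is already preloaded into your shell by inspecting LD_PRELOAD. The sketch below simply matches on 'faker', since the library name varies by VirtualGL version (librrfaker.so in the 2.x series, libvglfaker.so in later releases).

```shell
# Check whether this shell already has VirtualGL's interposer preloaded.
# If it does, launch 3D apps directly; if not, use vglrun.
if printf '%s' "${LD_PRELOAD:-}" | grep -q 'faker'; then
  echo "VirtualGL interposed: launch 3D apps directly"
else
  echo "not interposed: use vglrun"
fi
```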