Andy Isaacson wrote:
> On Thu, Mar 23, 2000 at 10:46:07AM -0700, Brian Paul wrote:
> > I've just checked in a change to xc/lib/GL/dri/dri_glx.c in order to
> > circumvent the 6-second delay problem and the printing of the
> > "_X11TransSocketUNIXConnect: Can't connect: errno = 111" message.
> > There's a new env var, LIBGL_MULTIHEAD. If it's defined, libGL will
> > probe for all X servers in order to collect driver extension info.
> > If it's not defined, only server :0 will be addressed.
> Before I looked at the code, I hoped you meant "only the server
> denoted by the DISPLAY environment variable".
> Looking at the code, I see that's not the case. That strikes me as a
> bad idea; shouldn't the code only attempt to contact an explicitly
> listed set of displays?
> The current code will be very annoying on systems with more than one
> running X server with verbose logging on (where the X server prints a
> warning about every client it refuses a connection to). IRIX X
> servers ship configured that way. (But it looks like XFree86 doesn't
> support this mode of operation, so perhaps it's not a big deal. See
> xc/programs/Xserver/os/connection.c:AuthAudit(). My IRIX box says
> Xsgi0: AUDIT: Thu Mar 23 13:11:25 2000: 44636 Xsgi: client 18
> rejected from IP 220.127.116.11 port 4165.)
The point is to query all of the _local_ X servers' 3D drivers to let
them register new extensions with libGL.so for direct rendering.
Even if the DISPLAY env var points to a remote display, that doesn't
preclude a GL client app from opening a local connection.