From: Felix <fx...@gm...> - 2003-07-29 20:23:22
|
Hi,

as I'm going to clean up vsync related stuff on the config-0-0-1-branch I read the code for dynamic glx extension registration in xc/lib/GL/dri/dri_glx.c and xc/lib/GL/glx/glxextensions.[ch]. I stumbled over this comment in front of __glXRegisterExtensions:

    ** In older versions of libGL (prior to October 2002) we _always_
    ** called this function during libGL start-up.  Now, we only call
    ** it from glXGetProcAddress() as a last resort.

However, __glXRegisterExtensions is still called in driCreateDisplay. Hmm, on the other hand I found this comment in radeon_screen.c in front of __driRegisterExtensions:

    /* This function is called by libGL.so as soon as libGL.so is loaded.
     * This is where we'd register new extension functions with the dispatcher.

Do the __driRegisterExtensions functions in the drivers rely on being called during initialisation?

In fact I believe it could be dangerous if __driRegisterExtensions was called later, as it may override extensions disabled in e.g. CreateContext due to lacking hardware support. Fortunately __glXRegisterExtensions returns immediately if it is called a second or later time. Maybe it's just a matter of updating a few comments after all.

Regards,
Felix

------------
Felix Kühling <fx...@gm...>
You can do anything, just not everything at the same time.
|
From: Keith W. <ke...@tu...> - 2003-07-29 20:31:53
|
Felix Kühling wrote:
> Hi,
>
> as I'm going to clean up vsync related stuff on the config-0-0-1-branch
> I read the code for dynamic glx extension registration in
> xc/lib/GL/dri/dri_glx.c and xc/lib/GL/glx/glxextensions.[ch]. I stumbled
> over this comment in front of __glXRegisterExtensions:
>
> ** In older versions of libGL (prior to October 2002) we _always_
> ** called this function during libGL start-up.  Now, we only call
> ** it from glXGetProcAddress() as a last resort.
>
> However, __glXRegisterExtensions is still called in driCreateDisplay.
> Hmm, on the other hand I found this comment in radeon_screen.c in front
> of __driRegisterExtensions:
>
> /* This function is called by libGL.so as soon as libGL.so is loaded.
>  * This is where we'd register new extension functions with the dispatcher.
>
> Do the __driRegisterExtensions functions in the drivers rely on being
> called during initialisation?
>
> In fact I believe it could be dangerous if __driRegisterExtensions was
> called later as it may override extensions disabled in e.g.
> CreateContext due to lacking hardware support. Fortunately
> __glXRegisterExtensions returns immediately if it is called the second
> or later time. Maybe it's just a matter of updating a few comments after
> all.

__driRegisterExtensions is used to add named entrypoints to the dispatch table, which can then be retrieved with glXGetProcAddress, etc. It doesn't "turn on" extensions in the sense that they are added to the extensions string managed by mesa/src/extensions.c.

Keith
|
From: Ian R. <id...@us...> - 2003-07-29 20:59:11
|
Felix Kühling wrote:
> Hi,
>
> as I'm going to clean up vsync related stuff on the config-0-0-1-branch
> I read the code for dynamic glx extension registration in
> xc/lib/GL/dri/dri_glx.c and xc/lib/GL/glx/glxextensions.[ch]. I stumbled
> over this comment in front of __glXRegisterExtensions:
>
> ** In older versions of libGL (prior to October 2002) we _always_
> ** called this function during libGL start-up.  Now, we only call
> ** it from glXGetProcAddress() as a last resort.
>
> However, __glXRegisterExtensions is still called in driCreateDisplay.
> Hmm, on the other hand I found this comment in radeon_screen.c in front
> of __driRegisterExtensions:
>
> /* This function is called by libGL.so as soon as libGL.so is loaded.
>  * This is where we'd register new extension functions with the dispatcher.
>
> Do the __driRegisterExtensions functions in the drivers rely on being
> called during initialisation?
>
> In fact I believe it could be dangerous if __driRegisterExtensions was
> called later as it may override extensions disabled in e.g.
> CreateContext due to lacking hardware support. Fortunately
> __glXRegisterExtensions returns immediately if it is called the second
> or later time. Maybe it's just a matter of updating a few comments after
> all.

I'm inclined to believe that the comments in dri_glx.c are just wrong. __glXRegisterExtensions has to be called before a call to glXGetProcAddress. The app can query that string via glXQueryExtensionsString long before calling glXGetProcAddress. In fact, it may never call glXGetProcAddress. I'm sure glxinfo doesn't. :)
|
From: Felix <fx...@gm...> - 2003-07-29 21:59:43
|
On Tue, 29 Jul 2003 13:58:58 -0700 Ian Romanick <id...@us...> wrote:

> Felix Kühling wrote:
> [...]
> > In fact I believe it could be dangerous if __driRegisterExtensions was
> > called later as it may override extensions disabled in e.g.
> > CreateContext due to lacking hardware support. Fortunately
> > __glXRegisterExtensions returns immediately if it is called the second
> > or later time. Maybe it's just a matter of updating a few comments after
> > all.
>
> I'm inclined to believe that the comments in dri_glx.c are just wrong.
> __glXRegisterExtensions has to be called before a call to
> glXGetProcAddress. The app can query that string via
> glXQueryExtensionsString long before calling glXGetProcAddress. In
> fact, it may never call glXGetProcAddress. I'm sure glxinfo doesn't. :)

So this does influence which extensions are listed in the extension string, contradicting what Keith wrote? In that case I have one more question.

How can this work with multi-head configurations, where you can have multiple different cards (different screens) on one display? Then each driver will add or re-add extensions. But they should never disable any extensions, right? You don't want drivers to disable each other's extensions, do you? Consequently __glXDisableExtension should never be called (or better, not even exist ;-). And the only way to disable an extension is to not enable it. Thus, if you don't want to enable the swap-interval extensions if the hardware can't support them (no IRQs), then you have to know whether IRQs work at the time __driRegisterExtensions is called. Is that possible?

Just my thoughts. I hope I'm wrong ;-)

Felix
|
From: Ian R. <id...@us...> - 2003-07-29 23:01:35
|
Felix Kühling wrote:
> On Tue, 29 Jul 2003 13:58:58 -0700
> Ian Romanick <id...@us...> wrote:
> [...]
> So this does influence which extensions are listed in the extension
> string, contradicting what Keith wrote? In that case I have one more
> question. How can this work with multi-head configurations where you can
> have multiple different cards (different screens) on one display? Then
> each driver will add or re-add extensions. But they should never disable
> any extensions, right? You don't want drivers to disable each other's
> extensions, do you?

It influences the GLX extension string. To test this, I fired up gears under gdb. Basically, run it once from gdb, then set a break-point at __driRegisterExtensions, then run it again. That's the easiest way. Here's the back-trace from that second run:

#0  __driRegisterExtensions () at r200_screen.c:435
#1  0x400b959e in __glXRegisterExtensions () at dri_glx.c:464
#2  0x400b948f in driCreateDisplay (dpy=0x804b7a0, pdisp=0x804cd14) at dri_glx.c:394
#3  0x400a4f9f in __glXInitialize (dpy=0x804b7a0) at glxext.c:885
#4  0x400a1ae3 in glXChooseVisual (dpy=0x804b7a0, screen=0, attribList=0xbffff360) at glxcmds.c:1265
#5  0x4003ed20 in getVisualInfoRGB () from /usr/lib/libglut.so.3
#6  0x4003edca in __glutDetermineVisual () from /usr/lib/libglut.so.3
#7  0x4003ef98 in __glutDetermineWindowVisual () from /usr/lib/libglut.so.3
#8  0x4003f037 in __glutCreateWindow () from /usr/lib/libglut.so.3
#9  0x4003f391 in glutCreateWindow () from /usr/lib/libglut.so.3
#10 0x0804a2d8 in main (argc=2, argv=0xbffff5b4) at gears.c:348
#11 0x42017589 in __libc_start_main () from /lib/i686/libc.so.6

You can see that __driRegisterExtensions gets called even if glXQueryExtensionsString and glXGetProcAddress are not called.

The problem is that the GLX extensions should be tracked per-screen, but they are instead tracked per-display. This doesn't "matter" right now because we don't support the configuration that you describe (at least not as far as I know!). Each card would be its own display.

> Consequently __glXDisableExtension should never be called (or better not
> even exist ;-). And the only way to disable an extension is to not
> enable it. Thus, if you don't want to enable the swap-interval
> extensions if the hardware can't support them (no IRQs) then you have to
> know whether IRQs work at the time __driRegisterExtensions is called. Is
> that possible?

Now there's an interesting point. The bigger problem is that the driver might not have a chance to call __glXDisableExtension until *after* the app has called glXQueryExtensionsString. At that point the extension string cannot be changed. I'm not sure what the right answer is here.
|
From: Felix <fx...@gm...> - 2003-07-30 07:55:28
|
On Tue, 29 Jul 2003 16:01:22 -0700 Ian Romanick <id...@us...> wrote:

> Felix Kühling wrote:
> [...]
>
> It influences the GLX extension string. To test this, I fired up gears
> under gdb. Basically, run it once from gdb, then set a break-point at
> __driRegisterExtensions, then run it again. That's the easiest way.
> Here's the back-trace from that second run:
> [...]
> You can see that __driRegisterExtensions gets called even if
> glXQueryExtensionsString and glXGetProcAddress are not called.
>
> The problem is that the GLX extensions should be tracked per-screen, but

I see:

    C SPECIFICATION
         const char * glXQueryExtensionsString( Display *dpy,
                                                int screen )

> they are instead tracked per-display. This doesn't "matter" right now
> because we don't support the configuration that you describe (at least
> not as far as I know!). Each card would be its own display.

Maybe these configs don't work for one reason or another, but the configuration framework was designed with this in mind, and the code in dri_glx.c also handles the case of different drivers for different screens. I see two choices here: either glxextensions.c manages multiple screens itself, or the four bitfields server/client_support/only are managed in __GLXscreenConfigsRec. In either case glXGetUsableExtensions would have to be told about the screen: a screen number in the first case, or a __GLXscreenConfigsRec pointer in the second case.

> > Consequently __glXDisableExtension should never be called (or better not
> > even exist ;-). [...]
>
> Now there's an interesting point. The bigger problem is that the driver
> might not have a chance to call __glXDisableExtension until *after* the
> app has called glXQueryExtensionsString. At that point the extension
> string cannot be changed. I'm not sure what the right answer is here.

Ok, so we have to know which extensions to enable before a driver is initialized in its createScreenFunc. Sounds tough :-/

Felix
|
From: Ian R. <id...@us...> - 2003-07-30 16:20:42
|
Felix Kühling wrote:
> On Tue, 29 Jul 2003 16:01:22 -0700
> Ian Romanick <id...@us...> wrote:
> [...]
>
> I see:
>
>     C SPECIFICATION
>          const char * glXQueryExtensionsString( Display *dpy,
>                                                 int screen )

I don't mean what the GLX specification says to do. I mean what our code actually implements. Internally there is a *single* *global* table & extension string. So it's not even tracking it per-display. It's worse than that. :(

> > they are instead tracked per-display. This doesn't "matter" right now
> > because we don't support the configuration that you describe (at least
> > not as far as I know!). Each card would be its own display.
>
> Maybe these configs don't work for one reason or another, but the
> configuration framework was designed with this in mind and also the code
> in dri_glx.c handles the case of different drivers for different
> screens. I see two choices here, either glxextensions.c manages multiple
> screens itself or the four bitfields server/client_support/only are
> managed in __GLXscreenConfigsRec. In either case glXGetUsableExtensions
> would have to be told about the screen. A screen number in the first
> case or a __GLXscreenConfigsRec pointer in the second case.

Since glXGetUsableExtensions is only called from glXQueryExtensionsString (glxcmds.c, line 1416), that should be an easy change to make.

The bit-fields and next_bit would have to be copied to the __GLXscreenConfigsRec. We'd still want the global copies to track the initial state. We'd also need to add an ext_list to the __GLXscreenConfigsRec to track extensions added by calling __glXAddExtension.

> > > Consequently __glXDisableExtension should never be called (or better not
> > > even exist ;-). [...]
> >
> > Now there's an interesting point. The bigger problem is that the driver
> > might not have a chance to call __glXDisableExtension until *after* the
> > app has called glXQueryExtensionsString. At that point the extension
> > string cannot be changed. I'm not sure what the right answer is here.
>
> Ok, so we have to know which extensions to enable before a driver is
> initialized in its createScreenFunc. Sounds tough :-/

Agreed. I think for the most part we can enable the extensions, but just have them fail when used. That's a little better than nothing.
|
From: Felix <fx...@gm...> - 2003-07-30 22:03:54
|
On Wed, 30 Jul 2003 09:20:28 -0700 Ian Romanick <id...@us...> wrote:

> Felix Kühling wrote:
> > I see:
> >
> >     C SPECIFICATION
> >          const char * glXQueryExtensionsString( Display *dpy,
> >                                                 int screen )
>
> I don't mean what the GLX specification says to do. I mean what our
> code actually implements. Internally there is a *single* *global* table
> & extension string. So it's not even tracking it per-display. It's
> worse than that. :(

Yeah, I was just pointing out how extension tracking is specified.

> [...]
>
> Since glXGetUsableExtensions is only called from
> glXQueryExtensionsString (glxcmds.c, line 1416), that should be an easy
> change to make.

It gets more complicated with __glXEnableExtension. If it has to access per-screen extension information it would need some sort of a screen parameter too. As it's called by the driver, this is a binary compatibility problem. Furthermore it is called from __driRegisterExtensions, which doesn't know the screen itself.

The quick and dirty solution would be a global screen pointer that indicates the screen currently being configured.

A more invasive but more elegant solution is this: I observed that glXQueryExtensionsString calls glXInitialize first, which in turn loads and initializes the dri drivers (calls their createScreen functions). Thus, before an extension string is returned all drivers are initialized. So why not register extensions in the driver's createScreen function? The only reason I can see is the call to glXRegisterExtensions in glXGetProcAddress. Is there a good reason for not calling glXInitialize in glXGetProcAddress instead? If not, then this would kill two birds with one stone. We would know which screen we're dealing with when glXEnableExtension is called, and we could enable extensions conditionally, depending on hardware support. We would have to add a new version of glXEnableExtension which takes a pointer to the GLXscreenConfigsRec or __DRIscreenRec (if we move the extension-related data structures there).

> The bit-fields and next_bit would have to be copied to the
> __GLXscreenConfigsRec. We'd still want the global copies to track the
> initial state.

Isn't that just memset to 0?

> We'd also need to add an ext_list to the
> __GLXscreenConfigsRec to track extensions added by calling
> __glXAddExtension.

Ok.

> > Ok, so we have to know which extensions to enable before a driver is
> > initialized in its createScreenFunc. Sounds tough :-/
>
> Agreed. I think for the most part we can enable the extensions, but
> just have them fail when used. That's a little better than nothing.

Maybe we can do better than that. See above :)

Regards,
Felix
|
From: Ian R. <id...@us...> - 2003-07-30 23:01:11
|
Felix Kühling wrote:
> On Wed, 30 Jul 2003 09:20:28 -0700
> Ian Romanick <id...@us...> wrote:
> [...]
>
> It gets more complicated with __glXEnableExtension. If it has to access
> per-screen extension information it would need some sort of a screen
> parameter too. As it's called by the driver, this is a binary
> compatibility problem. Furthermore it is called from
> __driRegisterExtensions which doesn't know the screen itself.

It is a binary compatibility problem, but a minor one. Since no code with __glXEnableExtension has ever shipped with XFree86 (stable release or their CVS), our exposure is pretty low. Low enough that I wouldn't worry about it much. There is a pre-texmem code-path that was used by the R200 driver that needs to be maintained. I'm not sure how to keep that working.

> The quick and dirty solution would be a global screen pointer that
> indicates the screen currently being configured.
>
> A more invasive but more elegant solution is this:
>
> I observed that glXQueryExtensionsString calls glXInitialize first which
> in turn loads and initializes the dri drivers (calls their createScreen
> functions). Thus, before an extension string is returned all drivers are
> initialized. So why not register extensions in the driver's createScreen
> function? The only reason I can see is the call to glXRegisterExtensions
> in glXGetProcAddress. Is there a good reason for not calling
> glXInitialize in glXGetProcAddress instead?

That's a really good idea. I think that solves most of the problems. Keith, do you have a problem with that change?

> If not then this would kill two birds with one stone. We would know
> which screen we're dealing with when glXEnableExtension is called and we
> could enable extensions conditionally, depending on hardware support. We
> would have to add a new version of glXEnableExtension which takes a
> pointer to the GLXscreenConfigsRec or __DRIscreenRec (if we move the
> extension-related data structures there).
>
> > The bit-fields and next_bit would have to be copied to the
> > __GLXscreenConfigsRec. We'd still want the global copies to track the
> > initial state.
>
> Isn't that just memset to 0?

No. It pulls some state from an internal table. Some extensions (like GLX_ARB_get_proc_address and GLX_EXT_visual_info) are enabled by default. Since this list of "on by default" extensions may change, the driver needs to pull in that initial state.

> > We'd also need to add an ext_list to the
> > __GLXscreenConfigsRec to track extensions added by calling
> > __glXAddExtension.
>
> Ok.
|
From: Keith W. <ke...@tu...> - 2003-07-30 23:09:43
|
Ian Romanick wrote:
> Felix Kühling wrote:
> [...]
> > It gets more complicated with __glXEnableExtension. If it has to access
> > per-screen extension information it would need some sort of a screen
> > parameter too. As it's called by the driver, this is a binary
> > compatibility problem. Furthermore it is called from
> > __driRegisterExtensions which doesn't know the screen itself.
>
> It is a binary compatibility problem, but a minor one. [...]
>
> > The quick and dirty solution would be a global screen pointer that
> > indicates the screen currently being configured.
> >
> > A more invasive but more elegant solution is this:
> >
> > I observed that glXQueryExtensionsString calls glXInitialize first which
> > in turn loads and initializes the dri drivers (calls their createScreen
> > functions). Thus, before an extension string is returned all drivers are
> > initialized. So why not register extensions in the driver's createScreen
> > function? The only reason I can see is the call to glXRegisterExtensions
> > in glXGetProcAddress. Is there a good reason for not calling
> > glXInitialize in glXGetProcAddress instead?
>
> That's a really good idea. I think that solves most of the problems.
> Keith, do you have a problem with that change?

Not off the top of my head.

It's worth asking Brian about this, as he's had greater involvement in those paths than I.

Keith
|
From: Marcelo E. M. <mma...@de...> - 2003-07-30 07:28:25
|
> > I'm inclined to believe that the comments in dri_glx.c are just
> > wrong.  __glXRegisterExtensions has to be called before a call to
> > glXGetProcAddress.  The app can query that string via
> > glXQueryExtensionsString long before calling glXGetProcAddress.  In
> > fact, it may never call glXGetProcAddress.  I'm sure glxinfo
> > doesn't. :)

On the other hand, the app can call glXGetProcAddress without opening a
display or creating a context.  If you follow the letter of the spec the
only way to be sure that an extension is supported is by inspecting the
extensions string, and for that you need a display.

> So this does influence which extensions are listed in the extension
> string, contradicting what Keith wrote? In that case I have one more
> question. How can this work with multi-head configurations where you
> can have multiple different cards (different screens) on one display.
> Then each driver will add or readd extensions. But they should never
> disable any extensions, right? You don't want drivers to disable each
> others extensions, do you?

Intuitively glGetString(GL_EXTENSIONS) should return the list of the
extensions supported on the context that's current.  That doesn't say
anything about the extension being supported in hardware or software,
which is, I think, the point of your question.  That means that if the
driver supports the extension and the extension is not precluded by the
current context, the extension should be reported as being supported.  I
remember someone on this list mentioning that the NVIDIA drivers do
something fuzzy in this respect.

OTOH, if you are talking about GLX extensions, most don't make sense
without hardware support.

 Marcelo |
From: Ian R. <id...@us...> - 2003-07-30 16:26:32
|
Marcelo E. Magallon wrote:
> > > I'm inclined to believe that the comments in dri_glx.c are just
> > > wrong.  __glXRegisterExtensions has to be called before a call to
> > > glXGetProcAddress.  The app can query that string via
> > > glXQueryExtensionsString long before calling glXGetProcAddress.  In
> > > fact, it may never call glXGetProcAddress.  I'm sure glxinfo
> > > doesn't. :)
>
> On the other hand, the app can call glXGetProcAddress without opening
> a display or creating a context.  If you follow the letter of the spec
> the only way to be sure that an extension is supported is by
> inspecting the extensions string, and for that you need a display.
>
> > So this does influence which extensions are listed in the extension
> > string, contradicting what Keith wrote? In that case I have one more
> > question. How can this work with multi-head configurations where you
> > can have multiple different cards (different screens) on one display.
> > Then each driver will add or readd extensions. But they should never
> > disable any extensions, right? You don't want drivers to disable
> > each others extensions, do you?
>
> Intuitively glGetString(GL_EXTENSIONS) should return the list of the
> extensions supported on the context that's current.

For GLX extensions glXQueryExtensionsString is used.  All this function
gets is a Display pointer and a screen number.

> That doesn't say
> anything about the extension being supported in hardware or software,
> which is I think the point of your question.  That means that if the
> driver supports the extension and the extension is not precluded by
> the current context the extension should be reported as being
> supported.  I remember someone on this list mentioning that the NVIDIA
> drivers do something fuzzy in this respect.
>
> OTOH, if you are talking about GLX extensions, most don't make sense
> without hardware support.

Right.

As an example, we don't want the R200 driver to expose
GLX_MESA_swap_control if there's no vblank interrupt available.  Since
it can't implement the functionality, we don't want to advertise it. |
From: Michel <mi...@da...> - 2003-07-30 21:30:04
|
On Wed, 2003-07-30 at 18:26, Ian Romanick wrote:
>
> [...] we don't want the R200 driver to expose GLX_MESA_swap_control
> if there's no vblank interrupt available.  Since it can't implement
> the functionality, we don't want to advertise it.

It could poll for vertical blank...

--
Earthling Michel Dänzer   \  Debian (powerpc), XFree86 and DRI developer
Software libre enthusiast  \     http://svcs.affero.net/rm.php?r=daenzer |
From: Alan C. <al...@lx...> - 2003-07-30 23:10:28
|
On Mer, 2003-07-30 at 22:28, Michel Dänzer wrote:
> On Wed, 2003-07-30 at 18:26, Ian Romanick wrote:
> >
> > [...] we don't want the R200 driver to expose GLX_MESA_swap_control
> > if there's no vblank interrupt available.  Since it can't implement
> > the functionality, we don't want to advertise it.
>
> It could poll for vertical blank...

I'm surprised to hear the radeon has no vblank interrupt, given the test
VGA blank code seems to work on it 8) |
From: Ian R. <id...@us...> - 2003-07-31 02:10:55
|
Alan Cox wrote:
> On Mer, 2003-07-30 at 22:28, Michel Dänzer wrote:
>
>> On Wed, 2003-07-30 at 18:26, Ian Romanick wrote:
>>
>>> [...] we don't want the R200 driver to expose GLX_MESA_swap_control
>>> if there's no vblank interrupt available.  Since it can't implement
>>> the functionality, we don't want to advertise it.
>>
>> It could poll for vertical blank...
>
> I'm surprised to hear the radeon has no vblank interrupt, given the
> test VGA blank code seems to work on it 8)

It does.  This is more hypothetically speaking, and just using this as
an example of an extension that we may not always be able to enable. |
From: Jens O. <je...@tu...> - 2003-07-30 15:35:34
|
Felix Kühling wrote:
> On Tue, 29 Jul 2003 13:58:58 -0700
> Ian Romanick <id...@us...> wrote:
>
>> Felix Kühling wrote:
>>
>>> Hi,
>>>
>>> as I'm going to clean up vsync related stuff on the config-0-0-1-branch
>>> I read the code for dynamic glx extension registration in
>>> xc/lib/GL/dri/dri_glx.c and xc/lib/GL/glx/glxextensions.[ch]. I
>>> stumbled over this comment in front of __glXRegisterExtensions:
>>>
>>> ** In older versions of libGL (prior to October 2002) we _always_
>>> ** called this function during libGL start-up.  Now, we only call
>>> ** it from glXGetProcAddress() as a last resort.
>>>
>>> However, __glXRegisterExtensions is still called in driCreateDisplay.
>>> Hmm, on the other hand I found this comment in radeon_screen.c in
>>> front of __driRegisterExtensions:
>>>
>>> /* This function is called by libGL.so as soon as libGL.so is loaded.
>>>  * This is where we'd register new extension functions with the
>>>  * dispatcher.
>>>
>>> Do the __driRegisterExtensions functions in the drivers rely on being
>>> called during initialisation?
>>>
>>> In fact I believe it could be dangerous if __driRegisterExtensions
>>> was called later as it may override extensions disabled in e.g.
>>> CreateContext due to lacking hardware support. Fortunately
>>> __glXRegisterExtensions returns immediately if it is called the
>>> second or later time. Maybe it's just a matter of updating a few
>>> comments after all.
>>
>> I'm inclined to believe that the comments in dri_glx.c are just wrong.
>> __glXRegisterExtensions has to be called before a call to
>> glXGetProcAddress.  The app can query that string via
>> glXQueryExtensionsString long before calling glXGetProcAddress.  In
>> fact, it may never call glXGetProcAddress.  I'm sure glxinfo
>> doesn't. :)
>
> So this does influence which extensions are listed in the extension
> string, contradicting what Keith wrote? In that case I have one more
> question. How can this work with multi-head configurations where you
> can have multiple different cards (different screens) on one display.
> Then each driver will add or readd extensions. But they should never
> disable any extensions, right? You don't want drivers to disable each
> others extensions, do you?
>
> Consequently __glXDisableExtension should never be called (or better
> not even exist ;-). And the only way to disable an extension is to not
> enable it. Thus, if you don't want to enable the swap-interval
> extensions if the hardware can't support them (no IRQs) then you have
> to know whether IRQs work at the time __driRegisterExtensions is
> called. Is that possible?
>
> Just my thoughts. I hope I'm wrong ;-)

Felix,

Keep in mind that regular X11 extensions (which the DRI itself uses) are
on a per-server basis and apply to all displays on the server.
Consequently, for heads that don't support direct rendering we still
advertise the DRI extension, but upon a second-level query a 3D
client-side driver would find that extension disabled.

OpenGL extensions need something different as they may only be supported
for specific visual configurations... but I can't remember how this is
handled off the top of my head.

-- /\ Jens Owen / \/\ _ je...@tu... / \ \ \ Steamboat Springs, Colorado |
From: Jon S. <jon...@ya...> - 2003-07-30 19:19:26
|
The dynamic extension code has made the radeon drivers dependent on the
GLX subsystem. I just imported the current r200 driver into my version
of the embedded Mesa system. Embedded Mesa doesn't have a GLX subsystem.
Neither does DirectFB.

What do you think about reworking these functions to depend on minimal
new definitions in dri_util.h and then adding code at a higher level to
map from GLX to the dri-based interface?

Here's another case where I need to replace __driCreateScreen with one
that doesn't depend on X headers.

#ifndef _SOLO
void *__driCreateScreen(Display *dpy, int scrn, __DRIscreen *psc,
                        int numConfigs, __GLXvisualConfig *config)
{
    __DRIscreenPrivate *psp;
    psp = __driUtilCreateScreen(dpy, scrn, psc, numConfigs, config,
                                &r200API);
    return (void *) psp;
}
#else
void *__driCreateScreen(struct DRIDriverRec *driver,
                        struct DRIDriverContextRec *driverContext)
{
    __DRIscreenPrivate *psp;
    psp = __driUtilCreateScreen(driver, driverContext, &r200API);
    return (void *) psp;
}
#endif

=====
Jon Smirl
jon...@ya... |
From: Ian R. <id...@us...> - 2003-07-30 23:02:46
|
Michel Dänzer wrote:
> On Wed, 2003-07-30 at 18:26, Ian Romanick wrote:
>
>> [...] we don't want the R200 driver to expose GLX_MESA_swap_control
>> if there's no vblank interrupt available.  Since it can't implement
>> the functionality, we don't want to advertise it.
>
> It could poll for vertical blank...

I thought about that.  The problem is that the swap_control extensions
say "only swap after at least N refreshes have happened."  Without
interrupts, how do we know how many refreshes have happened between any
two points of time outside the poll-loop?  If there's a way, then we can
fall back to that. |
From: Michel <mi...@da...> - 2003-07-31 12:57:30
|
On Thu, 2003-07-31 at 01:02, Ian Romanick wrote:
> Michel Dänzer wrote:
>
> > On Wed, 2003-07-30 at 18:26, Ian Romanick wrote:
> >
> >> [...] we don't want the R200 driver to expose GLX_MESA_swap_control
> >> if there's no vblank interrupt available.  Since it can't implement
> >> the functionality, we don't want to advertise it.
> >
> > It could poll for vertical blank...
>
> I thought about that.  The problem is that the swap_control extensions
> say "only swap after at least N refreshes have happened."  Without
> interrupts, how do we know how many refreshes have happened between
> any two points of time outside the poll-loop?  If there's a way, then
> we can fall back to that.

The CRTC_CRNT_FRAME register could work?  The counter is only 21 bits
though.

--
Earthling Michel Dänzer   \  Debian (powerpc), XFree86 and DRI developer
Software libre enthusiast  \     http://svcs.affero.net/rm.php?r=daenzer |
From: Brian P. <br...@tu...> - 2003-07-31 16:09:13
|
Keith Whitwell wrote:
> Ian Romanick wrote:
>
>> Felix Kühling wrote:
>>
>>> On Wed, 30 Jul 2003 09:20:28 -0700
>>> Ian Romanick <id...@us...> wrote:
>>>
>>>> Felix Kühling wrote:
>>>>
>>>>> I see: C SPECIFICATION
>>>>>        const char * glXQueryExtensionsString( Display *dpy,
>>>>>                                               int screen )
>>>>
>>>> I don't mean what the GLX specification says to do.  I mean what our
>>>> code actually implements.  Internally there is a *single* *global*
>>>> table & extension string.  So it's not even tracking it per-display.
>>>> It's worse than that. :(
>>>
>>> Yeah, I was just pointing out how extension tracking is specified.
>>>
>>>>>> they are instead tracked per-display.  This doesn't "matter" right
>>>>>> now because we don't support the configuration that you describe
>>>>>> (at least not as far as I know!).  Each card would be its own
>>>>>> display.
>>>>>
>>>>> Maybe these configs don't work for one reason or another, but the
>>>>> configuration framework was designed with this in mind and also the
>>>>> code in dri_glx.c handles the case of different drivers for
>>>>> different screens. I see two choices here, either glxextensions.c
>>>>> manages multiple screens itself or the four bitfields
>>>>> server/client_support/only are managed in __GLXscreenConfigsRec. In
>>>>> either case glXGetUsableExtensions would have to be told about the
>>>>> screen. A screen number in the first case or a
>>>>> __GLXscreenConfigsRec pointer in the second case.
>>>>
>>>> Since glXGetUsableExtensions is only called from
>>>> glXQueryExtensionsString (glxcmds.c, line 1416), that should be an
>>>> easy change to make.
>>>
>>> It gets more complicated with __glXEnableExtension. If it has to
>>> access per-screen extension information it would need some sort of a
>>> screen parameter too. As it's called by the driver, this is a binary
>>> compatibility problem. Furthermore it is called from
>>> __driRegisterExtensions which doesn't know the screen itself.
>>
>> It is a binary compatibility problem, but a minor one.  Since no code
>> with __glXEnableExtension has ever shipped with XFree86 (stable
>> release or their CVS), our exposure is pretty low.  Low enough that I
>> wouldn't worry about it much.  There is a pre-texmem code-path that
>> was used by the R200 driver that needs to be maintained.  I'm not sure
>> how to keep that working.
>>
>>> The quick and dirty solution would be a global screen pointer that
>>> indicates the screen currently being configured.
>>>
>>> A more invasive but more elegant solution is this:
>>>
>>> I observed that glXQueryExtensionsString calls glXInitialize first
>>> which in turn loads and initializes the dri drivers (calls their
>>> createScreen functions). Thus, before an extension string is returned
>>> all drivers are initialized. So why not register extensions in the
>>> driver's createScreen function? The only reason I can see is the call
>>> to glXRegisterExtensions in glXGetProcAddress. Is there a good reason
>>> for not calling glXInitialize in glXGetProcAddress instead?
>>
>> That's a really good idea.  I think that solves most of the problems.
>> Keith, do you have a problem with that change?
>
> Not off the top of my head.
>
> It's worth asking Brian about this, as he's had greater involvement in
> those paths than I.

And Ian's made a lot of changes since I've worked in that code.  I'm not
fully up to speed on it anymore.

You can't call __glXInitialize from in glXGetProcAddress because you
don't have a Display pointer.

Earlier, Felix wrote:

> Do the __driRegisterExtensions functions in the drivers rely on
> being called during initialisation?

Yes.

The driver's __driRegisterExtensions() function can do two things:

1. Add new gl*() functions into the dispatch table.  For example, if
   libGL doesn't know anything about the GL_ARB_vertex_buffer_object
   extension but the driver really does implement it, the driver can
   plug the glBindBufferARB(), etc. functions into the dispatch table so
   the app can use that extension.

2. The driver can register/enable new glX*() functions with libGL.

In either case, this has to be done before the user gets the results of
glXGetProcAddressARB() or glXQueryExtensionsString().

Earlier, Felix wrote, and Ian followed up with:

>> Consequently __glXDisableExtension should never be called (or better
>> not even exist.  And the only way to disable an extension is to not
>> enable it. Thus, if you don't want to enable the swap-interval
>> extensions if the hardware can't support them (no IRQs) then you have
>> to know whether IRQs work at the time __driRegisterExtensions is
>> called. Is that possible?
>
> Now there's an interesting point.  The bigger problem is that the
> driver might not have a chance to call __glXDisableExtension until
> *after* the app has called glXQueryExtensionsString.  At that point the
> extension string cannot be changed.  I'm not sure what the right answer
> is here.

I don't know the answer to this either.

-Brian |
From: Felix <fx...@gm...> - 2003-07-31 20:51:57
|
On Thu, 31 Jul 2003 10:12:47 -0600 Brian Paul <br...@tu...> wrote: > Keith Whitwell wrote: > > Ian Romanick wrote: > > > >> Felix Kühling wrote: > >> > >>> I observed that glXQueryExtensionsString calls glXInitialize first which > >>> in turn loads and initializes the dri drivers (calls their createScreen > >>> functions). Thus, before an extension string is returned all drivers are > >>> initialized. So why not register extensions in the driver's createScreen > >>> function? The only reason I can see is the call to glXRegisterExtensions > >>> in glXGetProcAddress. Is there a good reason for not calling > >>> glXInitialize in glXGetProcAddress instead? > >> > >> > >> > >> That's a really good idea. I think that solves most of the problems. > >> Keith, do you have a problem with that change? > > > > > > Not off the top of my head. > > > > It's worth asking Brian about this, as he's had greater involvement in > > those paths than I. > > And Ian's made a lot of changes since I've worked in that code. I'm > not fully up to speed on it anymore. > > > You can't call __glXInitalize from in glXGetProcAddress because you > don't have a Display pointer. Right. glXRegisterExtensions doesn't need a display pointer. It loads the drivers for all displays and calls their __driRegisterExtensions methods. > Earlier, Felix wrote: > > > Do the __driRegisterExtensions functions in the drivers rely on > > being called during initialisation? > > Yes. > > The driver's __driRegisterExtensions() function can do two things: > > 1. Add new gl*() functions into the dispatch table. For example, if > libGL doesn't know anything about the GL_ARB_vertex_buffer_object > extension but the driver really does implement it, the driver can plug > in the glBindBufferARB(), etc functions into the dispatch table so the > app can use that extension. > > 2. The driver can register/enable new glX*() functions with libGL. 
> > In either case, this has to be done before the user gets the results > of glXGetProcAddressARB() or glXQueryExtensionsString(). Ok. So adding an extension basically involves two steps: 1. add some functions and tell glXGetProcAddress about them. 2. add the extension to the screen's extensions string. The new functions are globally visible and glXGetProcAddress returns the right pointer no matter if the extension is actually available as it can't know on which screen or display the function is going to be used. So we could split the process of adding an extension into the two steps above. __driRegisterExtension tells glXGetProcAddress about the new functions exported by the driver and __driCreateScreen (conditionally) adds something to the extension string of the correct screen. > > > Earlier, Felix wrote, and Ian followed up with: > > >> Consequently __glXDisableExtension should never be called (or better not > >> even exist . And the only way to disable an extension is to not > >> enable it. Thus, if you don't want to enable the swap-interval > >> extensions if the hardware can't support them (no IRQs) then you have to > >> know whether IRQs work at the time __driRegisterExtensions is called. Is > >> that possible? > > > > > > Now there's an interesting point. The bigger problem is that the > > driver might not have a chance to call __glXDisableExtension until > > *after* the app has called glXQueryExtensionsString. At that point the > > extension string cannot be changed. I'm not sure what the right answer > > is here. > > I don't know the answer to this either. Then how am I supposed to know the answer ;-). But seriously, the safest thing is probably to call glXEnableExtension only in the driver's createScreen function so that the extensions string never changes while a GLX application is running (though I don't know if the spec requires this). Then there is probably no need to ever disable an extension later. 
> > -Brian > Felix ------------ __\|/__ ___ ___ ------------------------- Felix ___\_e -_/___/ __\___/ __\_____ You can do anything, Kühling (_____\Ä/____/ /_____/ /________) just not everything fx...@gm... \___/ \___/ U at the same time. |