From: Keith W. <kei...@go...> - 2010-03-28 17:51:58
Hi,

I've just pushed a variation on a theme a couple of people have
explored in the past, i.e. an interface to gallium without an
intervening state-tracker.

The purpose of this is for writing minimal test programs to exercise
new gallium drivers in isolation from the rest of the codebase.

In fact it doesn't really make sense to say "without a state tracker",
unless you don't mind creating test programs which are specific to the
windowing system you're currently working with. Some similar work has
avoided window-system issues altogether by dumping bitmaps to files,
or using e.g. python to abstract over window systems.

This approach is a little different - I've defined a super-minimal API
for creating/destroying windows, currently calling this "graw", and we
have a tiny little co-state-tracker that each implementation provides.
This is similar to the glut approach of abstracting over window
systems, though much less complete.

It currently consists of three calls:

   struct pipe_screen *graw_init( void );
   void *graw_create_window(...);
   void graw_destroy_window( void *handle );

which are sufficient to build simple demos on top of. A future
enhancement would be to add a glut-style input handling facility.

Right now there's a single demo, "clear.c", which displays an ugly
purple box. Builds so far only with scons, using winsys=graw-xlib.

Keith
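As a rough illustration, a minimal demo on top of these three calls
might be structured like the sketch below. Note that the real
graw_create_window() parameters are elided above, so the prototype and
arguments used here are guesses, and the drawing itself is only
gestured at:

    /* Hypothetical graw demo skeleton; graw_create_window()'s real
     * signature is not shown in the post, so the one declared here
     * is a guess. */
    #include "pipe/p_screen.h"
    #include "pipe/p_context.h"

    struct pipe_screen *graw_init( void );
    void *graw_create_window( int width, int height );   /* guessed */
    void graw_destroy_window( void *handle );

    int main( void )
    {
       struct pipe_screen *screen = graw_init();
       void *win = graw_create_window( 300, 300 );
       struct pipe_context *pipe = screen->context_create( screen, NULL );

       /* ... create a render target for the window, clear it to
        * purple, flush and present - as clear.c presumably does ... */

       pipe->destroy( pipe );
       graw_destroy_window( win );
       screen->destroy( screen );
       return 0;
    }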
From: Luca B. <luc...@gm...> - 2010-03-28 19:19:47
I posted something similar some time ago, that however could use
hardware-accelerated drivers with DRI2 or KMS, provided a substantial
set of helpers and offered a complement of 3 demos.

My solution to window-system issues was to simply have the application
provide a "draw" callback to the framework, which would automatically
create a maximized window with the application name in the title, and
call draw in a loop, presenting the results.

Then I had a path that would use the X DRI2 interface if possible, and
another path that would use the Linux DRM KMS API (and initially some
EGL + ad-hoc extension paths that were later dropped).

It no longer works due to Gallium interface changes, but maybe it can
be resurrected and merged with graw.

However, there is a disadvantage to having Gallium programs in-tree:
they break every time the Gallium interface is changed, and avoiding
that means that in addition to fixing all drivers and state trackers,
you also need to fix all programs for each change.
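The framework interface described here would boil down to something
like the following sketch; all names are hypothetical, since the
original code is not shown:

    /* Hypothetical sketch of the draw-callback framework described
     * above; none of these names come from the actual code. */
    #include "pipe/p_context.h"
    #include "pipe/p_state.h"

    typedef void (*demo_draw_t)( struct pipe_context *ctx,
                                 struct pipe_surface *backbuffer );

    /* Creates a maximized window titled after app_name, then calls
     * draw() in a loop and presents each frame, choosing the X DRI2
     * path or the DRM KMS path automatically. */
    void demo_main_loop( const char *app_name, demo_draw_t draw );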
From: Corbin S. <mos...@gm...> - 2010-03-28 23:45:07
On Sun, Mar 28, 2010 at 12:19 PM, Luca Barbieri <luc...@gm...> wrote:
> I posted something similar some time ago, that however could use
> hardware-accelerated drivers with DRI2 or KMS, provided a substantial
> set of helpers and offered a complement of 3 demos.
>
> My solution to window-system issues was to simply have the application
> provide a "draw" callback to the framework, which would automatically
> create a maximized window with the application name in the title, and
> call draw in a loop, presenting the results.
>
> Then I had a path that would use the X DRI2 interface if possible, and
> another path that would use the Linux DRM KMS API (and initially some
> EGL + ad-hoc extension paths that were later dropped).
>
> It no longer works due to Gallium interface changes, but maybe it can
> be resurrected and merged with graw.
>
> However, there is a disadvantage to having Gallium programs in-tree:
> they break every time the Gallium interface is changed, and avoiding
> that means that in addition to fixing all drivers and state trackers,
> you also need to fix all programs for each change.

Presumably this will no longer be a problem when Gallium is a more
mature, stable interface. I much prefer this "try it and see" mentality
over the design-by-committee mess that has popped up elsewhere.

-- 
When the facts change, I change my mind. What do you do, sir? ~ Keynes

Corbin Simpson
<Mos...@gm...>
From: Chia-I Wu <ol...@gm...> - 2010-03-29 06:57:05
On Mon, Mar 29, 2010 at 1:51 AM, Keith Whitwell <kei...@go...> wrote:
> I've just pushed a variation on a theme a couple of people have
> explored in the past, i.e. an interface to gallium without an
> intervening state-tracker.
> The purpose of this is for writing minimal test programs to exercise
> new gallium drivers in isolation from the rest of the codebase.
> In fact it doesn't really make sense to say "without a state tracker",
> unless you don't mind creating test programs which are specific to the
> windowing system you're currently working with. Some similar work has
> avoided window-system issues altogether by dumping bitmaps to files,
> or using e.g. python to abstract over window systems.
> This approach is a little different - I've defined a super-minimal API
> for creating/destroying windows, currently calling this "graw", and we
> have a tiny little co-state-tracker that each implementation provides.
> This is similar to the glut approach of abstracting over window
> systems, though much less complete.
> It currently consists of three calls:
>    struct pipe_screen *graw_init( void );
>    void *graw_create_window(...);
>    void graw_destroy_window( void *handle );
> which are sufficient to build simple demos on top of. A future
> enhancement would be to add a glut-style input handling facility.
> Right now there's a single demo, "clear.c", which displays an ugly
> purple box. Builds so far only with scons, using winsys=graw-xlib.

I happened to be playing with the idea yesterday. My take is to define
an EGL extension, EGL_MESA_gallium. The extension defines Gallium as a
rendering API of EGL. The downside of this approach is that it depends
on st/egl. The upside is that it will work on whatever platform st/egl
supports.

I've cleaned up my work a little bit. You can find it in the
attachments. There is a port of the "clear" raw demo to use
EGL_MESA_gallium. The demo supports window resizing, and is accelerated
if a hardware EGL driver is used.

The demo renders into an X11 window. It is worth noting that, when
there is no need to render into an EGLSurface, eglCreateWindowSurface
or eglMakeCurrent is not required. To interface with X11, I've also
borrowed some code from the OpenVG demos and renamed it to EGLUT.

-- 
ol...@Lu...
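The extension itself is only in the attachments, so its entry points
are not public; but usage would presumably look something like this
sketch, where the entry-point name is invented for illustration:

    /* Hypothetical EGL_MESA_gallium usage; the function name queried
     * below is a guess, not a published symbol. */
    #include <EGL/egl.h>

    struct pipe_screen;

    typedef struct pipe_screen *(*GETGALLIUMSCREENPROC)( EGLDisplay dpy );

    static struct pipe_screen *
    get_gallium_screen( EGLDisplay dpy )
    {
       GETGALLIUMSCREENPROC get_screen = (GETGALLIUMSCREENPROC)
          eglGetProcAddress( "eglGetGalliumScreenMESA" );
       return get_screen ? get_screen( dpy ) : NULL;
    }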
From: Keith W. <ke...@vm...> - 2010-03-30 17:08:32
On Sun, 2010-03-28 at 23:56 -0700, Chia-I Wu wrote:
> On Mon, Mar 29, 2010 at 1:51 AM, Keith Whitwell
> <kei...@go...> wrote:
> > I've just pushed a variation on a theme a couple of people have
> > explored in the past, i.e. an interface to gallium without an
> > intervening state-tracker.
> > The purpose of this is for writing minimal test programs to exercise
> > new gallium drivers in isolation from the rest of the codebase.
> > In fact it doesn't really make sense to say "without a state tracker",
> > unless you don't mind creating test programs which are specific to the
> > windowing system you're currently working with. Some similar work has
> > avoided window-system issues altogether by dumping bitmaps to files,
> > or using e.g. python to abstract over window systems.
> > This approach is a little different - I've defined a super-minimal API
> > for creating/destroying windows, currently calling this "graw", and we
> > have a tiny little co-state-tracker that each implementation provides.
> > This is similar to the glut approach of abstracting over window
> > systems, though much less complete.
> > It currently consists of three calls:
> > struct pipe_screen *graw_init( void );
> > void *graw_create_window(...);
> > void graw_destroy_window( void *handle );
> > which are sufficient to build simple demos on top of. A future
> > enhancement would be to add a glut-style input handling facility.
> > Right now there's a single demo, "clear.c", which displays an ugly
> > purple box. Builds so far only with scons, using winsys=graw-xlib.
> I happened to be playing with the idea yesterday. My take is to define
> an EGL extension, EGL_MESA_gallium. The extension defines Gallium as a
> rendering API of EGL. The downside of this approach is that it depends
> on st/egl. The upside is that it will work on whatever platform st/egl
> supports.
>
> I've cleaned up my work a little bit. You can find it in the
> attachments. There is a port of the "clear" raw demo to use
> EGL_MESA_gallium. The demo supports window resizing, and is accelerated
> if a hardware EGL driver is used.
>
> The demo renders into an X11 window. It is worth noting that, when
> there is no need to render into an EGLSurface, eglCreateWindowSurface
> or eglMakeCurrent is not required. To interface with X11, I've also
> borrowed some code from the OpenVG demos and renamed it to EGLUT.

I'm not sure how far to take any of these "naked" gallium approaches.

My motivation was to build something to provide a very controlled
environment for bringup of new drivers - basically getting to the first
triangle and not much further. After that, existing state trackers with
stable ABIs are probably preferable.

Keith
From: Luca B. <luc...@gm...> - 2010-03-30 23:24:41
An interesting option could be to provide a DirectX 10 implementation
using TGSI text as the shader interface, which should be much easier
than one would think at first.

DirectX 10 + TGSI text would provide a very thin, binary-compatible
layer over Gallium, unlike all existing state trackers.

It could even run Windows games if integrated with Wine and something
producing TGSI from either HLSL text or D3D10 bytecode (e.g. whatever
Wine uses to produce GLSL, plus the Mesa GLSL frontend, plus
st_mesa_to_tgsi).

In fact, given the Gallium architecture, it may even make sense to
support a variant of DirectX 10 as the main Mesa/Gallium API on all
platforms, instead of OpenGL.
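To make the "TGSI text as the shader interface" idea concrete: a
pass-through fragment shader in TGSI text form looks like the sketch
below, and the auxiliary tgsi module can already parse it into binary
tokens (the token-array size here is arbitrary):

    #include "pipe/p_compiler.h"
    #include "tgsi/tgsi_text.h"

    /* A pass-through fragment shader in TGSI text form - the kind of
     * shader source such a layer would accept directly. */
    static const char fs_text[] =
       "FRAG\n"
       "DCL IN[0], COLOR, LINEAR\n"
       "DCL OUT[0], COLOR\n"
       "  0: MOV OUT[0], IN[0]\n"
       "  1: END\n";

    static struct tgsi_token tokens[1024];

    static boolean parse_shader( void )
    {
       /* Translate the text into the binary TGSI tokens that pipe
        * drivers consume. */
       return tgsi_text_translate( fs_text, tokens, 1024 );
    }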
From: Chia-I Wu <ol...@gm...> - 2010-03-31 08:10:42
On Wed, Mar 31, 2010 at 12:52 AM, Keith Whitwell <ke...@vm...> wrote:
> On Sun, 2010-03-28 at 23:56 -0700, Chia-I Wu wrote:
>> I happened to be playing with the idea yesterday. My take is to define
>> an EGL extension, EGL_MESA_gallium. The extension defines Gallium as a
>> rendering API of EGL. The downside of this approach is that it depends
>> on st/egl. The upside is that it will work on whatever platform st/egl
>> supports.
>>
>> I've cleaned up my work a little bit. You can find it in the
>> attachments. There is a port of the "clear" raw demo to use
>> EGL_MESA_gallium. The demo supports window resizing, and is accelerated
>> if a hardware EGL driver is used.
>>
>> The demo renders into an X11 window. It is worth noting that, when
>> there is no need to render into an EGLSurface, eglCreateWindowSurface
>> or eglMakeCurrent is not required. To interface with X11, I've also
>> borrowed some code from the OpenVG demos and renamed it to EGLUT.
> I'm not sure how far to take any of these "naked" gallium approaches.
> My motivation was to build something to provide a very controlled
> environment for bringup of new drivers - basically getting to the first
> triangle and not much further. After that, existing state trackers with
> stable ABIs are probably preferable.

Ok. The benefit of using st/egl is that you get to see the results on
the screen. pipe_screen::flush_frontbuffer is usually not implemented
by hw pipe drivers. But I guess that is minor for bring-up of new
drivers.

-- 
ol...@Lu...
From: Miles B. <mi...@gn...> - 2010-03-31 04:45:19
Luca Barbieri <luc...@gm...> writes:
> In fact, given the Gallium architecture, it may even make sense to
> support a variant of DirectX 10 as the main Mesa/Gallium API on all
> platforms, instead of OpenGL.

The apparent benefit would seem to be greater compatibility with
software written for Windows - but that benefit is unlikely to remain,
as MS basically changes their interfaces drastically with each major
revision.

If Mesa just tried to stick with the older interface, the advantage of
using it would largely evaporate (as software makers abandoned it and
their support bit-rotted), but if Mesa tried to adopt each new version,
it would end up trailing behind on an interface completely controlled
by Microsoft, and that's _not_ a good place to be.

It's rather fortunate to have a portable and still widely used
interface such as OpenGL, and I think the Mesa project should try its
best to encourage, not discourage, wider use of it.

-Miles

-- 
Alliance, n. In international politics, the union of two thieves who
have their hands so deeply inserted in each other's pockets that they
cannot separately plunder a third.
From: Xavier B. <xav...@fr...> - 2010-03-31 15:44:52
On Wed, 2010-03-31 at 13:29 +0900, Miles Bader wrote:
> Luca Barbieri <luc...@gm...> writes:
> > In fact, given the Gallium architecture, it may even make sense to
> > support a variant of DirectX 10 as the main Mesa/Gallium API on all
> > platforms, instead of OpenGL.
>
> The apparent benefit would seem to be greater compatibility with
> software written for Windows - but that benefit is unlikely to remain,
> as MS basically changes their interfaces drastically with each major
> revision.

WINE can deal with that. The real showstopper is that WINE has to also
work on Mac OS X and Linux + the NVIDIA blob, where Gallium is
unavailable.

	Xav
From: Luca B. <luc...@gm...> - 2010-03-31 17:58:51
> WINE can deal with that. The real showstopper is that WINE has to also
> work on Mac OS X and Linux + the NVIDIA blob, where Gallium is
> unavailable.

We could actually consider making a Gallium driver that uses OpenGL to
do rendering.

If the app uses DirectX 10, this may not significantly degrade
performance, and should instead appreciably increase it if a Gallium
driver is available.

On the other hand, for DirectX 9 apps, this could decrease performance
significantly (because DirectX 9 has immediate mode and doesn't require
CSOs).
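For a sense of what that would mean: a GL-backed pipe driver would have
to implement Gallium's CSO hooks by replaying state into scattered GL
calls at bind time, roughly like the sketch below (entirely
hypothetical - no such driver exists - and the blend-state fields are
only approximate):

    #include <GL/gl.h>
    #include "pipe/p_context.h"
    #include "pipe/p_state.h"
    #include "util/u_memory.h"

    static void *
    glpipe_create_blend_state( struct pipe_context *pipe,
                               const struct pipe_blend_state *templ )
    {
       /* A CSO is just an immutable snapshot of the template... */
       struct pipe_blend_state *cso = MALLOC_STRUCT( pipe_blend_state );
       *cso = *templ;
       return cso;
    }

    static void
    glpipe_bind_blend_state( struct pipe_context *pipe, void *state )
    {
       const struct pipe_blend_state *blend = state;

       /* ...replayed as individual GL state changes at bind time. */
       if (blend->rt[0].blend_enable)
          glEnable( GL_BLEND );
       else
          glDisable( GL_BLEND );
    }

This is cheap for DX10-style apps that create state objects up front,
but a DX9-style immediate-mode stream would hit the create/bind path
constantly, which is the performance worry above.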
From: Miles B. <mi...@gn...> - 2010-04-01 01:46:11
On Thu, Apr 1, 2010 at 12:28 AM, Xavier Bestel <xav...@fr...> wrote:
> On Wed, 2010-03-31 at 13:29 +0900, Miles Bader wrote:
>> Luca Barbieri <luc...@gm...> writes:
>> > In fact, given the Gallium architecture, it may even make sense to
>> > support a variant of DirectX 10 as the main Mesa/Gallium API on all
>> > platforms, instead of OpenGL.
>>
>> The apparent benefit would seem to be greater compatibility with
>> software written for Windows - but that benefit is unlikely to remain,
>> as MS basically changes their interfaces drastically with each major
>> revision.
>
> WINE can deal with that. The real showstopper is that WINE has to also
> work on Mac OS X and Linux + the NVIDIA blob, where Gallium is
> unavailable.

"Wine can deal with that", how?

Once MS changes interfaces, then there's _no advantage_ to using DX10
internally, regardless of what WINE does, and one might as well use
OpenGL. Wine doesn't change that.

Given that OpenGL has other advantages (portable, publicly accessible
standardization process, etc.), adopting DX10 would seem pointless and
misguided.

-Miles

-- 
Do not taunt Happy Fun Ball.
From: Luca B. <luc...@gm...> - 2010-04-01 08:32:24
> Once MS changes interfaces, then there's _no advantage_ to using DX10
> internally, regardless of what WINE does, and one might as well use
> OpenGL. Wine doesn't change that.

[resent to ML, inadvertently replied only to Miles]

Note that my proposal was not to use DirectX 10 internally, but rather
to expose DirectX 10 and promote it initially as an API to test Gallium
and later as the preferred Linux graphics API instead of OpenGL, for
the technical reason that a DirectX 10 over Gallium implementation
carries much less performance overhead than an OpenGL implementation
and is much simpler, due to the superior design of DirectX 10.

Using an extended version of DirectX 10 internally could also be an
option, but I don't think it's worth doing that right now and likely
it's not worth doing at all.

Also note that Microsoft does not use DirectX 10 or 11 internally
either, but rather uses the "DirectX 10 DDI" or "DirectX 10 Device
Driver Interface", which is also publicly documented.

The last time Microsoft made an incompatible interface change (DX10),
it was to move away from fixed-pipeline support with scattered state
towards a shader-only pipeline with constant state objects.

Exactly the same change was achieved by the move from the classic Mesa
architecture to the Gallium architecture: you could think of the move
to Gallium as switching to something like DX10 internally, done purely
for technical reasons, partially the same ones that prompted Microsoft
to make the transition.

Actually, while this is not generally explicitly stated by Gallium
designers, Gallium itself is generally evolving towards being closer to
DirectX 10. The biggest deviations are additional features needed to
support OpenGL features not included in DirectX 10.

For instance, looking at recent changes:
- Vertex element CSOs, recently added, are equivalent to DX10 input
  layouts
- Sampler views, also recently added, are equivalent to DX10 shader
  resource views
- Doing transfers per-context (recent work by Keith Whitwell) is what
  DX10 does
- Having a "resource" concept (also recent work by Keith Whitwell) is
  what DX10 does
- Gallium format values were changed from self-describing to a set of
  stock values like DX10
- Gallium format names were later changed and made identical to DX10
  ones (except that the former start with PIPE_FORMAT_ and the latter
  with DXGI_FORMAT_, and the enumerated values are different)
- It has been decided to follow the DX9 SM3/DX10 model for shader
  semantic linkage as opposed to the OpenGL one

I recently systematically compared Gallium and DirectX 10, and found
them to be mostly equivalent; the exceptions were usually either
additional features Gallium had for the sake of OpenGL, or Gallium
misdesigns that are being changed or looked into.

This is likely not for the sake of imitating Microsoft, but just
because they made a good design, having decided to redesign the whole
API from scratch when making DirectX 10. It's also probably because
VMware is apparently funding DirectX 10 support over Gallium, which
obviously makes all discrepancies evident to people working on that;
and since those discrepancies generally exist because DirectX 10 is
better, this leads those people to improve the Gallium design taking
inspiration from DirectX 10.

Presumably, if Microsoft were to change interfaces incompatibly again
(notice that DX11 is a compatible change), Mesa would probably benefit
from introducing a further abstraction layer similar to the new
Microsoft interface and having a Gallium->NewLayer module, since such a
change would most likely be the result of a paradigm shift in graphics
hardware itself (e.g. a switch to fully software-based GPUs like
Larrabee).

Also, unless Microsoft holds patents on DirectX 10 (which would be a
showstopper, even though Gallium may violate them anyway), I don't see
any difference between having to implement DirectX 10 and having to
implement OpenGL, or any difference in the "openness" of the APIs.
It is indeed possible to participate in the ARB standardization
process, and some Mesa contributors/leaders do, but I'm not sure
whether this is particularly advantageous: decisions that work well for
Microsoft and Windows are also likely to work well for Linux/Mesa,
since the hardware is the same and the software works mostly
equivalently.

And should some decisions not work well, it is technically trivial to
provide an alternative API.
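To make the first equivalence above concrete, here is a Gallium vertex
element next to a DX10 input-layout element. The two snippets obviously
belong to different codebases, and the field names are paraphrased from
memory of the two APIs, so treat them as approximate:

    /* Gallium, roughly as of early 2010: */
    #include "pipe/p_state.h"

    static const struct pipe_vertex_element velem = {
       .src_offset = 0,              /* byte offset within the vertex */
       .instance_divisor = 0,        /* 0 = per-vertex data */
       .vertex_buffer_index = 0,     /* which bound vertex buffer */
       .src_format = PIPE_FORMAT_R32G32B32A32_FLOAT,
    };

    /* The DX10 counterpart: */
    #include <d3d10.h>

    static const D3D10_INPUT_ELEMENT_DESC layout = {
       "POSITION", 0,                   /* semantic name + index */
       DXGI_FORMAT_R32G32B32A32_FLOAT,  /* same format, DXGI name */
       0,                               /* input slot (vertex buffer) */
       0,                               /* aligned byte offset */
       D3D10_INPUT_PER_VERTEX_DATA,     /* per-vertex data */
       0,                               /* instance data step rate */
    };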
From: Corbin S. <mos...@gm...> - 2010-04-01 15:04:01
On Thu, Apr 1, 2010 at 1:32 AM, Luca Barbieri <luc...@gm...> wrote:
>> Once MS changes interfaces, then there's _no advantage_ to using DX10
>> internally, regardless of what WINE does, and one might as well use
>> OpenGL. Wine doesn't change that.
>
> [resent to ML, inadvertently replied only to Miles]
>
> Note that my proposal was not to use DirectX 10 internally, but rather
> to expose DirectX 10 and promote it initially as an API to test
> Gallium and later as the preferred Linux graphics API instead of
> OpenGL, for the technical reason that a DirectX 10 over Gallium
> implementation carries much less performance overhead than an OpenGL
> implementation and is much simpler, due to the superior design of
> DirectX 10.
>
> Using an extended version of DirectX 10 internally could also be an
> option, but I don't think it's worth doing that right now and likely
> it's not worth doing at all.
>
> Also note that Microsoft does not use DirectX 10 or 11 internally
> either, but rather uses the "DirectX 10 DDI" or "DirectX 10 Device
> Driver Interface", which is also publicly documented.
>
> The last time Microsoft made an incompatible interface change (DX10),
> it was to move away from fixed-pipeline support with scattered state
> towards a shader-only pipeline with constant state objects.
>
> Exactly the same change was achieved by the move from the classic Mesa
> architecture to the Gallium architecture: you could think of the move
> to Gallium as switching to something like DX10 internally, done purely
> for technical reasons, partially the same ones that prompted Microsoft
> to make the transition.
>
> Actually, while this is not generally explicitly stated by Gallium
> designers, Gallium itself is generally evolving towards being closer
> to DirectX 10. The biggest deviations are additional features needed
> to support OpenGL features not included in DirectX 10.
>
> For instance, looking at recent changes:
> - Vertex element CSOs, recently added, are equivalent to DX10 input
>   layouts
> - Sampler views, also recently added, are equivalent to DX10 shader
>   resource views
> - Doing transfers per-context (recent work by Keith Whitwell) is what
>   DX10 does
> - Having a "resource" concept (also recent work by Keith Whitwell) is
>   what DX10 does
> - Gallium format values were changed from self-describing to a set of
>   stock values like DX10
> - Gallium format names were later changed and made identical to DX10
>   ones (except that the former start with PIPE_FORMAT_ and the latter
>   with DXGI_FORMAT_, and the enumerated values are different)
> - It has been decided to follow the DX9 SM3/DX10 model for shader
>   semantic linkage as opposed to the OpenGL one
>
> I recently systematically compared Gallium and DirectX 10, and found
> them to be mostly equivalent; the exceptions were usually either
> additional features Gallium had for the sake of OpenGL, or Gallium
> misdesigns that are being changed or looked into.
>
> This is likely not for the sake of imitating Microsoft, but just
> because they made a good design, having decided to redesign the whole
> API from scratch when making DirectX 10. It's also probably because
> VMware is apparently funding DirectX 10 support over Gallium, which
> obviously makes all discrepancies evident to people working on that;
> and since those discrepancies generally exist because DirectX 10 is
> better, this leads those people to improve the Gallium design taking
> inspiration from DirectX 10.
>
> Presumably, if Microsoft were to change interfaces incompatibly again
> (notice that DX11 is a compatible change), Mesa would probably benefit
> from introducing a further abstraction layer similar to the new
> Microsoft interface and having a Gallium->NewLayer module, since such
> a change would most likely be the result of a paradigm shift in
> graphics hardware itself (e.g. a switch to fully software-based GPUs
> like Larrabee).
>
> Also, unless Microsoft holds patents on DirectX 10 (which would be a
> showstopper, even though Gallium may violate them anyway), I don't see
> any difference between having to implement DirectX 10 and having to
> implement OpenGL, or any difference in the "openness" of the APIs.
> It is indeed possible to participate in the ARB standardization
> process, and some Mesa contributors/leaders do, but I'm not sure
> whether this is particularly advantageous: decisions that work well
> for Microsoft and Windows are also likely to work well for Linux/Mesa,
> since the hardware is the same and the software works mostly
> equivalently.
>
> And should some decisions not work well, it is technically trivial to
> provide an alternative API.

Is it really so surprising that an API designed to expose a generic,
programmable, shaderful pipeline (Gallium) fits well with multiple APIs
based on the same concept (D3D10, OGL 2.x)?

-- 
When the facts change, I change my mind. What do you do, sir? ~ Keynes

Corbin Simpson
<Mos...@gm...>