From: Fredrik T. <fr...@do...> - 2005-04-29 17:55:45
On Thu, 2005-04-28 at 10:58 -0700, yitzhak bar geva wrote:
> Here's a link to an article about a similar,
> proprietary project done here in Israel for MS
> Windows:
>
> http://www.haaretz.com/hasen/spages/569358.html
>
> I spoke to the developer today. What is interesting in
> his work is that he uses Nvidia dual-port boards and
> gets a separate session on each one.
> The Nvidia driver documentation has an XConfig layout
> (near the end) which describes such a scenario. I had
> asked Aivils about this and he said it won't work with
> Ribui (Ruby). I never tried it.
> What are the chances that we can get separate sessions
> on each port of an Nvidia dual-port board?
> Does it require work on the part of the Xorg
> developers?
> What are the technicalities involved?

The problem is that the X server handles one (or more) *cards* at a time. Two X servers cannot share a single physical card.

The way to solve this would be to write a kernel framebuffer driver that presents one framebuffer device to userspace for each head, and then run two X servers that use these framebuffer devices instead of controlling the hardware directly (as they should be doing either way).

The problem with this, as far as I know, is that either the framebuffer device interface isn't accelerated, or there is no X server capable of using the acceleration. I also don't think that there is a framebuffer driver that does this for nVidia cards, though writing such a driver would probably be the smallest subproblem involved.

At least, this is my take on the situation.

Fredrik Tolf
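[Editor's note: the two-server setup described above could be sketched roughly as follows, assuming the hypothetical per-head framebuffer driver already exposed device nodes `/dev/fb0` and `/dev/fb1`. The file names, device paths, and screen layout here are illustrative, not taken from any existing driver; the `fbdev` X driver and its `fbdev` option do exist, but are unaccelerated, which is exactly the limitation noted above.]

```
# xorg-head0.conf -- first X server, bound to the first head's framebuffer
Section "Device"
    Identifier "Head0"
    Driver     "fbdev"             # unaccelerated framebuffer X driver
    Option     "fbdev" "/dev/fb0"  # hypothetical per-head device node
EndSection

# xorg-head1.conf -- identical except for the device node
Section "Device"
    Identifier "Head1"
    Driver     "fbdev"
    Option     "fbdev" "/dev/fb1"
EndSection
```

Each server would then be started on its own display and virtual terminal, e.g. `X :0 vt7 -config xorg-head0.conf` and `X :1 vt8 -config xorg-head1.conf`, giving one independent session per head.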