From: Eric B. <eb...@us...> - 2000-07-14 18:13:42
Hey guys... I was thinking a little more (run for cover :-) about what services, exactly, we want 3Dsia to provide. It seems to me that 3Dsia should, for all practical purposes, be like X Windows, or more specifically like GTK+, Qt, or Motif. X provides the basic graphics and network operations (kernel-level development), which are used by a higher-level library (user-level development -- Motif, etc.), which in turn is used to build applications conforming to that standard API. So don't we essentially need to build a toolkit on top of X that provides a good 3D widget set, which can be the basis for developing 3Dsia applications? Not that I'm saying we necessarily need to build everything from the ground up on X. There was a thread on Slashdot (http://slashdot.org/articles/00/07/14/1218257.shtml) suggesting that some feel X should be replaced altogether, since it's not as robust and lean as it used to be... perhaps that concern can be a basis for some of our work.

In the model of sound and graphics systems I've seen recently, you would have kernel and userspace development. The kernel would be responsible for maintaining the interaction with the odb database, processing kernel requests (update buffer/object content, return buffer/object content to the clients, receive buffer/object content from clients), and resolving conflicting kernel requests (when buffer/object-related requests conflict). User-level development would involve writing applications that make user-level calls to the 3Dsia API. Those calls handle widget-level interaction and issue kernel server requests, which are processed and returned by the kernel, with the output displayed in a 3Dsia/X window.

In addition to the common types of widgets
- buttons
- switches
- text entries
- radio buttons
- text fields
- etc.
we would have to start defining widgets that are 3D specific.
With grouping in mind, we can combine simple 3D widgets into complex 3D widgets. Each widget would be visible, invisible, or semi-visible (during debugging/programming they would all be visible). I think each widget would accept and trigger certain predefined actions and contain certain state values. Examples that come to mind:

- 3D Canvas/Object widget: used to represent 3D objects, similar to a 2D canvas; contains state widgets; an Avatar might be a derivative of this
- Grouping widget: a directory would be a group of document widgets; also usable as a bounding box, sphere, cone, etc.
- State widget: used to represent memory/variable content as an animated 3D object, triggering events when predefined states occur
- Process widget (plugin): represents some form of calculation in progress; very similar to the user-defined widget. Once specific processes are defined, we can detail these more. It would provide input plug(s) and output plug(s); the process code would go here
- Plug: represents input or output to a state widget; used for connecting links; similar to a doorway, or to a file handle; can be the input or output of a process
- Document widget: a book, perhaps
- Doorway widget: represents an entry point to grouped widgets
- Link widget: a tunnel or tube, perhaps; similar to a pipe linking processes; sort of an interprocess mechanism
- Billboard widget: a derivative of a text field/text entry, which we might make visible to each user independent of the direction they are facing
- Avatar widget: made up of a 3D Canvas/Object widget and many state widgets
- User-defined widget/plugin: triggers user-defined events and event handling

Let's try to design a chat app with these widgets. You would create a Billboard widget. The Billboard widget would have an internal state containing the content of the billboard. This state content would have a plug widget. We would connect a process widget to the plug of the state content.
The User Input process widget's input would be standard input; its output would go to the state/Buffer widget. When a user types on the keyboard, the contents would be echoed to the state widget, which would trigger some form of update to the billboard, updating its content. You would also have some timeout process widget that takes the output from the User Input process, resets a timeout value, and passes the output on to the billboard input. If a timeout occurs, the billboard would be cleared (we don't want the content to stay on the billboard forever). The flow would be something like:

  (1) stdinput -(2)-> (3) UserInput -(4)-> (5) BBContent( (6) Buffer ) -(7)-> (8) Billboard

with each arrow representing a link of some kind. Here is some pseudo code:

  ChatApp()
  {
      BBWidget    BBW;
      BBContent   BBC;
      ProcessNode UserInput;
      Buffer      Buf;

      BBC = CreateStateWidget( Buf );   // link a state content with a physical content
      BBC.CreatePlug( input, Buffer );  // define input plug and the type of input
      BBC.CreatePlug( output, Buffer ); // define output plug and the type of output

      BBW = CreateBBWidget();
      BBW.CreatePlug( input, Buffer );  // define input plug and the type of input
      BBW.CreatePlug( output, Buffer ); // define output plug and the type of output

      // link the input of each node to the output of the previous node
      LinkWidgets( stdinput, UserInput.input );
      LinkWidgets( UserInput.output, BBC.input );
      LinkWidgets( BBC.output, BBW.input );

      EndLessLoop();
  }

The BBW would then send its output to the 3Dsia/X Window kernel and be displayed accordingly.

My brain is starting to hurt... I am just tossing these ideas out... I think some of it is already addressed, but I was hoping some comments could be made.

Eric Bresie
eb...@us...