Thanks for your enthusiastic reply. I have been working on the basic building blocks of the framework so that I could come up with a concrete idea. Google SketchUp is CAD software that builds 3D structures from 2D shapes by pulling them out, and that type of gesture will be very useful here. Similarly, panning, rotating and moving through a 3D structure will be very easy with MT gestures, just as we have seen in the MT-based Google Earth application. A lot of work is being done in this field and it keeps coming up in the news, notably by Autodesk (http://labs.autodesk.com/technologies/multitouch/).
Here is the proposed plan:
The project envisages the development of a package that could be used in conjunction with Tbeta/Touchlib and BRL-CAD to build a multi-touch CAD application for interacting with 3D models. Tbeta/Touchlib (from NUI Group) will encapsulate the hardware of the multi-touch setup and will send touch-event data in the TUIO protocol over TCP/UDP packets. The functionality provided by the package starts at this step.
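To make the data flow concrete, here is a minimal sketch of what a received TUIO cursor event might look like once decoded. The struct and function names are my own assumptions for illustration; an actual receiver would decode these fields from OSC bundles using a library such as OSCPack rather than defining them by hand.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical minimal representation of a decoded TUIO "2Dcur" cursor
// event. TUIO reports positions normalised to [0,1]; a real receiver
// would extract these fields from OSC packets via OSCPack.
struct TouchEvent {
    int   session_id;  // TUIO session id, stable while the finger stays down
    float x, y;        // normalised position in [0,1]
};

struct Pixel { int x, y; };

// Map a normalised TUIO position onto a concrete display resolution.
Pixel to_screen(const TouchEvent& e, int width, int height) {
    return Pixel{ static_cast<int>(std::lround(e.x * (width  - 1))),
                  static_cast<int>(std::lround(e.y * (height - 1))) };
}
```

The normalised coordinates are what make the package hardware-independent: the same event stream works for any surface size.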
In the implementation, the touch events will be received by a particular application, relayed by an MPX-type server, and then patterned according to the scene, which is divided into regions of various sizes so that only the region in which a touch event occurs needs to be updated. BRL-CAD's powerful collision detection will be used for this patterning.
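As an illustration of the patterning idea, the sketch below maps a touch point to a region index using a uniform grid. This is only a stand-in with assumed names: the actual design would use BRL-CAD's collision detection against real scene geometry, and the regions would have varying sizes rather than a fixed grid.

```cpp
#include <cassert>

// Stand-in for the "patterning" step: the scene is split into a grid of
// regions, and each touch event is mapped to the region it falls in so
// that only that region needs updating. A real implementation would
// query BRL-CAD's collision detection instead of this uniform grid.
int region_of(float x, float y, int cols, int rows) {
    // x, y are normalised to [0,1]; clamp so edge touches stay in range.
    int cx = static_cast<int>(x * cols); if (cx >= cols) cx = cols - 1;
    int cy = static_cast<int>(y * rows); if (cy >= rows) cy = rows - 1;
    return cy * cols + cx;  // row-major region index
}
```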
The MPX-based server thus developed will let multiple people collaborate on the same design at the same time, whether at one or more MT surfaces or with multiple-mouse setups, which I hope will be a cool thing for CAD design systems.
Based on the corresponding region, the touch-event data will be clubbed together and sent to the gesture-recognition module in a particular format; the module will then, as per the application's requirements, pass commands to the BRL-CAD scene manager to update the required region(s).
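The clubbing step above could be as simple as batching events per region before handing them to the gesture recognizer. The sketch below shows one possible shape for that grouping; the `Event` fields and function name are assumptions, not the final wire format.

```cpp
#include <cassert>
#include <map>
#include <vector>

// Assumed per-event record after the patterning step has attached a
// region index to each raw touch event.
struct Event { int region; int session_id; float x, y; };

// Group ("club") events into per-region batches so the gesture
// recognizer can process each scene region independently.
std::map<int, std::vector<Event>> club_by_region(const std::vector<Event>& events) {
    std::map<int, std::vector<Event>> batches;
    for (const Event& e : events)
        batches[e.region].push_back(e);
    return batches;
}
```

Keeping the batches keyed by region is what allows the scene manager to redraw only the regions that actually received input.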
The whole work can be summarized in the following sequential stages:
1. Design and testing of an ANN-based gesture recognizer, and its integration with OSCPack and the patterning step.
2. Integration of the gesture recognition with the BRL-CAD scene manager.
3. Development of some basic MT primitives for BRL-CAD that can be easily used for model development.
4. Gesture recognition for designing, editing and model building from the MT primitives.
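For a taste of what the gesture-recognition stage must detect, here is a heuristic stand-in for one gesture: a two-finger pinch classified from the change in distance between contact points across two frames. The proposed design uses an ANN for this; the threshold rule, names and default value below are only assumed illustration.

```cpp
#include <cassert>
#include <cmath>

// Possible gesture outcomes for a two-finger pinch check.
enum class Gesture { None, ZoomIn, ZoomOut };

// Distance between two contact points in normalised coordinates.
float dist(float x0, float y0, float x1, float y1) {
    return std::hypot(x1 - x0, y1 - y0);
}

// Classify a pinch from finger separation before and after a frame.
// The threshold filters out jitter; 0.02 is an arbitrary assumption.
Gesture classify_pinch(float before, float after, float threshold = 0.02f) {
    if (after - before > threshold) return Gesture::ZoomIn;   // fingers spread apart
    if (before - after > threshold) return Gesture::ZoomOut;  // fingers moved together
    return Gesture::None;
}
```

An ANN-based recognizer would replace this hand-written rule with a classifier trained on touch trajectories, but the input (per-frame contact geometry) and output (a discrete gesture label) would be the same.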
Eagerly waiting for your comments and suggestions!

On Thu, Mar 19, 2009 at 11:42 PM, Christopher Sean Morrison <firstname.lastname@example.org> wrote:
Thanks for the message and interest in working with BRL-CAD! Your idea of working on a touch screen interface for CAD is interesting and something several of us have been following the research literature on for many years now. It's a very enticing concept, but I have several questions about what you'd foresee being the near-term gains and what you have to say about some of the criticisms that MTIs have received.
You do identify one potential problem there. MT is in its infancy, even with efforts like Surface and various phones that now support MT. What's not been shown yet, though, is the efficacy of MT for non-casual interactions.
>In this regard I feel that we can get a head start in developing
>multi-touch capabilities in CAD software.
>I do feel that multi-touch (MT) is still in its infancy, but it has
>started to show its effect through the phenomenal success of the iPhone, and
>various big-shot companies are pouring huge money into developing in-house
>multi-touch software. Hence I would like to propose starting with the
>development of an MT-based user interface for CAD systems in open source.
Intuitive and friendly, but not shown to necessarily be more (or less) productive either as far as I know for "real work". It makes a great demo and probably even makes for a great visualization interface, but would you want to use it for actual architecting or designing? Maybe. Therein, though, is a dilemma in terms of defining a project that can be scoped with specific goals. :-) How do you see that working?
>It is a much more intuitive and friendly way for an architect or a designer to
>develop and show models with both hands by directly touching the models,
>instead of using the mouse as a pointing device, which is more abstract.
Similarly, the various multicontact gestures for 2D imagery are easy enough, but how would you go about extending that to 3D? Is there any research you'd be using to back up a given direction?
As is often the case, the least intuitive and least user-friendly interface is often one of the most efficient interfaces. Ignoring productivity, though (there are plenty of folks that use operating systems that are entirely less efficient because they think they're easier or more familiar), these phases you speak of would need to tie in directly with BRL-CAD.
>In the first phase I would like to develop applications and libraries to help
>in interacting with the already built models in an interactive way on a
>In the second phase, I will work on how to develop a user interface so that
>a designer can very easily develop models while also having a command line
>running at the same time to give fast inputs. I agree that
>currently it is much more convenient and faster to develop a model with the
>command line. It will start by taking inspiration from Google's SketchUp,
>but will be much more intuitive and easy to use, and I hope it will become quite
>different from that.
You need to propose how you'd enhance BRL-CAD, not how you'd make something that could then be used to enhance BRL-CAD. How does your project fit in with our tools and services specifically? If it's some general framework that you want to work on that could just as easily be completely independent of BRL-CAD, then that would be more like an independent study project that is outside the realm of GSoC.
Glad to hear that. Paweł is a good guy with a lot of good insights to share.
>own an MT surface. The project has been highly popular and is available for
>both Windows and Linux. This project was my GSoC project last summer, which I
>successfully completed under the mentorship of NUI Group (Pawel Solyga).
Awesome. The more you can tie your work into what we already have and do, the better off your proposal will come across as being directly beneficial and applicable to our community at large.
>I have working experience with MGED and Archer and have already
>started looking into the code base and the libraries of BRL-CAD.
In any regard, thanks again for the interest and I look forward to hearing more about your project and your proposal.
Ashish Kumar Rai
Electronics Engineering Department,