Re: [Plib-users] Readable user docs for PUI?
From: James F. <fr...@cs...> - 2008-05-08 22:40:24
On Thu, 8 May 2008, Steve Baker wrote:

> James Frye wrote:
>> Hummm... I had been thinking I'd have to re-write most of the drawing
>> code to do 3D drawing instead of 2D - e.g. calling glVertex3f instead
>> of glVertex2i - then changing setPosition & setSize to take 3D coords,
>> adding a setRotation, etc.
>
> Oh - no, no, no! Certainly not! PUI uses glVertex2f - but all that
> does is generate a full 3D vertex with Z=0, W=1. The subsequent
> rotate/translate/perspective transforms operate exactly the same way
> they do for 3D vertices. You CERTAINLY won't have to do anything like
> that if all you want to do is to simulate a 2D GUI on a virtual 2D
> display placed somewhere out in the 3D world.

Well, I do want more than that: the ability to use the functions in any
way needed. It might be a display on a simulated device somewhere in the
world, or simulated buttons on a device the user carries with him, or
"magic" system menus that the user can pull out of thin air, as it were.

So it seems that PUI somehow has to be altered/extended so that each menu
carries around its own 3D transform, and then puDisplay (or code beneath
it) contains a loop that sets the transform for each menu. And of course
the transform can be dynamically modified, if for instance the menu is
attached to a moving object...

> Once you have the right transform on the stack, PUI will render in full
> 3D just fine.
>
> One thing you MIGHT have problems with is that PUI doesn't clip its
> underlying primitives to the screen - it lets OpenGL do that. You'll
> need to set up some user clip planes to fix that properly - but you
> could consider cheating and building your virtual monitor with a nice
> chunky frame that Z-buffers out the PUI widgets that hang off the edges
> of the screen a bit.

May not be a problem, since in a CAVE environment there aren't any edges...
> Actually - the super-modern way to do this would be to leave PUI 100%
> alone and simply have it render to an RGB texture instead of to the
> screen. Set up a render-to-texture rendering target - run PUI exactly
> as you would normally - then apply that texture to your virtual computer
> screen. That would completely take care of the clipping problem and
> allow you to use PUI's standard puDisplay() function. Then you can
> apply that texture to any polygon(s) in your 3D scene and be utterly
> assured that everything will play nicely. If you have multiple PUI
> screens in your virtual world then you could also gain some performance
> by only re-rendering the texture for the screen that the user is
> interacting with. That would be really cool actually - you could easily
> have PUI displayed on translucent screens (like in the movie "Minority
> Report") or whatever.
>
> If you are familiar with the render-to-texture approach then you should
> be able to get this working with just a couple of hours' work and no
> changes whatever to the PUI source. This would be by far the simplest
> way to do it.

I'm not really familiar with textures at all. What I've done with OpenGL
has always been things like data display, where you draw objects/surfaces
and color them according to data values.
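For reference, a minimal sketch of the render-to-texture setup Steve describes, using the standard OpenGL framebuffer-object API (GL 3.0 / ARB_framebuffer_object). This is not runnable on its own: it assumes a GL context is already current, omits error handling, and the names (fbo, guiTex, the 512x512 size, and the two helper functions) are mine, not PUI's. Only puDisplay() comes from PLIB:

```cpp
// Assumes a GL loader providing the FBO entry points (e.g. GLEW)
// and PLIB's PUI header for puDisplay().
#include <GL/glew.h>
#include <plib/pu.h>

static GLuint fbo = 0, guiTex = 0;
static const int TEX_W = 512, TEX_H = 512;

// One-time setup: a colour texture attached to a framebuffer object.
void initGuiTarget() {
    glGenTextures(1, &guiTex);
    glBindTexture(GL_TEXTURE_2D, guiTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, TEX_W, TEX_H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, guiTex, 0);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

// Per frame (or only when this GUI is being interacted with):
// redirect rendering into the texture, run PUI untouched, restore.
void renderGuiToTexture() {
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, TEX_W, TEX_H);
    glClear(GL_COLOR_BUFFER_BIT);
    puDisplay();                           // PUI draws exactly as usual
    glBindFramebuffer(GL_FRAMEBUFFER, 0);  // back to the real screen
}
```

guiTex can then be bound and mapped onto any polygon in the 3D scene - the virtual monitor, a translucent panel, or anything else - and clipping comes for free, since PUI only ever draws into the texture's bounds.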