From: Matt S. <ma...@ki...> - 2003-06-04 21:13:11
> -----Original Message-----
> From: dri...@li... [mailto:dri...@li...] On Behalf Of Ian Romanick
> Sent: Wednesday, June 04, 2003 9:21 PM
> To: DRI developer's list
> Subject: [Dri-devel] Re: [Mesa3d-dev] RE: Not confused so much anymore,
> but.. (pointers to colour buffers again..)
>
> Matt Sealey wrote:
>
> > If the application developer requests that the GL view be 10 pixels
> > from the bottom, 50 pixels from the left, and 300 pixels wide and
> > high, surely those are the coordinates we're doing the viewport
> > transformation into?
>
> If I have a window that is 640x480, I can set my viewport to
> (0,0)-(319,239), do some drawing, set it to (320,0)-(639,239), do some
> drawing, etc. to get several views within a window.

Sure thing, but those values DIRECTLY correspond to the pixels in the
window, right?

Forget resizing for a second and assume the window has a fixed width and
height (for instance, a full-screen application). By default OpenGL is
supposed to set the viewport to (0,0,winWidth,winHeight), which means
that if you open a window and bind a context you get a "full" view. But
nothing stops you shaving 20 pixels off the edges, or pulling the
glViewport() up by 60 pixels to reserve space for your own rendering.
Consider a game where the status bar isn't rendered by OpenGL at all,
but drawn with a standard OS blitting routine.

I *am* right in thinking that the viewport values are intended to be
"pixel perfect"?

> That's what a CAD program that gives the user multiple views of the
> scene does. Unless the app sets clip planes when it changes the
> viewport, drawing that goes outside the viewport will still be drawn
> (but drawing that goes outside the window will not). The center of
> projection will (modulo changing other settings) be the center of
> whatever the current viewport is.
>
> Does that clarify things?

That's exactly what I thought it was. But..

> Like various people have said before, it's just a common coincidence
> that apps call glViewport when the window size changes.
> If they didn't,
> the center of projection would be "wrong" for the new window
> dimensions.

I would assume that any CAD application, or a glut reshape callback,
having no good access to the internals of the OpenGL implementation,
simply calls glViewport() in this manner once it detects (either through
glut or a custom event loop) that the window has been resized.

The difference really is that DRI *knows* when a resize happens (I still
don't understand how it manages to respect the viewport settings if it
disregards them and simply uses the window geometry?), but in my case
the application is expected to inform the OpenGL subsystem of such
changes. I can easily inform OpenGL via some custom function.

So surely the application sets those viewport values relative to what it
*thinks* the window width and height are anyway. For instance, in a
glutReshape() event:

void reshape(int w, int h)
{
    /* for our bottom-right-hand CAD view; OpenGL window coordinates
       have their origin at the bottom left, so the bottom-right
       quadrant starts at y = 0 */
    glViewport(w/2, 0, w/2, h/2);
}

.. and therefore the physical location and geometry inside the window
the context is bound to correspond directly to the values passed into
glViewport().

Let us be very naive and trusting, and assume that whenever we call
glViewport() as an application, we always know the current window
dimensions. I would class passing in bogus values as a quirk of the
implementation - no warnings and no real errors, just bad results (like
Motorola's AltiVec unit, which implements no alignment exceptions; it
simply gives you bad results if you forget to permute the data for it).

Would it work just fine if we used the values passed in? If this is NOT
true, I must say I am at a loss as to what to do next.

--
Matt Sealey <ma...@ki...>