Re: [PyOpenGL-Users] Tracking down an invalid operation
From: Derakon <de...@gm...> - 2011-06-02 21:35:42
On Thu, Jun 2, 2011 at 2:23 PM, Ian Mallett <geo...@gm...> wrote:
> On Thu, Jun 2, 2011 at 1:35 PM, Derakon <de...@gm...> wrote:
>>
>> I have an OpenGL-related crash that's giving me fits trying to trace
>> it. This is gonna take a bit to describe, unfortunately.
>>
>> We have a computerized microscope with four cameras, each of which
>> images a different wavelength band (color) of the sample as a 512x512
>> pixel array. These are displayed as OpenGL textures by some C++ code.
>> I've made an overlaid view as a separate window, which runs in Python --
>> our C++ code is old and crufty and every time I touch it I'm worried
>> it'll collapse into dust, but the Python "half" (more like 80%) of the
>> program is more up-to-date. Basically, each time a camera receives a
>> new image, an event is generated on the C++ side, which is picked up by
>> the Python side. The Python side requests the image data from the C++
>> side, converts it into its own texture, and displays it. Ideally I'd
>> just re-use the same textures the normal camera displays use, but
>> they're in separate OpenGL contexts so, as far as I'm aware, that's
>> not possible.
>
> Hi,
>
> Generally, having multiple contexts in OpenGL is a very very very bad
> idea. If you're doing a readback of the texture data from one context,
> and then trying to use that data in another, it may not work properly,
> because OpenGL works asynchronously.

I should have clarified: I'm not reading the texture data out; I'm
reading the array of brightness values that was used to generate the
texture. Basically the camera sends us a bunch of bytes, then one of our
viewers reads from those bytes to generate a texture. Now I'm adding a
second viewer to read from the same set of bytes.

As for multiple contexts, I may have mis-stated, or maybe I am in fact
using multiple contexts; I'm not that boned up on OpenGL.
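For concreteness, the bytes-to-texture step each viewer does looks
roughly like this (a minimal sketch; the function and parameter names are
mine, and I'm assuming 16-bit mono frames, which is typical for
scientific cameras -- adjust the dtype if yours differ):

```python
# Sketch: scale one raw camera frame into data glTexImage2D can accept.
# (Hypothetical helper; assumes 512x512 uint16 mono frames.)
import numpy as np

def frame_to_texture_bytes(raw, black=0, white=None):
    """Scale a 512x512 uint16 frame to uint8 luminance for glTexImage2D."""
    frame = np.frombuffer(raw, dtype=np.uint16).reshape(512, 512)
    if white is None:
        # Default the white point to the brightest pixel in this frame.
        white = int(frame.max()) or 1
    scaled = np.clip((frame.astype(np.float32) - black) / (white - black),
                     0.0, 1.0)
    return (scaled * 255).astype(np.uint8)
```

Each viewer would then hand the result to
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 512, 512, 0, GL_LUMINANCE,
GL_UNSIGNED_BYTE, data) with its own context current; both viewers can
run this independently over the same shared byte buffer.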
I'd assumed each non-overlaid camera display was its own context, since
each one has the same texture ID of 1, and I'd assume that within a
single context no two textures could have the same ID. Is this
inaccurate? This app has many wxGLCanvases all over the place and I've
never run into trouble before; this window should work just like the
other canvases in the app.

According to this:

http://wiki.wxwidgets.org/WxGLCanvas#Sharing_wxGLCanvas_context

it should be possible to share the OpenGL context across canvases, but
we aren't doing that right now. Of course, none of the canvases are
trying to share resources either.

It sounds like you're suggesting that in the middle of my viewer's paint
logic, another context is barging in and confusing OpenGL. I take that
to mean that when I make an OpenGL API call, my context isn't implicitly
passed along? But then why wouldn't my other canvases ever get screwed
up?

Assuming that is the problem, theoretically I could fix it by creating
one canvas, getting its context, storing it globally, and using it
whenever I create any other canvases. Does that sound about right?

-Chris
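P.S. In code, the "store one context globally" idea would look something
like this (a sketch, not tested against our app; the helper names are
mine, and the make_context hook exists only so the reuse logic can be
exercised without a display):

```python
# Sketch of the single-shared-context approach (hypothetical helper;
# wx.glcanvas.GLContext(canvas) is the classic-wxPython constructor).
_shared_context = None

def get_shared_context(canvas, make_context=None):
    """Create the app's one GLContext on first call, reuse it after.

    Every wxGLCanvas then calls canvas.SetCurrent(get_shared_context(canvas))
    in its paint handler, so a texture ID names the same texture in every
    canvas. make_context defaults to the real wx.glcanvas.GLContext,
    which needs a running wx.App and a display.
    """
    global _shared_context
    if _shared_context is None:
        if make_context is None:
            from wx import glcanvas  # deferred: needs wxPython + display
            make_context = glcanvas.GLContext
        _shared_context = make_context(canvas)
    return _shared_context
```

The first canvas created pays the cost of building the context; every
later canvas just binds it, which is what the wiki page above describes.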