I am running an app in a PyQt QGLWidget. If the window is minimized and the method that renders the object to be displayed is called, I get the error 'OpenGL.error.Error: Attempt to retrieve context when no valid context' (traceback excerpt below) when glColorPointerd() is called. Is there:
* a way to check whether a valid context is set?
* a way to force the context to be set?
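I can't say which check applies to this exact setup, but a common pattern is to guard the paint method so no GL call is issued while the context is invalid (e.g. while minimized). This is a minimal sketch with a mock widget; in real PyQt code the check might be `self.context().isValid()` and/or `self.isVisible()` — those specifics are assumptions, not something from this thread.

```python
# Hedged sketch of the guard pattern: skip rendering when the GL context
# is not valid (e.g. while the window is minimized). MockWidget stands in
# for a QGLWidget; the "minimized invalidates the context" rule below is
# an assumption made so the pattern can be demonstrated without PyQt.
class MockWidget:
    def __init__(self):
        self.minimized = False
        self.frames_rendered = 0

    def context_is_valid(self):
        # Assumption: minimizing loses/invalidates the current context.
        return not self.minimized

    def paint(self):
        if not self.context_is_valid():
            return  # bail out instead of calling glColorPointerd() et al.
        self.frames_rendered += 1

w = MockWidget()
w.paint()            # context valid: renders
w.minimized = True
w.paint()            # context invalid: skipped, no GL error raised
```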
line 410, in wrapperCall
line 66, in __call__
contextdata.setValue( self.constant, pyArgs[self.pointerIndex] )
line 49, in setValue
context = getContext( context )
line 32, in getContext
"""Attempt to retrieve context when no valid context"""
OpenGL.error.Error: Attempt to retrieve context when no valid context
I've noticed some ATI cards flip textures sometimes too... not sure why. It could be that some matrix is picking up a negative sign somehow, or some other random reason. Perhaps something to do with the perspective matrix. Are you using glReadPixels? I've heard that sometimes flips things on some cards.
On Sat, May 9, 2009 at 11:56 AM, Ian Mallett <geometrian@...> wrote:
> Any idea what could possibly be causing these problems? Is there some
> function that flips the framebuffer's texture? What am I missing and what's
> going on?
This method uses a framebuffer object for the render to texture:
-Enable the framebuffer object
-Render the scene and copy it into the texture with glCopyTexImage2D(...)
-Disable the framebuffer object
I'm aware that framebuffer objects can render directly to textures using render targets (glFramebufferTexture2DEXT(...)), but I didn't want to deal with making it a depth texture. In my implementation, the framebuffer object is just used to meaningfully increase the size of the viewport. Could this setup be the problem?
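The enable/render/copy/disable sequence can be sketched as below. The GL namespace is passed in as a parameter so the call order can be exercised without a live context; the names `fbo`, `tex`, `size`, and `draw_scene` are hypothetical, and the sketch assumes the EXT framebuffer entry points from PyOpenGL's `OpenGL.GL` namespace.

```python
# Hedged sketch of the copy-based render-to-texture pass described above.
# 'gl' is any namespace exposing the needed calls (OpenGL.GL in PyOpenGL);
# injecting it lets the sequence be checked with the fake below.
def render_to_texture(gl, fbo, tex, size, draw_scene):
    gl.glBindFramebufferEXT(gl.GL_FRAMEBUFFER_EXT, fbo)  # enable the FBO
    gl.glViewport(0, 0, size, size)                      # enlarged viewport
    draw_scene()                                         # render the scene
    gl.glBindTexture(gl.GL_TEXTURE_2D, tex)
    # Copy the framebuffer into the bound texture (no render target used).
    gl.glCopyTexImage2D(gl.GL_TEXTURE_2D, 0, gl.GL_RGBA, 0, 0, size, size, 0)
    gl.glBindFramebufferEXT(gl.GL_FRAMEBUFFER_EXT, 0)    # disable the FBO

class FakeGL:
    """Records gl* call names so the sequence is inspectable without OpenGL."""
    GL_FRAMEBUFFER_EXT = GL_TEXTURE_2D = GL_RGBA = 0
    def __init__(self):
        self.calls = []
    def __getattr__(self, name):
        if name.startswith("gl"):
            return lambda *a, n=name: self.calls.append(n)
        raise AttributeError(name)

gl = FakeGL()
render_to_texture(gl, fbo=1, tex=2, size=512, draw_scene=lambda: None)
```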
Incidentally, when glCopyTexImage2D(...) is used alone, everything appears to be in shadow.
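One possible source of the flipping (an assumption on my part, not confirmed in this thread): glReadPixels and glCopyTexImage2D treat row 0 as the bottom of the framebuffer, while most image conventions put row 0 at the top. A sketch of compensating by reversing rows (the `pixels` layout, a list of rows, is hypothetical):

```python
def flip_rows(pixels):
    # Reverse row order to convert between bottom-up GL read-backs and
    # top-down image conventions. The equivalent fix can be done by
    # flipping the t texture coordinate when sampling instead.
    return pixels[::-1]

bottom_up = [["bottom row"], ["middle row"], ["top row"]]  # GL order
top_down = flip_rows(bottom_up)                            # image order
```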
In my shader, I hardcoded the shadow functions to avoid card-to-card problems, such as deciding what to do with the edges outside the shadowmap (we already saw that's a problem earlier, René).
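That hardcoded edge rule can be illustrated like this. This is a pure-Python sketch with hypothetical names (the real version would live in the shader); the choice to treat any lookup outside the [0, 1] shadow-map range as fully lit is an assumed convention, made explicit instead of relying on per-card clamp/wrap behavior.

```python
def shadow_sample(shadow_map, u, v, outside=1.0):
    # Hardcoded edge rule: coordinates outside the shadow map count as
    # lit (outside=1.0) rather than whatever the card's wrap mode yields.
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return outside
    h, w = len(shadow_map), len(shadow_map[0])
    x = min(int(u * w), w - 1)   # clamp to the last texel on the edge
    y = min(int(v * h), h - 1)
    return shadow_map[y][x]

sm = [[0.0, 1.0],   # toy 2x2 shadow map: 0.0 = shadowed, 1.0 = lit
      [1.0, 0.0]]
```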
I will check to see if the matrices are somehow different between cards.