Thread: [PyOpenGL-Users] problem with soya's usage of OpenGL
From: Christopher A. <ra...@tw...> - 2004-07-19 21:18:49
Hello soya-users, hello pyopengl-users. This message isn't _totally_ relevant to pyopengl-users, but you likely have the requisite knowledge to help me fix this problem.

I've been trying to get Soya3D (http://home.gna.org/oomadness/en/soya/index.html) working for a while, but it has problems with initialization. Soya3D talks to the OpenGL API directly through Pyrex. The problem is that a bunch of the calls it makes to OpenGL return NULL, while PyOpenGL calls to the same functions return the expected values. At first we thought this was a problem with the way Soya initializes GL (it uses SDL_Init to accomplish this) -- that it wasn't properly waiting for the OpenGL system to be initialized before making the calls. But to disprove that, I wrote this bit of Pyrex code:

    # ... inside of soya's init() function ...
    from OpenGL import GL
    print "PyOGL:", GL.glGetString(GL.GL_VENDOR)
    my_dump_info()

    cdef void my_dump_info():
        cdef char* gl_vendor
        gl_vendor = <char*> glGetString(GL_VENDOR)
        if gl_vendor == NULL:
            print "OGL: Wargh, glGetString returned NULL"
            check_gl_error()
        else:
            print "OGL:", PyString_FromString(gl_vendor)

GL.glGetString returns the expected "NVIDIA Corporation", but the direct call to the C glGetString still returns NULL. check_gl_error calls glGetError, but that returns GL_NO_ERROR. The same thing happens with functions like glGetIntegerv.

So the only conclusion I can draw here is that PyOpenGL is somehow calling these functions in a way that differs from Pyrex's direct calls to them. I tried reading through the source of PyOpenGL, but the SWIG machinery wasn't illuminating; the wrappers seem pretty direct. Does PyOpenGL involve some kind of separate context that a direct call to the C interface wouldn't use?
--
Twisted  | Christopher Armstrong: International Man of Twistery
Radix    | Release Manager, Twisted Project
---------+ http://radix.twistedmatrix.com/
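[Editorial sketch, not from the thread: one way to triangulate the symptom above is to call glGetString through a third path -- ctypes -- bypassing both PyOpenGL and the Pyrex extension. If ctypes also gets NULL, the problem is in the context or the libGL being resolved, not in Pyrex. This is a modern Python 3 sketch; it assumes nothing beyond a system libGL, and it returns None if no library or no current context is available.]

    import ctypes
    import ctypes.util

    GL_VENDOR = 0x1F00  # standard GLenum value for GL_VENDOR

    def vendor_via_ctypes():
        """Call glGetString(GL_VENDOR) directly through ctypes.

        Returns the vendor bytes, or None if libGL is missing or
        there is no current GL context (glGetString returns NULL)."""
        path = ctypes.util.find_library("GL")
        if path is None:
            return None  # no system libGL available
        libgl = ctypes.CDLL(path)
        get_string = libgl.glGetString
        # Without restype, ctypes treats the returned pointer as a C int,
        # which truncates addresses on 64-bit platforms.
        get_string.restype = ctypes.c_char_p
        get_string.argtypes = [ctypes.c_uint]
        return get_string(GL_VENDOR)  # a NULL pointer maps to Python None

[Run this right after the SDL_Init/video-mode setup, at the same point where the Pyrex call fails, and compare all three answers.]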
From: Mike C. F. <mcf...@ro...> - 2004-07-19 23:00:20
We don't do anything particularly fancy in the wrapper for glGetString. Here's what it looks like in the generated OpenGL 1.1 wrapper from PyOpenGL 2.0.1.08:

    static PyObject *_wrap_glGetString(PyObject *self, PyObject *args) {
        PyObject *resultobj;
        GLenum arg1;
        GLubyte *result;
        PyObject *obj0 = 0;

        if (!PyArg_ParseTuple(args, (char *)"O:glGetString", &obj0))
            return NULL;
        arg1 = (GLenum) PyInt_AsLong(obj0);
        if (PyErr_Occurred())
            return NULL;
        {
            result = (GLubyte *)glGetString(arg1);
            if (GLErrOccurred())
                return NULL;
        }
        {
            if (result) {
                resultobj = PyString_FromString(result);
            } else {
                Py_INCREF(resultobj = Py_None);
            }
        }
        return resultobj;
    }

I can't see anything there which is markedly different from your approach. PyOpenGL is fairly minimal wrt what it sets up for contexts. AFAIK we aren't doing anything funky with initialising our reference to OpenGL, leaving the context-creation work to the GUI libraries as much as possible (though I should note that that stuff was all written by someone else, so it could be we spend thousands of lines of code on it somewhere and I've just missed it during maintenance). We do some minimal stuff such as defining functions for retrieving the current context under OpenGL 1.0, but I doubt that's relevant.

From the docs (http://pyopengl.sourceforge.net/documentation/manual/glGetString.3G.xml):

    GL_INVALID_OPERATION is generated if glGetString is executed between
    the execution of glBegin and the corresponding execution of glEnd.

I'd confirm that you are not calling it between those functions. That and an invalid enum (feature name) are the only two listed errors, and a 0 (NULL) return value apparently only occurs on error. Could your GLenum somehow be suffering from an off-by-one or similar error, such that it's giving you lots of "invalid enum" errors across many parts of the API? It seems your check_gl_error() isn't picking up the failure for some reason, but that doesn't solve the base problem.
Good luck,
Mike

________________________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://members.rogers.com/mcfletch/
blog: http://zope.vex.net/~mcfletch/plumbing/
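[Editorial sketch: the wrapper Mike quotes can be mirrored in Python to show why PyOpenGL's successful return is significant. If the C call produced NULL with GL_NO_ERROR set -- exactly what the Pyrex code observes -- the wrapper would hand Python None, not a vendor string. The stubs below stand in for the C functions and are assumptions for illustration, not real GL calls.]

    GL_VENDOR = 0x1F00

    class GLError(Exception):
        """Stands in for the exception raised when GLErrOccurred() fires."""

    def wrapped_get_string(c_glGetString, c_glGetError, name):
        # Mirrors the SWIG wrapper: call through, check the error flag,
        # then map a NULL C string to Python None.
        result = c_glGetString(name)
        err = c_glGetError()
        if err != 0:  # anything but GL_NO_ERROR
            raise GLError(hex(err))
        return result

    # A context-less GL behaves like this: NULL result, but GL_NO_ERROR.
    assert wrapped_get_string(lambda name: None, lambda: 0, GL_VENDOR) is None

    # A live context would instead yield the vendor string.
    assert wrapped_get_string(
        lambda name: b"NVIDIA Corporation", lambda: 0, GL_VENDOR
    ) == b"NVIDIA Corporation"

[So PyOpenGL returning "NVIDIA Corporation" while the Pyrex call returns NULL means the two call paths are reaching different GL state, not that the wrapper is doing anything clever.]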
From: Christopher A. <ra...@tw...> - 2004-07-20 22:02:12
Mike C. Fletcher wrote:

Thanks for the help, Mike! I don't know if you remember, but I think we met at PyCon this year; I was rambling to you and Tamer about how horrible the scene is for open-source 3D game engines ;)

> We don't do anything particularly fancy in the wrapper for glGetString,
> here's what it looks like for the generated OpenGL 1.1 wrapper from
> PyOpenGL 2.0.1.08:
> [snip C code]

Here's the C code that Pyrex generates from the snippet I showed in my post:

    char (*__pyx_v_gl_vendor);
    /* ... */
    int __pyx_2;
    /* ... */
    /* "/home/radix/Projects/soya/init.pyx":307 */
    __pyx_v_gl_vendor = ((char (*))glGetString(GL_VENDOR));
    /* "/home/radix/Projects/soya/init.pyx":308 */
    __pyx_2 = (__pyx_v_gl_vendor == 0);
    if (__pyx_2) {
        /* etc */

So (__pyx_v_gl_vendor == 0) is ALWAYS true, no matter WHAT I pass to glGetString (I tried random numbers like 9999), and glGetError NEVER returns an error code. Are all of those (implicit) casts reasonable? I also checked that the value of GL_VENDOR is as expected; it's the same as PyOpenGL's GL.GL_VENDOR.

> From the docs:
> (http://pyopengl.sourceforge.net/documentation/manual/glGetString.3G.xml)
>
>     GL_INVALID_OPERATION is generated if glGetString is executed between
>     the execution of glBegin and the corresponding execution of glEnd.
>
> I'd confirm that you are not calling this within those functions.

I checked with Jiba, as well as the soya source code, and indications are that it's not being called between glBegin and glEnd.

> That and an invalid enum (feature-name) are the only two listed errors,
> and a 0 (NULL) return value apparently only occurs on error. Could your
> GLenum somehow be suffering from an off-by-one or similar error such
> that it's giving you lots of "invalid enum" errors across many parts of
> the API?

Something might be going wrong with my error-checking code, since no matter what I pass to glGetString, I never get any errors.
I should mention that this code is working for other people, but I don't know whether any of them are using the OpenGL implementation from NVidia's proprietary Linux drivers (anyone?). Thanks for the help; I'll flail around at the problem some more.

--
Twisted  | Christopher Armstrong: International Man of Twistery
Radix    | Release Manager, Twisted Project
---------+ http://radix.twistedmatrix.com/
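[Editorial sketch: since check_gl_error() reports GL_NO_ERROR even for a bogus enum like 9999, the checker itself is worth ruling out. glGetError returns and clears one error flag per call, so a robust checker loops until it sees GL_NO_ERROR rather than checking once. The sketch below uses a fake error source, since no live GL is assumed; the constants are the standard GLenum values.]

    GL_NO_ERROR = 0
    GL_INVALID_ENUM = 0x0500
    GL_INVALID_OPERATION = 0x0502

    def drain_gl_errors(get_error):
        """Collect every pending error flag, looping until GL_NO_ERROR,
        which is how glGetError is meant to be polled."""
        errors = []
        while True:
            err = get_error()
            if err == GL_NO_ERROR:
                return errors
            errors.append(err)

    # Fake glGetError: hands out queued flags one at a time, then GL_NO_ERROR.
    queue = [GL_INVALID_ENUM, GL_INVALID_OPERATION]
    fake_get_error = lambda: queue.pop(0) if queue else GL_NO_ERROR
    assert drain_gl_errors(fake_get_error) == [GL_INVALID_ENUM, GL_INVALID_OPERATION]

[Note that even a correct checker would report nothing here if the failing calls never reach a live GL context at all -- no context usually means NULL results with no error raised -- which would also be consistent with everything described in the thread.]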