[PyOpenGL-Devel] [ pyopengl-Bugs-2895081 ] Unknown specifier GL_MAX_TEXTURE_IMAGE_UNITS
Brought to you by: mcfletch
From: SourceForge.net <no...@so...> - 2009-11-13 16:16:20
Bugs item #2895081, was opened at 2009-11-10 04:04
Message generated for change (Comment added) made by mcfletch
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=105988&aid=2895081&group_id=5988

Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request, not just the latest
update.

Category: GL
Group: v3.0.0
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: https://www.google.com/accounts ()
Assigned to: Mike C. Fletcher (mcfletch)
Summary: Unknown specifier GL_MAX_TEXTURE_IMAGE_UNITS

Initial Comment:
Running:

    print 'Max texture image units ', glGetInteger(GL_MAX_TEXTURE_IMAGE_UNITS)

Produces the following error:

    File "/usr/lib/python2.6/site-packages/OpenGL/wrapper.py", line 1282, in __call__
        return self._finalCall( *args, **named )
    File "/usr/lib/python2.6/site-packages/OpenGL/wrapper.py", line 552, in wrapperCall
        cArgs = tuple(calculate_cArgs( pyArgs ))
    File "/usr/lib/python2.6/site-packages/OpenGL/wrapper.py", line 355, in calculate_cArgs
        yield converter( pyArgs, index, self )
    File "/usr/lib/python2.6/site-packages/OpenGL/converters.py", line 195, in __call__
        return self.arrayType.zeros( self.getSize(pyArgs) )
    File "/usr/lib/python2.6/site-packages/OpenGL/converters.py", line 234, in getSize
        raise KeyError( """Unknown specifier %s"""%( specifier ))
    KeyError: ('Unknown specifier GL_MAX_TEXTURE_IMAGE_UNITS (34930)',
        'Failure in cConverter <OpenGL.converters.SizedOutput object at 0x187a140>',
        (GL_MAX_TEXTURE_IMAGE_UNITS,), 1,
        <OpenGL.wrapper.glGetIntegerv object at 0x1872f38>)

----------------------------------------------------------------------

>Comment By: Mike C. Fletcher (mcfletch)
Date: 2009-11-13 11:16

Message:
Just implemented the first suite of truly automated tests, but they wouldn't
have caught this, as I don't have any code exercising the missing features.
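The traceback above comes from PyOpenGL's output-size lookup: the glGetIntegerv wrapper consults a table mapping each GL constant to the dimensions of the result array it should allocate, and `SizedOutput.getSize` raises KeyError for any constant missing from that table. A minimal self-contained sketch of that mechanism, assuming nothing about PyOpenGL's actual internals beyond the traceback (the names `SIZE_TABLE` and `get_size` are invented for illustration):

```python
# Illustrative mimic of the size lookup that fails in converters.py;
# SIZE_TABLE and get_size are made-up names for this sketch.
GL_MAX_TEXTURE_IMAGE_UNITS = 34930  # numeric value shown in the traceback

# Table of glGet constants -> dimensions of the result array.
# GL_MAX_TEXTURE_IMAGE_UNITS is absent, as in PyOpenGL 3.0.0.
SIZE_TABLE = {}

def get_size(specifier):
    try:
        return SIZE_TABLE[specifier]
    except KeyError:
        # Mirrors the "Unknown specifier" KeyError from the traceback.
        raise KeyError("Unknown specifier %s" % (specifier,))

try:
    get_size(GL_MAX_TEXTURE_IMAGE_UNITS)
except KeyError as err:
    print(err.args[0])  # -> Unknown specifier 34930
```

Any constant not registered in the table fails the same way, which is why the comments below report that most post-1.1 constants were affected.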
As for an ugly workaround, you can call:

    OpenGL.GL.glget.addGLGetConstant( constant, (1,) )

where (1,) is the dimension of the array to be created for the constant, so
e.g. (4,4) for most matrices. I've also added a script that attempts to use
glGet on *every* constant in the core, but that's limited by the
hardware/drivers I have available.

----------------------------------------------------------------------

Comment By: https://www.google.com/accounts ()
Date: 2009-11-10 23:12

Message:
Some automated tests might be helpful too =) Is there some ugly workaround for
this? I am not sure that people running my software will have the
desire/ability to upgrade to 3.0.1.

----------------------------------------------------------------------

Comment By: Mike C. Fletcher (mcfletch)
Date: 2009-11-10 11:48

Message:
Wow, there were a *lot* of bugs hiding behind that one. Any non-extension
glGet constant beyond roughly GL 1.1 would raise an error. I've updated the
code to include all GL 2.1 constants, but GL 3.0, 3.1, and 3.2 constants will
still be missing. We need a better mechanism for determining what can be used
with glGet to make this automatic. 3.0.1 should have the fix for the
particular bug reported here.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=105988&aid=2895081&group_id=5988
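The workaround above registers the missing constant's result dimensions before the first glGet call. Here is a self-contained sketch of that register-then-look-up pattern; everything except the `addGLGetConstant` name and the `(1,)` / `(4,4)` dimensions quoted from the comment is invented for illustration and stands in for PyOpenGL's internal registry:

```python
# Sketch of the addGLGetConstant workaround pattern. GLGET_SIZES is an
# illustrative stand-in for PyOpenGL's internal constant-size registry.
GL_MAX_TEXTURE_IMAGE_UNITS = 34930  # numeric value from the traceback

GLGET_SIZES = {}  # constant -> dimensions of the array glGet should allocate

def addGLGetConstant(constant, dims):
    """Register the result-array dimensions for a glGet constant."""
    GLGET_SIZES[constant] = dims

# Workaround: register the constant before the first glGetInteger call.
addGLGetConstant(GL_MAX_TEXTURE_IMAGE_UNITS, (1,))  # scalar result
print(GLGET_SIZES[GL_MAX_TEXTURE_IMAGE_UNITS])  # -> (1,)
```

Against real PyOpenGL 3.0.0 the equivalent one-liner is `OpenGL.GL.glget.addGLGetConstant(GL_MAX_TEXTURE_IMAGE_UNITS, (1,))`, run once at startup; per the comment above, a dimension of (4,4) would be used instead for most matrix-valued constants.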