Running:
print 'Max texture image units ', glGetInteger(GL_MAX_TEXTURE_IMAGE_UNITS)
Produces the following error:
File "/usr/lib/python2.6/site-packages/OpenGL/wrapper.py", line 1282, in __call__
return self._finalCall( *args, **named )
File "/usr/lib/python2.6/site-packages/OpenGL/wrapper.py", line 552, in wrapperCall
cArgs = tuple(calculate_cArgs( pyArgs ))
File "/usr/lib/python2.6/site-packages/OpenGL/wrapper.py", line 355, in calculate_cArgs
yield converter( pyArgs, index, self )
File "/usr/lib/python2.6/site-packages/OpenGL/converters.py", line 195, in __call__
return self.arrayType.zeros( self.getSize(pyArgs) )
File "/usr/lib/python2.6/site-packages/OpenGL/converters.py", line 234, in getSize
raise KeyError( """Unknown specifier %s"""%( specifier ))
KeyError: ('Unknown specifier GL_MAX_TEXTURE_IMAGE_UNITS (34930)', 'Failure in cConverter <OpenGL.converters.SizedOutput object at 0x187a140>', (GL_MAX_TEXTURE_IMAGE_UNITS,), 1, <OpenGL.wrapper.glGetIntegerv object at 0x1872f38>)
Wow, there were a *lot* of bugs hiding behind that one. Any non-extension glGet constant introduced after roughly GL 1.1 was going to raise an error. I've updated the code to cover all GL 2.1 constants, but 3.0, 3.1, and 3.2 constants will still be missing. I need a better mechanism for determining what can be used with glGet so this becomes automatic.
3.0.1 should have the fix for the particular bug here.
Some automated tests might be helpful too =)
Is there some ugly workaround for this? I'm not sure the people running my software will have the desire or ability to upgrade to 3.0.1.
Just implemented the first suite of truly automated tests, but they wouldn't have caught this, as I don't have any code exercising the missing features.
As for an ugly workaround, you can call OpenGL.GL.glget.addGLGetConstant( constant, (1,) ), where (1,) is the dimension of the array to be created for the constant (e.g. (4,4) for most matrices).
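To make the workaround concrete, here is a minimal pure-Python sketch of the registry idea behind addGLGetConstant: output sizes for glGet live in a constant-to-shape table, an unregistered constant raises the KeyError seen in the traceback above, and registering a shape fixes the lookup. The names below are illustrative, not PyOpenGL's actual internals.

```python
# Illustrative sketch of the glGet size-registry idea (not PyOpenGL's
# real implementation): output-array shapes are looked up per constant,
# and an unregistered constant raises KeyError, as in the traceback.

GL_MAX_TEXTURE_IMAGE_UNITS = 0x8872  # 34930, matching the report above

_glget_sizes = {}  # constant -> shape of the output array

def add_glget_constant(constant, shape):
    """Register the output-array shape for a glGet constant."""
    _glget_sizes[constant] = shape

def glget_size(constant):
    """Look up the output shape; unknown constants fail loudly."""
    try:
        return _glget_sizes[constant]
    except KeyError:
        raise KeyError("Unknown specifier %s" % (constant,))

# Before registration the lookup fails, just like the reported bug...
try:
    glget_size(GL_MAX_TEXTURE_IMAGE_UNITS)
except KeyError:
    pass  # "Unknown specifier 34930"

# ...and registering the constant (the workaround) fixes the lookup.
add_glget_constant(GL_MAX_TEXTURE_IMAGE_UNITS, (1,))
assert glget_size(GL_MAX_TEXTURE_IMAGE_UNITS) == (1,)
```

With real PyOpenGL, the single addGLGetConstant call quoted above is all a client application needs before the first glGetInteger on the affected constant.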
I've also added a script that goes ahead and attempts to use glGet on *every* constant in the core, but that's limited by the hardware/drivers I have available.
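The brute-force approach that script takes can be sketched as follows. This is not the actual script; it is a hypothetical sketch parameterised over the constant namespace and the getter so it can run without a GL context. With PyOpenGL you would pass something like vars(OpenGL.GL) and glGetIntegerv instead of the fakes shown.

```python
# Sketch of the brute-force probe: try glGet on every GL_* integer
# constant and collect the names of the ones that raise an error.

def probe_glget(namespace, getter):
    """Return names of GL_* integer constants for which getter raises."""
    failures = []
    for name, value in sorted(namespace.items()):
        if not name.startswith('GL_') or not isinstance(value, int):
            continue
        try:
            getter(value)
        except Exception:
            failures.append(name)
    return failures

# Toy demonstration with a fake namespace and a fake getter that only
# "knows" one constant, standing in for driver-supported glGet values.
fake_gl = {'GL_MAX_TEXTURE_IMAGE_UNITS': 34930, 'GL_NEWER_CONSTANT': 99999}
known = {34930: 16}
print(probe_glget(fake_gl, lambda c: known[c]))  # ['GL_NEWER_CONSTANT']
```

As noted, the real script's coverage is limited to whatever the local hardware and drivers actually implement, since a driver that rejects a valid constant is indistinguishable here from a genuinely missing table entry.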
Okay, the particular bug should be fixed in the 3.0.1 release going out today. There may be more bugs in missing glGet constants, but we'll have to cross that bridge when we get there.