The suggestion of checking for unicode objects makes sense to me -- although I would suggest using ascii encoding, unless someone can point me to a document that explicitly states which encoding is used for the GLubyte* arguments. Since ascii encoding raises an error upon encountering bytes > 127, I think that is the safest bet unless we are sure which encoding is actually used.
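To illustrate the point (this is a minimal sketch, not PyOpenGL code): ascii refuses any character above 127 instead of silently guessing at an encoding.

```python
# Why ascii is the conservative choice: plain ASCII passes through,
# anything above 127 fails loudly instead of being silently mapped.
name = "texCoord"
assert name.encode("ascii") == b"texCoord"

try:
    "caf\u00e9".encode("ascii")   # non-ASCII character
    raised = False
except UnicodeEncodeError:
    raised = True
assert raised   # latin-1 would instead silently produce b'caf\xe9'
```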

As far as I know (but my OpenGL knowledge is limited) glGetString is the only function that actually returns a string. So we could add a glGetUnicode or glGetStr or glGetPyStr (suggestions for an appropriate name welcome) that always returns a 'str' -- 8-bit on Py2 and unicode on Py3 -- and leave glGetString returning a bytes object for the reasons mentioned. This new function could then be used in the extensions module; I think many of the as_8_bit calls could be avoided this way, because the extensions module would then just use the Python native string type.
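Roughly what I have in mind (the name gl_get_str and the strict ascii decode are my assumptions here, not existing PyOpenGL API):

```python
# Hypothetical helper, not existing PyOpenGL API: wrap a bytes-returning
# query so callers get the native str type, while the raw glGetString
# keeps returning bytes.
def gl_get_str(raw_get_string, name):
    value = raw_get_string(name)     # GLubyte* result arrives as bytes
    if value is None:                # e.g. error / unknown enum
        return None
    return value.decode("ascii")    # strict: fail loudly on bytes > 127

# Stand-in for the real glGetString, for illustration only:
def fake_get_string(name):
    return b"Mock GL Renderer"

assert gl_get_str(fake_get_string, 0x1F01) == "Mock GL Renderer"
```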

I'd be happy to go and implement this solution, if you can point me to the appropriate place to put the code. It will probably be a bit of a puzzle to write it such that it works properly when transformed by the 2to3 tool :-)


Rob Reilink, M.Sc
Science Applied

phone: +31 6 187 26562
e-mail: r.reilink@science-applied.nl

Op 4 apr 2012, om 17:44 heeft Mike C. Fletcher het volgende geschreven:

On 12-04-04 07:23 AM, Rob Reilink wrote:

I've noticed that under Python3, PyOpenGL inconsistently expects 'str'
and 'bytes' objects as function arguments. For some functions, I'd
expect to use 'str' while in the current implementation 'bytes' is used.

E.g. in glGetUniformLocation, the name of the uniform is to be
specified as a 'bytes' object, while I would expect to use 'str' since
it is a name. Also, shaders.compileShader takes a 'str' for the shader
source.

Similarly, extensions.hasGLExtension() expects a 'str' object for the
extension name, but extensions.AVAILABLE_GL_EXTENSIONS is a list of
'bytes' objects.
Of course, for arguments dealing with binary data (e.g. glTexImage2D),
a 'bytes' object is to be used.

Apart from the actual implementation, has there been any thought on
how to expose things like uniform names to the user?
My first reaction would be to do this:

    if isinstance( arg, unicode):
        arg = arg.encode( 'latin-1' )

in the wrapper (the as_8_bit() hack has been added to the extensions
module, for instance), that is, for each argument which is currently str,
make the argument capable of accepting a unicode argument... as for
producing them... I'm hesitant to convert the 8-bit values (which is
internally what OpenGL is specifying; GLchar * is an 8-bit value) to
unicode arbitrarily, as now code that uses formal, correct, byte-strings
is going to crash when it tries to interact with the generated unicode
values.

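The argument-side coercion described above could look roughly like this (a sketch of the idea only, not PyOpenGL's actual as_8_bit implementation):

```python
# Sketch of the argument coercion: accept either str or bytes, always
# hand OpenGL 8-bit data, and never convert return values.
def coerce_to_8_bit(value, encoding="latin-1"):
    if isinstance(value, str):      # unicode on Python 3
        return value.encode(encoding)
    return value                    # already bytes: pass through

assert coerce_to_8_bit("myuniform") == b"myuniform"
assert coerce_to_8_bit(b"myuniform") == b"myuniform"
```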
Everything in OpenGL is binary data. Everything has an expressly defined
binary representation, and that includes byte-strings.  Anything I do
here to paper over the difference is, I expect, going to come back to
byte us in the future.  Someone is going to do a glGetActiveUniform() in
my_unicode_shader and have it blow up on a unicode/bytes disagreement if
I convert on return, or is going to do glGetActiveUniform() in
my_bytes_shader if I don't, but I expect that the number of problems
with glGetUniform( 'myuniform' ) will be substantial.
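The disagreement in question is easy to demonstrate on Python 3, where bytes and str never compare equal:

```python
# Illustration of the mismatch: a GLchar* return converted to bytes
# will never equal the str that most Python 3 user code passes around.
from_gl = b"myuniform"        # what the driver hands back
from_user = "myuniform"       # what Python 3 code typically writes

assert from_gl != from_user                    # bytes != str, always
assert from_user.encode("ascii") == from_gl    # explicit encode bridges it
```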

Basically I don't have a good solution.  Either we create an API
inconsistency between Python 2 and Python 3 (returning "str" in both,
though they are different types), or we make Python 3 users explicitly
deal with the return-type of the GLchar* calls and/or use byte-strings
throughout.


  Mike C. Fletcher
  Designer, VR Plumber, Coder
