Values returned using the default array format, numpy, are not accepted by OpenGL; more importantly, they are not accepted by the corresponding set/bind functions.
This output is from the attached test file, run with the latest CVS revision:
glGenTextures(1) -> 1 (<type 'long'>)
glGenTextures(2) -> [2 3] (list: <type 'numpy.ndarray'>, elements: <type 'numpy.uint32'>)
Calling: glBindTexture(GL_TEXTURE_2D, 1)
(created from glGenTextures(1))
Calling: glBindTexture(GL_TEXTURE_2D, 2)
(created from glGenTextures(2), element 0)
Exception Caught: argument 2: <type 'exceptions.TypeError'>: wrong type
The returned array is of type numpy.ndarray, with each element of type numpy.uint32. This element type is also not directly convertible to a function argument type such as GLuint.
The return type of glGenTextures(1), however, is long, due to special-case handling. This is not the case for functions lacking a similar special case, such as OpenGL.GL.EXT.framebuffer_object.glGenFramebuffersEXT.
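A per-call workaround (a sketch, assuming numpy is installed; the array below stands in for the result of glGenTextures(2)) is to cast each element back to a plain Python int before passing it to the bind function:

```python
import numpy as np

# Simulate the array returned by glGenTextures(2): dtype uint32
textures = np.array([2, 3], dtype=np.uint32)

# Each element is a numpy.uint32, which the wrappers reject;
# casting to a plain Python int yields an acceptable value, e.g.
#   glBindTexture(GL_TEXTURE_2D, int(textures[0]))
texture_id = int(textures[0])
```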
A quick global workaround is to switch the output array type to ctypes arrays after importing OpenGL:
from OpenGL.arrays import formathandler
formathandler.FormatHandler.chooseOutput( 'ctypesarrays' )
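With the 'ctypesarrays' handler selected, the generator functions return ctypes arrays instead of numpy arrays. The sketch below (using plain ctypes, and assuming GLuint corresponds to ctypes.c_uint, as it does in PyOpenGL) shows why that helps: indexing a ctypes array of a simple integer type yields a plain Python int, which the set/bind wrappers accept directly.

```python
import ctypes

# Stand-in for what glGenTextures(2) would return under the
# 'ctypesarrays' handler: a (GLuint * 2) ctypes array.
GLuint = ctypes.c_uint
textures = (GLuint * 2)(2, 3)

# Indexing returns a native Python int, not a numpy.uint32,
# so it can be passed straight to glBindTexture and friends.
print(type(textures[0]))
```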