Re: [PyOpenGL-Users] glGetTexImage weirdness
From: Gijs <in...@bs...> - 2009-02-19 16:50:35
On 2/19/09 5:42 PM, Mike C. Fletcher wrote:
> Gijs wrote:
> ...
>> data = glGetTexImageub(GL_TEXTURE_2D, 0, GL_RGBA)
>> print data
>> data = glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE)
>> print data
>>
>> I expected that PyOpenGL would pass GL_UNSIGNED_BYTE to the
>> underlying function when I use glGetTexImageub, and that if I passed
>> the type myself directly, the result would be the same. But in the
>> first case I get a proper response, containing [[[97 97 97 97]]],
>> and in the second I get a rather weird response, "aaaa" (a string).
>> In the end it's of course easy to work around, since you can just
>> use the glGetTexImageub function, but when I stumbled upon it, it
>> took me quite some time to track down, since I assumed both would
>> behave the same.
> It's a legacy compatibility feature, re-instated at the request of
> the Pygame folks. See the flag:
>
> UNSIGNED_BYTE_IMAGES_AS_STRING
>
> in the OpenGL/__init__.py module for how to go back to always getting
> back your "normal" array types. IIRC some common function in the
> released versions of Pygame was using the function expecting a string
> back (PyOpenGL 2.x behaviour) and broke with 3.x returning the
> registered preferred array handler type.
>
> HTH,
> Mike

Hmm, ok, I guess I'll use the array functions then. By the way, when I
use glGetTexImageub and set the fragment shader to output 1/256 for
every pixel, I get a 0 in every pixel instead of a 1. Every subsequent
value comes back one lower than expected: 34/256, for example, returns
33. Does this also have something to do with backwards compatibility?
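For anyone hitting the same thing: toggling the flag Mike mentions looks
roughly like this (a minimal sketch, assuming the flag, like PyOpenGL's
other module-level switches, should be set before OpenGL.GL is imported):

    import OpenGL
    # Turn off the PyOpenGL 2.x legacy behaviour of returning
    # GL_UNSIGNED_BYTE image data as a string; set this before
    # importing OpenGL.GL.
    OpenGL.UNSIGNED_BYTE_IMAGES_AS_STRING = False
    from OpenGL.GL import *

    # Both calls should now return the registered array type:
    data = glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE)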
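As for the off-by-one: it reads like ordinary 8-bit quantization rather
than a PyOpenGL compatibility issue. A channel value c in [0, 1] is
stored as an integer in [0, 255], and n/256 lands just below n on that
scale, so a driver that truncates instead of rounding returns n - 1;
outputting n/255 should come back as exactly n. A quick arithmetic
check (illustrative Python 2, matching the thread's print statements;
the actual float-to-byte conversion is up to the GL implementation):

    # 8-bit quantization: map c in [0, 1] to an integer in [0, 255].
    for n in (1, 34):
        c = n / 256.0                           # what the shader outputs
        print n, int(c * 255)                   # truncating driver: 0, 33
        print n, int(round((n / 255.0) * 255))  # n/255.0 round-trips to n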