Re: [PyOpenGL-Users] Textures on Bad Hardware
From: Dirk R. <dir...@gm...> - 2010-07-31 22:05:57
Hi Ian,

On 07/31/2010 11:06 AM, Ian Mallett wrote:
> Yes, although programs without extensions were crashing.

OK.

> Of the errors, the only one that seems even possible is:
>
> GL_INVALID_OPERATION is generated if a non-zero buffer object name is
> bound to the GL_PIXEL_UNPACK_BUFFER target and the data would be
> unpacked from the buffer object such that the memory reads required
> would exceed the data store size.
>
> It seems strange, though, because other programs don't seem to have a
> problem. Here's the graphics "card" he has:
> http://www.notebookcheck.net/Intel-Graphics-Media-Accelerator-950.2177.0.html

That is a pretty unusual error, so that doesn't sound very probable, yes.

>> My gut feeling would guess you're using a texture format that the card
>> doesn't know, try using the generic GL_RGB/GL_RGBA instead of the
>> bit-specific ones.
>
> I am.

OK.

>> We might be able to help you better if we could see your actual code.
>
> I was trying to avoid it, as it is a general class of functions that
> are failing. Some of my newer code is also too flexible to be
> immediately readable. Here's some really old code that fails:
>
> def Texture(surface,filters):
>     data = pygame.image.tostring(surface,"RGBA",True)
>     width,height = surface.get_size()
>     ...
>     glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,width,height,0,GL_RGBA,GL_UNSIGNED_BYTE,data)
>     return texture

All of that looks pretty straightforward. The only thing I can think of,
but that should give a different error, is textures that are not
power-of-two size. What are the sizes of your textures?

Dirk
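
[A follow-up note on the power-of-two hypothesis: chipsets of the GMA 950's era often lack usable GL_ARB_texture_non_power_of_two support, so a common workaround is to round each dimension up to the next power of two and pad the image before the glTexImage2D upload. A minimal sketch of the size arithmetic — the helper names here are illustrative, not from the thread's code:]

```python
def next_pow2(n):
    """Smallest power of two >= n (n must be >= 1)."""
    p = 1
    while p < n:
        p <<= 1
    return p

def is_pow2(n):
    """True if n is a power of two."""
    return n >= 1 and (n & (n - 1)) == 0

# A 640x480 surface would need a 1024x512 texture on hardware without
# non-power-of-two support: blit the image into the corner of a padded
# surface before uploading, and scale the texture coordinates by
# (640/1024, 480/512) to compensate for the padding.
print(next_pow2(640), next_pow2(480))  # → 1024 512
```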