Thread: [PyOpenGL-Users] Textures on Bad Hardware
From: Ian M. <geo...@gm...> - 2010-07-31 06:07:00
Hi,

So, my laptop has a decent graphics card, processor, RAM, etc. Using it, I
developed several programs using textures, shaders, etc. My brother, with a
netbook, wants to run some of these programs. His computer (available at
http://www.msimobile.com/level3_productpage.aspx?cid=3&id=127) has an Intel
integrated graphics chipset.

I understood that such graphics chips have poor support for GLSL, but upon
running the (shader-less) code on his Intel hardware, all of my texturing
code failed (on the glTexImage2D(...) call, with invalid operation), whereas
it works perfectly on my computer and on several others: Mac, PC, and Linux,
NVIDIA and ATI.

The code is written in Python, and for my test program I wasn't doing
anything fancy, just setting up a standard OpenGL 2D texture. Through games
that work on his computer, I'm pretty sure that (DirectX?) texturing works
(Motocross Madness) and that OpenGL texturing works (Tux Racer).

I'm setting up the texture with standard OpenGL routines (and after I've set
up a context, etc.). The glTexImage2D() calls give invalid operation no
matter what. I'm passing them string buffers with GL_UNSIGNED_BYTE:

    pygame.image.tostring(data,"RGB",True)

So . . . I don't think it's any particular setup; all the texturing code is
failing. Do PyOpenGL textures work on Intel cards? Or is this something I
don't know about? Or what?

Ian
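[A minimal sketch of the setup Ian describes, for reference; this is not from
the original message, and the image filename is hypothetical. PyOpenGL raises
GLError on GL errors by default, so the reported failure surfaces as an
exception at the glTexImage2D() call:

import pygame
from OpenGL.GL import *
from OpenGL.error import GLError

pygame.init()
# An OpenGL-enabled pygame window provides the GL context.
pygame.display.set_mode((640, 480), pygame.OPENGL | pygame.DOUBLEBUF)

surface = pygame.image.load("some_image.png")  # hypothetical input file
data = pygame.image.tostring(surface, "RGB", True)
width, height = surface.get_size()

tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
glPixelStorei(GL_UNPACK_ALIGNMENT, 1)
try:
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, data)
except GLError as e:
    print("glTexImage2D failed: %s" % e)  # "invalid operation" on the netbook
]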
From: Ian M. <geo...@gm...> - 2010-07-31 16:06:58
On Fri, Jul 30, 2010 at 11:16 PM, Dirk Reiners <dir...@gm...> wrote:

> As a first start I'd look at the manpages (e.g.
> http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml). At the end it
> lists all the possible reasons for error codes, see if one of those is
> true for you. Unfortunately those manpages don't cover extensions...

Yes, although programs without extensions were crashing.

Of the errors, the only one that seems even possible is:

    GL_INVALID_OPERATION is generated if a non-zero buffer object name is
    bound to the GL_PIXEL_UNPACK_BUFFER target and the data would be
    unpacked from the buffer object such that the memory reads required
    would exceed the data store size.

It seems strange, though, because other programs don't seem to have a
problem. Here's the graphics "card" he has:
http://www.notebookcheck.net/Intel-Graphics-Media-Accelerator-950.2177.0.html

> My gut feeling would guess you're using a texture format that the card
> doesn't know, try using the generic GL_RGB/GL_RGBA instead of the
> bit-specific ones.

I am.

> We might be able to help you better if we could see your actual code.

I was trying to avoid it, as it is a general class of functions that is
failing. Some of my newer code is also too flexible to be immediately
readable. Here's some really old code that fails:

def Texture(surface,filters):
    data = pygame.image.tostring(surface,"RGBA",True)
    width,height = surface.get_size()
    texture = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D,texture)
    if filters == None:
        filters = []
    if "filter" in filters:
        glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR)
    else:
        glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST)
    if "mipmap" in filters:
        if "mip filter 0" in filters:
            glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST_MIPMAP_NEAREST)
        elif "mip filter 1" in filters:
            glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_NEAREST)
        elif "mip filter 2" in filters:
            glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST_MIPMAP_LINEAR)
        elif "mip filter 3" in filters:
            glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_LINEAR)
        glPixelStoref(GL_UNPACK_ALIGNMENT,1)
        gluBuild2DMipmaps(GL_TEXTURE_2D,3,width,height,GL_RGBA,GL_UNSIGNED_BYTE,data)
    else:
        if "filter" in filters:
            glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR)
        else:
            glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST)
        glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,width,height,0,GL_RGBA,GL_UNSIGNED_BYTE,data)
    return texture

Thanks,
Ian
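[One quick way to rule out the quoted GL_PIXEL_UNPACK_BUFFER condition — a
sketch, not from the original exchange — is to confirm that no pixel-unpack
buffer object is bound before the upload. The query itself requires
OpenGL 2.1 / ARB_pixel_buffer_object, so a very old context may reject it:

from OpenGL.GL import glGetIntegerv, GL_PIXEL_UNPACK_BUFFER_BINDING

pbo = glGetIntegerv(GL_PIXEL_UNPACK_BUFFER_BINDING)
print("pixel unpack buffer bound: %d" % pbo)  # 0 means none is bound
]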
From: Dirk R. <dir...@gm...> - 2010-07-31 22:05:57
Hi Ian,

On 07/31/2010 11:06 AM, Ian Mallett wrote:

> Yes, although programs without extensions were crashing.

OK.

> Of the errors, the only one that seems even possible is:
>
>     GL_INVALID_OPERATION is generated if a non-zero buffer object name is
>     bound to the GL_PIXEL_UNPACK_BUFFER target and the data would be
>     unpacked from the buffer object such that the memory reads required
>     would exceed the data store size.
>
> It seems strange, though, because other programs don't seem to have a
> problem. Here's the graphics "card" he has:
> http://www.notebookcheck.net/Intel-Graphics-Media-Accelerator-950.2177.0.html

That is a pretty unusual error, so it doesn't sound very probable.

>> My gut feeling would guess you're using a texture format that the card
>> doesn't know, try using the generic GL_RGB/GL_RGBA instead of the
>> bit-specific ones.
>
> I am.

OK.

>> We might be able to help you better if we could see your actual code.
>
> I was trying to avoid it, as it is a general class of functions that is
> failing. Some of my newer code is also too flexible to be immediately
> readable. Here's some really old code that fails:
>
> def Texture(surface,filters):
>     data = pygame.image.tostring(surface,"RGBA",True)
>     width,height = surface.get_size()
> ...
>     glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,width,height,0,GL_RGBA,GL_UNSIGNED_BYTE,data)
>     return texture

All of that looks pretty straightforward. The only thing I can think of,
though it should give a different error, is textures that are not
power-of-two sized. What are the sizes of your textures?

Dirk
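[A quick test for the power-of-two restriction Dirk suspects — a sketch, not
part of his message:

def is_power_of_two(n):
    # A positive integer is a power of two iff exactly one bit is set.
    return n > 0 and (n & (n - 1)) == 0

for size in (256, 512, 100, 70):
    print(size, is_power_of_two(size))  # 256 and 512 pass; 100 and 70 fail
]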
From: Ian M. <geo...@gm...> - 2010-08-01 00:06:06
On Sat, Jul 31, 2010 at 3:05 PM, Dirk Reiners <dir...@gm...> wrote:

> All of that looks pretty straightforward. The only thing I can think of,
> though it should give a different error, is textures that are not
> power-of-two sized. What are the sizes of your textures?

I hadn't even thought of that!

And after some testing, it seems that's the problem. My test code can be
found here:
http://download577.mediafire.com/jnza7ybur4qg/ievqu4buekair6k/test_texture.zip

I'm surprised at how ubiquitous power-of-two textures must be. Basically no
other applications seemed to have problems. I never had problems, so I guess
I assumed it was irrelevant. Uggggh.

I assume I'll have to rewrite my texture code so that it can take an
argument to scale surfaces to the nearest power of two. Any other way? I
know of ARB_texture_rectangle, but that sounds horrible. Blech. I hate
Intel.

Ian
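[A sketch of the resizing Ian proposes; the helper names are hypothetical.
Round each dimension up to the next power of two and scale the pygame
surface to match:

import pygame

def next_power_of_two(n):
    p = 1
    while p < n:
        p *= 2
    return p

def to_power_of_two(surface):
    w, h = surface.get_size()
    size = (next_power_of_two(w), next_power_of_two(h))
    if size == (w, h):
        return surface  # already power-of-two; nothing to do
    # smoothscale() needs a 24- or 32-bit surface, hence convert_alpha().
    return pygame.transform.smoothscale(surface.convert_alpha(), size)
]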
From: Almar K. <alm...@gm...> - 2010-09-14 19:59:28
On 1 August 2010 02:06, Ian Mallett <geo...@gm...> wrote:

> On Sat, Jul 31, 2010 at 3:05 PM, Dirk Reiners <dir...@gm...> wrote:
>
>> All of that looks pretty straightforward. The only thing I can think of,
>> though it should give a different error, is textures that are not
>> power-of-two sized. What are the sizes of your textures?
>
> I hadn't even thought of that!
>
> And after some testing, it seems that's the problem. My test code can be
> found here:
> http://download577.mediafire.com/jnza7ybur4qg/ievqu4buekair6k/test_texture.zip
>
> I'm surprised at how ubiquitous power-of-two textures must be. Basically
> no other applications seemed to have problems. I never had problems, so I
> guess I assumed it was irrelevant. Uggggh.
>
> I assume I'll have to rewrite my texture code so that it can take an
> argument to scale surfaces to the nearest power of two. Any other way? I
> know of ARB_texture_rectangle, but that sounds horrible. Blech. I hate
> Intel.

The prettiest way of doing this is checking the OpenGL version: if it's
< 2.0, you need power-of-two textures. However, I recently encountered a
system with an ATI card that did have OpenGL >= 2.0 but did NOT support
non-power-of-two textures. So my current implementation just tries resizing
the texture if it fails to initialize, or if OpenGL < 2.0.

Cheers,
Almar
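[A sketch of the strategy Almar describes; the helper names are hypothetical,
and it assumes to_power_of_two() from the earlier sketch plus Ian's Texture()
function. Require power-of-two textures when GL < 2.0, and if the upload
still fails, retry with a resized surface:

from OpenGL.GL import glGetString, GL_VERSION
from OpenGL.error import GLError

def gl_version_at_least(major, minor):
    # glGetString(GL_VERSION) looks like "1.4.0 - Build ..."; parse the head.
    number = glGetString(GL_VERSION).split()[0]
    parts = [int(x) for x in number.split(".")[:2]]
    return tuple(parts) >= (major, minor)

def upload_texture(surface):
    if not gl_version_at_least(2, 0):
        surface = to_power_of_two(surface)
    try:
        return Texture(surface, [])
    except GLError:
        # Some drivers report GL >= 2.0 yet still reject NPOT textures.
        return Texture(to_power_of_two(surface), [])
]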
From: Ian M. <geo...@gm...> - 2010-09-16 00:47:59
On Tue, Sep 14, 2010 at 1:59 PM, Almar Klein <alm...@gm...> wrote:

> The prettiest way of doing this is checking the OpenGL version: if it's
> < 2.0, you need power-of-two textures. However, I recently encountered a
> system with an ATI card that did have OpenGL >= 2.0 but did NOT support
> non-power-of-two textures. So my current implementation just tries
> resizing the texture if it fails to initialize, or if OpenGL < 2.0.

I've simply added a global flag to my library that automatically resizes
all textures to a power of two (either rounding up, or to the nearest
power), and then made a note to favor power-of-two textures. It's pretty
lame that Intel does this.

I don't suppose there's a more useful error that could be thrown here --
does the GL return something helpful in this regard?

Ian
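[One mechanism the GL itself offers here, though nobody in the thread
mentions it, is the proxy texture: ask whether a size/format combination
would succeed without allocating anything. A sketch; note that some drivers
answer proxy queries unreliably:

from OpenGL.GL import *

def texture_would_fit(width, height):
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, None)
    # The driver reports a width of 0 on the proxy level if it cannot
    # handle the requested texture.
    return glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                                    GL_TEXTURE_WIDTH) != 0
]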
From: Almar K. <alm...@gm...> - 2010-09-16 06:39:52
On 16 September 2010 02:47, Ian Mallett <geo...@gm...> wrote:

> On Tue, Sep 14, 2010 at 1:59 PM, Almar Klein <alm...@gm...> wrote:
>
>> The prettiest way of doing this is checking the OpenGL version: if it's
>> < 2.0, you need power-of-two textures. However, I recently encountered a
>> system with an ATI card that did have OpenGL >= 2.0 but did NOT support
>> non-power-of-two textures. So my current implementation just tries
>> resizing the texture if it fails to initialize, or if OpenGL < 2.0.
>
> I've simply added a global flag to my library that automatically resizes
> all textures to a power of two (either rounding up, or to the nearest
> power), and then made a note to favor power-of-two textures. It's pretty
> lame that Intel does this.
>
> I don't suppose there's a more useful error that could be thrown here --
> does the GL return something helpful in this regard?

I'm not sure if I understand your question correctly. Are you asking for
the best way to detect whether power-of-two textures are required on a
particular system? Well, what *should* work 100% is checking the OpenGL
version (with glGetString(GL_VERSION)). But since this is not always enough
for ATI, and maybe also for Intel, I check after creating a texture whether
the texture is valid (with glIsTexture), and try making it a power of two
if it's not.

Cheers,
Almar
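[A sketch of the validity check Almar describes. One caveat: glIsTexture()
only reports whether the name refers to a texture object (true once the name
has been bound), so pairing it with glGetError() is a more direct test of
whether the upload itself succeeded:

from OpenGL.GL import glIsTexture, glGetError, GL_NO_ERROR

def texture_ok(tex):
    # glGetError() also clears the error flag as a side effect.
    return bool(glIsTexture(tex)) and glGetError() == GL_NO_ERROR
]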
From: Ian M. <geo...@gm...> - 2010-09-16 15:55:51
On Thu, Sep 16, 2010 at 12:39 AM, Almar Klein <alm...@gm...> wrote:

> I'm not sure if I understand your question correctly. Are you asking for
> the best way to detect whether power-of-two textures are required on a
> particular system? Well, what *should* work 100% is checking the OpenGL
> version (with glGetString(GL_VERSION)). But since this is not always
> enough for ATI, and maybe also for Intel, I check after creating a
> texture whether the texture is valid (with glIsTexture), and try making
> it a power of two if it's not.

Originally, I was trying to track down the problem, and it turned out to be
that the textures weren't powers of two. The real solution is to habitually
use textures that are sized to a power of two, to ensure compatibility.

Ian
From: Greg E. <gre...@ca...> - 2010-09-16 22:26:40
Almar Klein wrote:

> Well, what *should* work 100% is checking the OpenGL
> version (with glGetString(GL_VERSION)).

No, this won't work 100%. It's better to test for the presence of a
specific extension, in this case "GL_ARB_texture_non_power_of_two".

--
Greg
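[A sketch of the extension test Greg suggests: look for the string in
glGetString(GL_EXTENSIONS). This is Python-2-era code; on Python 3, PyOpenGL
returns bytes here. PyOpenGL also ships a helper,
OpenGL.extensions.hasGLExtension, that does roughly this:

from OpenGL.GL import glGetString, GL_EXTENSIONS

def has_npot_extension():
    extensions = glGetString(GL_EXTENSIONS).split()
    return "GL_ARB_texture_non_power_of_two" in extensions
]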
From: Almar K. <alm...@gm...> - 2010-09-17 09:01:22
On 17 September 2010 00:26, Greg Ewing <gre...@ca...> wrote:

> Almar Klein wrote:
>
>> Well, what *should* work 100% is checking the OpenGL
>> version (with glGetString(GL_VERSION)).
>
> No, this won't work 100%. It's better to test for the presence of a
> specific extension, in this case "GL_ARB_texture_non_power_of_two".

Testing only the version *does* make sure that the system supports
non-power-of-two textures: their support is mandatory in OpenGL 2.0 and
later. By also testing for the extension, you can detect that the system
supports non-power-of-two textures even when the version is lower than 2.0.
The latter is obviously a nicer solution, indeed.

Almar
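[A sketch combining both checks from this exchange: non-power-of-two support
is core in OpenGL 2.0 and later, and a pre-2.0 context may still expose the
extension:

from OpenGL.GL import glGetString, GL_VERSION, GL_EXTENSIONS

def npot_supported():
    number = glGetString(GL_VERSION).split()[0]  # e.g. "1.4.0"
    major, minor = [int(x) for x in number.split(".")[:2]]
    if (major, minor) >= (2, 0):
        return True  # mandatory in core OpenGL 2.0+
    extensions = glGetString(GL_EXTENSIONS).split()
    return "GL_ARB_texture_non_power_of_two" in extensions
]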