From: Vincent V. <viv...@ir...> - 2007-06-06 19:12:13
Dear all,

A colleague is trying to run an OpenSceneGraph application through Chromium. However, any application with a textured scene segfaults. After some investigation, I found that OSG relies on GL_MAX_TEXTURE_SIZE to resize a TEXTURE_2D to the nearest power of 2. Under a Chromium cave configuration, GL_MAX_TEXTURE_SIZE returns 0, so OSG tries to allocate a texture of size (0,0), which causes the segfault.

Could anyone confirm this bug?

Regards,
Vincent

---------------

The piece of code to insert within the osgviewer render loop (it assumes <GL/gl.h> and <stdio.h> are already included):

    GLint _maxTextureSize;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &_maxTextureSize);
    printf("GL_MAX_TEXTURE_SIZE : %d\n", _maxTextureSize);

My configuration:
- 3 Linux PCs running Fedora 5
- GPUs: ATI Radeon X800 GTO with fglrx drivers; the situation is the same with NVIDIA GPUs
- OSG version: 1.2
- Chromium version: 1.9
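For anyone hitting the same crash before this is fixed on the Chromium side, here is a minimal defensive sketch of a client-side workaround. It only clamps the queried limit; the helper name safe_max_texture_size is hypothetical, and the fallback of 64 is simply the minimum GL_MAX_TEXTURE_SIZE the OpenGL spec guarantees for 2D textures:

    #include <GL/gl.h>
    #include <stdio.h>

    /* Query the maximum 2D texture size, falling back to the
     * spec-mandated minimum (64) when the implementation reports
     * an unusable value, as Chromium does here. */
    static GLint safe_max_texture_size(void)
    {
        GLint maxSize = 0;
        glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
        if (maxSize <= 0) {
            fprintf(stderr,
                    "GL_MAX_TEXTURE_SIZE reported as %d, clamping to 64\n",
                    maxSize);
            maxSize = 64;  /* minimum required by the OpenGL spec */
        }
        return maxSize;
    }

Of course this only papers over the symptom in the application; the real fix would be for Chromium to report a sensible GL_MAX_TEXTURE_SIZE under the cave configuration.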