Thread: [PyOpenGL-Users] Shader problems
Brought to you by:
mcfletch
From: Tim B. <tim...@gm...> - 2008-11-16 01:12:07
|
Using Kubuntu Intrepid (8.10), PyOpenGL 3.0b6, proprietary nvidia drivers
(though I'm not sure what specific card is in this computer).  I am trying
to get shaders to work using PyOpenGL, but calling
glCreateShader(GL_FRAGMENT_SHADER) produces the exception:

Traceback (most recent call last):
  File "./main.py", line 154, in <module>
    testShader = shader.FragmentShader("testShader.glsl")
  File "/home/rya/hw3/shader.py", line 11, in __init__
    self.__shader = glCreateShader(GL_FRAGMENT_SHADER)
  File "/usr/lib/python2.5/site-packages/PyOpenGL-3.0.0b6-py2.5.egg/OpenGL/platform/baseplatform.py", line 280, in __call__
    self.__name__, self.__name__,
OpenGL.error.NullFunctionError: Attempt to call an undefined function glCreateShader, check for bool(glCreateShader) before calling

Does this mean that my video card and/or OpenGL install does not support
shaders?  If that's not what this means, then what could be going on?

Thanks!
|
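[An aside on the guard named in that traceback: PyOpenGL function wrappers test False when the driver did not expose the underlying C entry point, which is why the error suggests checking bool(glCreateShader) before calling. A minimal, self-contained sketch of that guard pattern — the `_GLFunction` class and the driver state below are hypothetical stand-ins, not PyOpenGL internals:]

```python
class _GLFunction:
    """Stand-in for a PyOpenGL function wrapper (illustration only):
    the real wrappers likewise test False when the driver did not
    expose the underlying C entry point."""

    def __init__(self, name, impl=None):
        self.__name__ = name
        self._impl = impl            # None -> entry point unresolved

    def __bool__(self):
        return self._impl is not None

    def __call__(self, *args):
        if self._impl is None:
            raise RuntimeError(
                "Attempt to call an undefined function %s" % self.__name__)
        return self._impl(*args)


# Hypothetical driver state: core entry point missing, ARB variant present.
glCreateShader = _GLFunction("glCreateShader")
glCreateShaderObjectARB = _GLFunction(
    "glCreateShaderObjectARB", impl=lambda shader_type: 1)

# The guard the error message recommends, with an ARB fallback:
create = glCreateShader if bool(glCreateShader) else glCreateShaderObjectARB
shader_id = create("GL_FRAGMENT_SHADER")  # -> 1, via the ARB stand-in
```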
From: Mike C. F. <mcf...@vr...> - 2008-11-16 18:07:20
|
Tim Bocek wrote:
> Using Kubuntu Intrepid (8.10), PyOpenGL 3.0b6, proprietary nvidia
> drivers (though I'm not sure what specific card is in this computer).
> I am trying to get shaders to work using PyOpenGL, but calling
> glCreateShader(GL_FRAGMENT_SHADER) produces the exception:
>
> [traceback snipped]
>
> Does this mean that my video card and/or opengl install does not
> support shaders? If that's not what this means, then what could be
> going on?

The possibilities are (there are probably others, but these are what come
to mind immediately):

  * Your card does not support shaders
      o possible for fairly old/low-end hardware
  * Your card only supports shaders with the ARB extension
    (glCreateShaderARB)
      o common for slightly older hardware or drivers
  * Your card only supports lower-level "program" shaders (pre-GLSL)
      o common for older integrated graphics solutions
  * Your card is currently running in non-accelerated mode
      o e.g. because of a kernel mismatch versus the X driver
  * You are calling the shader creation before you have an active
    OpenGL context
      o many implementations require a valid GL context before they
        let you create shaders and the like
  * You are calling the code on a software-only context
      o e.g. because of requesting a context which can't be hardware
        accelerated

To debug these kinds of things:

  * lspci | grep -i nvidia | grep VGA
      o should tell you your card name; you can then google to see
        what features the card supports
  * glxinfo | grep -i version
      o should tell you whether your drivers support GL version
        2.0.0 or greater
  * add glGetString( GL_VERSION ) and print the result in your code
      o if >= 2.0.0 you should have glCreateShader available
  * add glGetString( GL_EXTENSIONS ) and print the result in your code
      o if GL_ARB_shader_objects is in the list, then you need to
        use the ARB form of the functionality
  * import OpenGL.GL.ARB.shader_objects and check for
    bool( glCreateShaderObjectARB )
      o when it's true, you can use it...
  * try different (known-to-work) code for initializing the context to
    be sure you get a shader-friendly context

HTH,
Mike

-- 
________________________________________________
  Mike C. Fletcher
  Designer, VR Plumber, Coder
  http://www.vrplumber.com
  http://blog.vrplumber.com
|
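[The version/extension checks in that list amount to a small decision rule. A sketch of it as pure Python — the helper name `pick_shader_api` and the return labels are illustrative, not PyOpenGL API; the arguments are assumed to be the decoded results of glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS):]

```python
def pick_shader_api(version_string, extensions_string):
    """Encode the checklist's decision rule: GL >= 2.0 means the core
    glCreateShader family should exist; otherwise the ARB extension is
    the fallback; otherwise no GLSL path is advertised at all."""
    # GL_VERSION begins with "major.minor[.release]" and may be followed
    # by vendor text, e.g. "2.1.2 NVIDIA 177.82".
    major, minor = (int(part) for part in
                    version_string.split()[0].split(".")[:2])
    if (major, minor) >= (2, 0):
        return "core"         # glCreateShader and friends
    if "GL_ARB_shader_objects" in extensions_string.split():
        return "arb"          # glCreateShaderObjectARB etc.
    return "unsupported"      # neither GLSL path is advertised

print(pick_shader_api("2.1.2 NVIDIA 177.82", ""))  # -> core
```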
From: Tim B. <tim...@gm...> - 2008-11-16 19:23:01
|
Mike-

Thanks for your comprehensive response!  I've managed to figure out that
I need to use the ARB extensions, but that they do work (I downloaded a C
example program that convinced me of this).  Now I've got to figure out
whether there's an ARB-supporting computer I can demo on for my
instructor; if not, it doesn't look too bad to support both with a
runtime switch ;)

Thanks again!
-Tim

On Sun, Nov 16, 2008 at 11:07 AM, Mike C. Fletcher <mcf...@vr...> wrote:
> [full reply quoted above snipped]
|
From: Mike C. F. <mcf...@vr...> - 2008-11-17 00:16:22
|
Tim Bocek wrote:
> Thanks for your comprehensive response!  I've managed to figure out
> that I need to use the ARB extensions but that they do work
> (downloaded a C example program that convinced myself of this). Now I
> gotta figure out whether there's an ARB-supporting computer I can demo
> to my instructor with, if not it doesn't look too bad to support both
> with a runtime switch ;)

from OpenGL.extensions import alternate
from OpenGL.GL import *
from OpenGL.GL.ARB.shader_objects import *
from OpenGL.GL.ARB.fragment_shader import *
from OpenGL.GL.ARB.vertex_shader import *

glCreateShader = alternate( 'glCreateShader', glCreateShader, glCreateShaderObjectARB )
glShaderSource = alternate( 'glShaderSource', glShaderSource, glShaderSourceARB )
glCompileShader = alternate( 'glCompileShader', glCompileShader, glCompileShaderARB )
glCreateProgram = alternate( 'glCreateProgram', glCreateProgram, glCreateProgramObjectARB )
glAttachShader = alternate( 'glAttachShader', glAttachShader, glAttachObjectARB )
glValidateProgram = alternate( 'glValidateProgram', glValidateProgram, glValidateProgramARB )
glLinkProgram = alternate( 'glLinkProgram', glLinkProgram, glLinkProgramARB )
glDeleteShader = alternate( 'glDeleteShader', glDeleteShader, glDeleteObjectARB )
glUseProgram = alternate( 'glUseProgram', glUseProgram, glUseProgramObjectARB )
glGetProgramInfoLog = alternate( 'glGetProgramInfoLog', glGetProgramInfoLog, glGetInfoLogARB )

Should let you use the GL 2.0 versions and auto-fallback to the ARB
versions when 2.0 isn't available.

HTH,
Mike

-- 
________________________________________________
  Mike C. Fletcher
  Designer, VR Plumber, Coder
  http://www.vrplumber.com
  http://blog.vrplumber.com
|
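[The fallback idea behind those alternate() calls can be sketched without PyOpenGL at all: take the first candidate wrapper that tests true, i.e. whose entry point resolved. This is a rough, dependency-free sketch — the name `alternate_sketch` and the stand-in functions are mine, and the real alternate() is lazier, deferring the choice until first call since availability is only knowable once a context exists:]

```python
def alternate_sketch(name, *candidates):
    """Return the first candidate that tests true (entry point
    resolved), falling back through the list in order."""
    for candidate in candidates:
        if candidate:
            return candidate
    raise NotImplementedError("No implementation available for %s" % name)


# Illustrative stand-ins: a falsy value plays the unresolved core
# entry point; a plain function plays the resolved ARB variant.
core_glCreateShader = None

def arb_glCreateShaderObject(shader_type):
    return 42                       # pretend GLhandleARB

glCreateShader = alternate_sketch(
    "glCreateShader", core_glCreateShader, arb_glCreateShaderObject)
shader = glCreateShader("GL_FRAGMENT_SHADER")  # -> 42, via the ARB path
```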