[PyOpenGL-Users] Shader doesn't compile with Python 3
From: Matt W. <li...@mi...> - 2013-04-14 13:43:03
Hi all,
I'm really enjoying using PyOpenGL. I've just switched over to Python 3,
but I've hit a problem compiling my shaders. As an example, consider this
code:
---------- test.py
from OpenGL.GL import shaders, GL_VERTEX_SHADER
import pygame

# Shader compilation needs a current OpenGL context
screen = pygame.display.set_mode((800, 800), pygame.OPENGL)

# Compile a minimal vertex shader; compileShader raises
# RuntimeError if compilation fails
VERTEX_SHADER = shaders.compileShader("""
#version 130
void main()
{
}
""", GL_VERTEX_SHADER)
----------
I'm running all this on Linux (openSUSE 12.3) with the following
configuration:
GL_RENDERER: GeForce GTX 560 Ti/PCIe/SSE2
GL_VERSION: 4.3.0 NVIDIA 313.30
GL_SHADING_LANGUAGE_VERSION: 4.30 NVIDIA via Cg compiler
If I run this code with 'python test.py', it runs fine (briefly flashes a
window without complaint). However, running it under Python 3 with
'python3 test.py' gives the following error:
----------
Traceback (most recent call last):
  File "./test.py", line 14, in <module>
    """, GL_VERTEX_SHADER)
  File "/home/matt/.local/lib/python3.3/site-packages/OpenGL/GL/shaders.py", line 233, in compileShader
    shaderType,
RuntimeError: ('Shader compile failure (0):\n0(1) : error C0130: invalid character literal\n0(1) : error C0000: syntax error, unexpected special error tag, expecting "::" at token "<error>"\n0(1) : error C0130: invalid character literal\n', [b'\n\t#version 130\n\t\n\tvoid main()\n\t{\n\t}\n\t'], GL_VERTEX_SHADER)
----------
Given that the first error is 'invalid character literal', I'm assuming
this is an encoding problem. As far as I can tell, PyOpenGL is sending an
ASCII-encoded byte string, but maybe that's not what the driver expects?
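If it is an encoding issue, I wonder whether encoding the source to bytes
myself would sidestep whatever conversion is going wrong. A minimal sketch,
assuming compileShader accepts a bytes object as well as a str (I haven't
verified that yet):

----------
import pygame
from OpenGL.GL import shaders, GL_VERTEX_SHADER

# Same context setup as in test.py above
screen = pygame.display.set_mode((800, 800), pygame.OPENGL)

SOURCE = """
#version 130
void main()
{
}
"""

# Encode explicitly so PyOpenGL doesn't have to guess; the GLSL
# source is plain ASCII, so 'ascii' is sufficient here
VERTEX_SHADER = shaders.compileShader(SOURCE.encode('ascii'), GL_VERTEX_SHADER)
----------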
Is anyone else seeing this problem, or can anyone reproduce it? Does anyone
have an idea of why it's happening, or a fix for it?
Regards,
Matt Williams