Thread: [PyOpenGL-Users] Shader doesn't compile with Python 3
From: Matt W. <li...@mi...> - 2013-04-14 13:43:03
Hi all,
I'm really enjoying using PyOpenGL. I've just switched over to Python 3,
but I've hit a problem compiling my shaders. As an example, consider this
code:
---------- test.py
from OpenGL.GL import shaders, GL_VERTEX_SHADER
import pygame
screen = pygame.display.set_mode((800, 800), pygame.OPENGL)
VERTEX_SHADER = shaders.compileShader("""
#version 130
void main()
{
}
""", GL_VERTEX_SHADER)
----------
I'm running all this on Linux (openSUSE 12.3) with the following
configuration:
GL_RENDERER: GeForce GTX 560 Ti/PCIe/SSE2
GL_VERSION: 4.3.0 NVIDIA 313.30
GL_SHADING_LANGUAGE_VERSION: 4.30 NVIDIA via Cg compiler
If I run this code with 'python test.py' then it runs fine (briefly flashes
a window without complaint). However, when running under Python 3 with
'python3 test.py' it gives the following error:
----------
Traceback (most recent call last):
File "./test.py", line 14, in <module>
""", GL_VERTEX_SHADER)
File
"/home/matt/.local/lib/python3.3/site-packages/OpenGL/GL/shaders.py", line
233, in compileShader
shaderType,
RuntimeError: ('Shader compile failure (0):\n0(1) : error C0130: invalid
character literal\n0(1) : error C0000: syntax error, unexpected special
error tag, expecting "::" at token "<error>"\n0(1) : error C0130: invalid
character literal\n', [b'\n\t#version 130\n\t\n\tvoid
main()\n\t{\n\t}\n\t'], GL_VERTEX_SHADER)
----------
Given that the first error is 'invalid character literal', I'm assuming
that this is an encoding problem. It looks like (as far as I can tell)
PyOpenGL is sending an ASCII-encoded byte string but maybe that's not what
is expected?
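In case it helps to see what I mean about the encoding, the literal itself
changes type between the two interpreters (plain Python, nothing
PyOpenGL-specific):
----------
# Under Python 2 a triple-quoted literal is already a byte string;
# under Python 3 it is unicode text, so whatever encoding PyOpenGL
# applies before handing it to glShaderSource now comes into play.
SRC = """
#version 130
void main()
{
}
"""
print(type(SRC))                  # Python 2: <type 'str'>, Python 3: <class 'str'> (text)
print(type(SRC.encode('ascii')))  # Python 2: <type 'str'>, Python 3: <class 'bytes'>
----------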
Is anyone else seeing this problem, or able to reproduce it? Does anyone
have an idea why it's happening, or a fix for it?
Regards,
Matt Williams
From: Ian M. <ia...@ge...> - 2013-04-14 15:06:14
Pass it binary strings.
From: Matt W. <li...@mi...> - 2013-04-14 20:44:07
On 14 April 2013 16:05, Ian Mallett <ia...@ge...> wrote:
> Pass it binary strings.
If I do that with:
VERTEX_SHADER = shaders.compileShader(b"""
#version 130
void main()
{
}
""", GL_VERTEX_SHADER)
then it still works correctly on Python 2.7 but with Python 3.3 it errors with:
RuntimeError: ('Shader compile failure (0):\n0(1) : error C0125:
integer constant overflow\n0(1) : error C0000: syntax error,
unexpected unsigned integer constant at token "<uint-const>"\n', [10,
9, 35, 118, 101, 114, 115, 105, 111, 110, 32, 49, 51, 48, 10, 9, 10,
9, 118, 111, 105, 100, 32, 109, 97, 105, 110, 40, 41, 10, 9, 123, 10,
9, 125, 10, 9], GL_VERTEX_SHADER)
So it appears to be passing a list of integers corresponding to the correct
ASCII characters.
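(That matches how Python 3 treats a bytes object when something iterates
over it -- each element is an int, whereas Python 2 gives one-character
strings. A quick check, nothing PyOpenGL-specific:)

# Iterating a bytes object in Python 3 yields integers, which matches
# the start of the list in the error above; Python 2 yields characters.
src = b"\n\t#version 130\n"
print(list(src))
# Python 3: [10, 9, 35, 118, 101, 114, 115, 105, 111, 110, 32, 49, 51, 48, 10]
# Python 2: ['\n', '\t', '#', 'v', 'e', 'r', 's', 'i', 'o', 'n', ' ', '1', '3', '0', '\n']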
The Python source file itself is utf-8 encoded but since I'm only
using the ASCII range of characters, it's the same as if it were ASCII
encoded.
I have the same problem if I do:
VERTEX_SHADER = shaders.compileShader("""
#version 130
void main()
{
}
""".encode('ascii'), GL_VERTEX_SHADER)
with an identical error.
Regards,
Matt Williams
From: Chris B. - N. F. <chr...@no...> - 2013-04-15 15:14:53
I'm not using py3, but maybe you need to encode the string into a
bytes object using ASCII.
-Chris
From: Mike C. F. <mcf...@vr...> - 2013-04-16 02:43:13
On 13-04-14 04:43 PM, Matt Williams wrote:
> On 14 April 2013 16:05, Ian Mallett <ia...@ge...> wrote:
>> Pass it binary strings.
> If I do that with:
>
> VERTEX_SHADER = shaders.compileShader(b"""
> #version 130
>
> void main()
> {
> }
> """, GL_VERTEX_SHADER)
>
> then it still works correctly on Python 2.7 but with Python 3.3 it errors with:
The above code (with or without the b) works with Python 2.7, 3.2 and
3.3 on Ubuntu 64-bit with fglrx drivers when using bzr head. bzr head
can *also*, for the first time, run test_core.py on all of those
platforms (when numpy and pygame are installed); however, I don't see
anything in the changes I made that should have caused the error you saw.
If you can test on bzr head, that would let me know whether we're
looking at something already fixed, or a continuing bug.
HTH,
Mike
--
________________________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://www.vrplumber.com
http://blog.vrplumber.com
From: Matt W. <li...@mi...> - 2013-04-16 20:05:47
On 16 April 2013 03:43, Mike C. Fletcher <mcf...@vr...> wrote:
> If you can test on bzr head, that would let me know whether we're
> looking at something already fixed, or a continuing bug.

I've just tested it with bzr head and it seems the issue is resolved there. It appears to compile the shaders without problem.

I also ran test_core.py and it ran fine without any apparent failures.

I am noticing a new and different problem (something ctypes-related, I think) but I'll investigate that a little first and post on the list if I can't resolve it.

cheers,
Matt Williams
From: Gabriele L. <gab...@gm...> - 2013-05-29 05:55:12
I'm having the same problem in Python 3 (not being able to compile the shaders because it gets an invalid character literal). Did you find any actual workaround for that?
From: Chris B. - N. F. <chr...@no...> - 2013-05-29 16:24:06
On Tue, May 28, 2013 at 10:53 PM, Gabriele Lanaro <gab...@gm...> wrote:
> I'm having the same problem in python 3 (not being able to compile the
> shaders because it gets an invalid literal). Did you find any actual
> workaround for that?

IIUC shaders have to be written in ASCII; perhaps with Py3 you are passing unicode in -- try encoding it to ASCII and passing the bytes object instead.

Or maybe I totally misunderstand the issue!

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA 98115        (206) 526-6317   main reception

Chr...@no...
From: Gabriele L. <gab...@gm...> - 2013-05-29 18:49:32
@Chris yes, that's basically the issue. The function glShaderSource in py3
wants a 'str' object instead of a 'bytes' one. If you pass a 'bytes' you
have to convert it to 'str'. And in the non-bzr version there is probably
some code that doesn't correctly handle byte strings.
I was able to solve the problem by backporting the function compileShader.
This way it works with both py2 and py3:
from OpenGL.GL import (
    glCreateShader, glShaderSource, glCompileShader,
    glGetShaderiv, glGetShaderInfoLog, GL_COMPILE_STATUS,
)

def compileShader( source, shaderType ):
    """Compile shader source of given type

    source -- GLSL source-code for the shader
    shaderType -- GLenum GL_VERTEX_SHADER, GL_FRAGMENT_SHADER, etc,

    returns GLuint compiled shader reference
    raises RuntimeError when a compilation failure occurs
    """
    # Normalise the source to a list of text strings, which is what the
    # Python 3 glShaderSource wrapper expects.
    if isinstance(source, str):
        source = [source]
    elif isinstance(source, bytes):
        source = [source.decode('utf-8')]
    shader = glCreateShader(shaderType)
    glShaderSource(shader, source)
    glCompileShader(shader)
    result = glGetShaderiv(shader, GL_COMPILE_STATUS)
    if not(result):
        # TODO: this will be wrong if the user has
        # disabled traditional unpacking array support.
        raise RuntimeError(
            """Shader compile failure (%s): %s"""%(
                result,
                glGetShaderInfoLog( shader ),
            ),
            source,
            shaderType,
        )
    return shader
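To use it, just call this helper in place of shaders.compileShader; for
example, something like this, with the same window setup as Matt's test.py
at the top of the thread:

import pygame
from OpenGL.GL import GL_VERTEX_SHADER

# compileShader here is the backported helper defined above, used
# instead of OpenGL.GL.shaders.compileShader.
pygame.init()
screen = pygame.display.set_mode((800, 800), pygame.OPENGL)

VERTEX_SHADER = compileShader("""
#version 130
void main()
{
}
""", GL_VERTEX_SHADER)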
From: Matt W. <li...@mi...> - 2013-05-29 17:19:48
On 29 May 2013 06:53, Gabriele Lanaro <gab...@gm...> wrote:
> I'm having the same problem in python 3 (not being able to compile the
> shaders because it gets an invalid literal). Did you find any actual
> workaround for that?

The only solution I found was to update to the latest version in bzr. Have you tried that?

I agree that it's not an ideal solution but it worked for me.

Matt
From: Mike C. F. <mcf...@vr...> - 2013-05-30 17:11:33
On 13-05-29 12:23 PM, Matt Williams wrote:
> On 29 May 2013 06:53, Gabriele Lanaro <gab...@gm...> wrote:
>> I'm having the same problem in python 3 (not being able to compile the
>> shaders because it gets an invalid literal). Did you find any actual
>> workaround for that?
> The only solution I found was to update to the latest version in bzr.
> Have you tried that?
>
> I agree that it's not an ideal solution but it worked for me.
I've released a 3.1.0a1 on PyPI (hidden, so they are not automatically
installed when users ask to install PyOpenGL). You should be able to
pull the equivalent of bzr head with:
pip install "PyOpenGL==3.1.0a1" "PyOpenGL-accelerate==3.1.0a1"
with the caveat that those are source-code-only releases at the moment,
so your target platform will need a Python-module compilation
environment available. The tarballs include the still-in-process egl
and es[1,2,3] packages; those are *not* ready for *any* use.
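A quick way to check which version actually got picked up after installing
(OpenGL.__version__ should report it):
python3 -c "import OpenGL; print(OpenGL.__version__)"
which should print 3.1.0a1 rather than the last stable release.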
HTH,
Mike
--
________________________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://www.vrplumber.com
http://blog.vrplumber.com