From: Lucas C. V. <lv...@gm...> - 2009-02-17 02:10:51
This is happening on two different computers, and may be related to two other problems I saw. I am bringing it here in the hope that some of you may have a clue about how to solve it.

I have a team developing a game using the Ogre3D engine, and one of my teammates has a notebook with an Intel video adapter (GM965/GL960 according to lspci). Using Ubuntu 8.04, with the stock Mesa and drivers (Mesa 7.0.3), he was able to run our game. But after upgrading to Ubuntu 8.10, with Mesa 7.2, the game started to run at less than 1 FPS, with DRI and everything installed (OpenGL renderer string: Mesa DRI Intel(R) 965GM 20061102 x86/MMX/SSE2), and I can almost see the drawing taking place on screen. I have a PC with an Intel i945 and saw exactly the same problem after upgrading from Ubuntu 8.04 to Ubuntu 8.10.

There is a bug report that seems to be about this problem, but nobody knows the cause. The suggested workaround of adding the following lines to xorg.conf:

    Option "XaaNoPixmapCache"
    Option "XAANoOffscreenPixmaps" "1"
    Option "DRI" "true"
    Option "AccelMethod" "XAA"

only improved glxgears performance; the game runs at the same bad speed.

I saw two other systems with problems that may be related. My father's notebook, with an ATI RS690 chip and Ubuntu 8.10, could not run the game satisfactorily: it looked like the color buffer was being flipped before the frame was finished. The objects in the scene looked transparent and incomplete, and I was able to see parts that should have been occluded by other, undrawn primitives. Using the proprietary driver I was able to run the game, but that is not a solution, since it's not Mesa and is not open source. Lastly, on my boss's notebook, with some Intel chip and Fedora (I am not sure if 9 or 10), I got the same problem as on Ubuntu 8.10 with Intel chips.

So, what I have gathered: a recent distro plus OGRE plus Mesa causes the problem. Do you have any clue why?

--
Lucas Clemente Vella
lv...@gm...

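For reference, those workaround options belong in the Device section of xorg.conf; a sketch of a complete section, assuming the stock "intel" driver (the Identifier string is just a placeholder):

    Section "Device"
        Identifier "Configured Video Device"
        Driver     "intel"
        Option     "XaaNoPixmapCache"
        Option     "XAANoOffscreenPixmaps" "1"
        Option     "DRI"          "true"
        Option     "AccelMethod"  "XAA"
    EndSection
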
From: Philipp K. K. <pk...@sp...> - 2009-02-17 10:07:59
Lucas Clemente Vella schrieb:
> So, what I have gathered: a recent distro plus OGRE plus Mesa causes
> the problem. Do you have any clue why?

Just a guess: newer Mesa offers new feature X, OGRE sees it and enables rendering path Y, which needs X but also uses Z, and Z triggers a software fallback because the drivers don't support it. You might want to compare the glxinfo output of the configurations where Ogre is slow against those where it's fast to check this.

Philipp

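To make that comparison easy to repeat on each machine, here is a minimal C++/GLX probe that prints the same strings glxinfo reports; this is a sketch, not anything OGRE- or game-specific (build with: g++ glprobe.cpp -lX11 -lGL):

    // glprobe.cpp: print the renderer, version and extension strings for
    // a typical visual, so a slow machine can be diffed against a fast one.
    #include <cstdio>
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    int main()
    {
        Display *dpy = XOpenDisplay(0);
        if (!dpy) { std::fprintf(stderr, "cannot open display\n"); return 1; }

        // A double-buffered RGBA visual, like a typical game would request.
        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) { std::fprintf(stderr, "no suitable visual\n"); return 1; }

        // A tiny unmapped window with a matching colormap is enough to
        // make a context current; nothing is ever drawn.
        XSetWindowAttributes swa;
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                       vi->visual, AllocNone);
        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0,
                                   16, 16, 0, vi->depth, InputOutput,
                                   vi->visual, CWColormap, &swa);

        GLXContext ctx = glXCreateContext(dpy, vi, 0, True);
        glXMakeCurrent(dpy, win, ctx);

        std::printf("renderer:   %s\n", (const char *)glGetString(GL_RENDERER));
        std::printf("version:    %s\n", (const char *)glGetString(GL_VERSION));
        std::printf("extensions: %s\n", (const char *)glGetString(GL_EXTENSIONS));

        glXMakeCurrent(dpy, None, 0);
        glXDestroyContext(dpy, ctx);
        XDestroyWindow(dpy, win);
        XCloseDisplay(dpy);
        return 0;
    }
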
From: Lucas C. V. <lv...@gm...> - 2009-02-20 20:02:02
I still have no success finding the cause of the slowdown. While I don't know how to use oprofile properly, this is the best info I could get from it:

72.02% of the time is spent inside glibc, on memcpy.

Depending on the opreport parameters, on the same profiling run I also get:

    samples  %        linenr info                image name                                        symbol name
    449      87.1845  (no location information)  [vdso] (tgid:18705 range:0xb804d000-0xb804e000)  (no symbols)

Does all this memcopying mean anything to you?

--
Lucas Clemente Vella
lv...@gm...

From: Lucas C. V. <lv...@gm...> - 2009-02-25 20:22:42
2009/2/20 Lucas Clemente Vella <lv...@gm...>:
> I still have no success finding the cause of the slowdown. While I
> don't know how to use oprofile properly, this is the best info I could
> get from it:
>
> 72.02% of the time is spent inside glibc, on memcpy.
> [...]
> Does all this memcopying mean anything to you?

I found that with scenes that use little memory, I get about the same speed I used to have. In my test level I have no textures, a few vertex buffers and hundreds of polygons; the game runs at a capped 30 FPS with 17% CPU usage. In the real level, I have many textures and many vertex buffers (both managed by Ogre) and, trusting the scene culling, about the same number of polygons; the game runs at about 1 FPS, consuming 100% of the CPU.

So, I figured out that the real level runs at the same speed it ran at on the old system when my textures were not compressed. Back then, when I compressed the textures with DXT1, the game started to run at a not-so-bad 10 FPS. Now it is slow again. Judging by the profiler info, I guess the system update somehow screwed up my shared memory size, so memcpys from main memory are needed. Does that make sense? Can I get the old behavior back?

Also, running a debug build of Mesa (Ubuntu sources, just compiled again), I get this message:

    Mesa warning: couldn't open libtxc_dxtn.so, software DXTn compression/decompression unavailable

Is it somehow affecting me?

--
Lucas Clemente Vella
lv...@gm...

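For context on the DXT1 point: when textures are compressed offline the way Lucas describes, the application (or Ogre on its behalf) hands the ready-made DXT1 blocks straight to the driver, so no compressor runs at load time. A minimal sketch of that upload path, assuming a current GL context and Mesa-style headers:

    // Upload a texture that was compressed offline: the DXT1 blocks go to
    // the driver as-is.
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>

    GLuint uploadDxt1(const unsigned char *blocks, int width, int height)
    {
        // DXT1 stores each 4x4 texel block in 8 bytes.
        GLsizei size = ((width + 3) / 4) * ((height + 3) / 4) * 8;

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                               GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                               width, height, 0, size, blocks);
        return tex;
    }
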
From: Philipp K. K. <pk...@sp...> - 2009-02-27 23:02:55
Lucas Clemente Vella schrieb:
> So, I figured out that the real level runs at the same speed it ran at
> on the old system when my textures were not compressed. Back then, when
> I compressed the textures with DXT1, the game started to run at a
> not-so-bad 10 FPS.

Did you compress them manually or does OpenGL compress them at application runtime? In the latter case you need libtxc_dxtn.

> [...]
>
> Also, running a debug build of Mesa (Ubuntu sources, just compiled
> again), I get this message:
>
>     Mesa warning: couldn't open libtxc_dxtn.so, software DXTn compression/decompression unavailable
>
> Is it somehow affecting me?

Just try installing libtxc_dxtn (may be illegal depending on your jurisdiction).

Philipp

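The runtime-compression case Philipp asks about looks different at the API level: the application submits plain RGBA texels but requests a compressed internal format, and then someone has to run the DXT encoder at upload time; in Mesa's software paths that someone is libtxc_dxtn. A sketch, again assuming a bound texture and Mesa-style headers:

    // Ask the driver to compress at upload time: plain RGBA texels in,
    // S3TC-compressed storage requested.
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>

    void uploadAndCompress(const unsigned char *rgba, int width, int height)
    {
        glTexImage2D(GL_TEXTURE_2D, 0,
                     GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,   // requested storage
                     width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, rgba);   // uncompressed input
    }
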
From: Lucas C. V. <lv...@gm...> - 2009-03-03 20:41:07
2009/2/27 Philipp Klaus Krause <pk...@sp...>:
> Did you compress them manually or does OpenGL compress them at
> application runtime? In the latter case you need libtxc_dxtn.
Compressed manually.
> Just try installing libtxc_dxtn (may be illegal depending on your
> jurisdiction).
>
> Philipp
Thanks for the help.
I finally came to a solution: this line in the .drirc configuration file
(I would not have found it without the great DRIconf application):
<option name="force_s3tc_enable" value="true" />
There is no need for libtxc_dxtn, since the i945 (and all the chips I
know of) has hardware-accelerated DXTn compression; but it seems that
if Mesa can't find the lib, it disables support for compression
entirely. In my case, Ogre was then decompressing the textures in
software. The option forces Mesa to expose the compression even
without the software fallback (IMHO, that should be the default), and
my game runs faster than ever (really, full speed, better than on
Ubuntu 8.04). It seems the textures now fit in graphics memory.
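
In a complete .drirc the option nests inside device and application elements; a sketch, assuming the i915 DRI driver that serves the i945 (the screen and driver attributes are the parts to adapt):

    <driconf>
        <device screen="0" driver="i915">
            <application name="Default">
                <option name="force_s3tc_enable" value="true" />
            </application>
        </device>
    </driconf>
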
One caveat, though: I was not using double buffering, so I could see
the scene being drawn, which is really bizarre. Enabling double
buffering solved this problem.
--
Lucas Clemente Vella
lv...@gm...
From: Philipp K. K. <pk...@sp...> - 2009-03-03 21:00:12
Lucas Clemente Vella schrieb:
> I finally came to a solution: this line in the .drirc configuration
> file (I would not have found it without the great DRIconf application):
>
> <option name="force_s3tc_enable" value="true" />
>
> There is no need for libtxc_dxtn, since the i945 (and all the chips I
> know of) has hardware-accelerated DXTn compression; but it seems that
> if Mesa can't find the lib, it disables support for compression
> entirely. In my case, Ogre was then decompressing the textures in
> software. The option forces Mesa to expose the compression even
> without the software fallback (IMHO, that should be the default),

No, it shouldn't be the default. Enabling this option breaks OpenGL. libtxc_dxtn is needed if you want S3TC texture (de)compression in your Mesa-provided OpenGL; hardware support for S3TC is not enough. There will always be cases where Mesa has to fall back to software, at least partially, when you're using some feature that your hardware or the driver doesn't support (maybe it's glDrawPixels(), maybe it's some fog option, maybe it's the depth test, maybe it's the render mode, whatever). And at that moment things will break apart, since the software fallback won't support S3TC without libtxc_dxtn.

Oh, and none of the chips I know of has hardware support for S3TC _compression_ (though most have support for hardware _decompression_). The S3TC extension requires support for _compression_, too.

Philipp

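One way to see which of these paths a given texture actually took is to ask GL how it was stored after the upload; a small sketch, assuming a current context and the texture bound:

    // After an upload, query how level 0 of the bound texture was
    // actually stored: compressed or not, and with which internal format.
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <cstdio>

    void reportStorage()
    {
        GLint compressed = 0, internalFormat = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_COMPRESSED, &compressed);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
        std::printf("compressed: %s, internal format: 0x%x\n",
                    compressed ? "yes" : "no", internalFormat);
    }
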