From: Chris J. <cjn...@gm...> - 2009-04-05 03:29:50
On Fri, Apr 03, 2009 at 11:53:22AM EDT, c3kkos wrote:
>
> Ok I've got the same here
>
> old Acer Travelmate 743tl
>
> with the ugly mach64 chip on it. The M, agp 2x version......

Same as me, it's the agp version of the card/chip.

The only system where I got dri to work on this hardware was debian sarge
with a 2.4 kernel, and I do remember that I needed to load the agpgart
kernel module prior to loading the mach64 kernel module. Two other things
that I learned the hard way were that I needed to run X with 16-bit color
and a screen resolution compatible with the Mach64's miserly memory. My
laptop's LCD's native resolution is 1400x1050, but in order to get dri to
work, I had to downgrade to 1024x768. Since I never got it to work on
debian "etch" on the same laptop, I still have this old "sarge" system on
a separate partition.

> i've compiled my own kernel modules from the git source.. and that
> spits out two .ko listed in:
> /lib/modules/`uname -r`/kernel/drivers/char/drm/drm.ko && mach64.ko
> well.. everything seems to run fine.. those modules load with NO
> PROBLEMS at all..
[..]
> Straight to the point:
> at this time, it seems that DRM for mach64 + Xorg causes a kernel panic
> (no logs are created from X, so I think that happens BEFORE the
> graphical server loads up)

In any event, please forget everything I said regarding ubuntu. I don't
understand what the ubuntu packagers did in this respect, but having had
the time to look more closely at the output of glxinfo, I noticed that
although it says "direct rendering: yes", it also has something that
doesn't look promising a few lines down: "OpenGL Renderer: Software
Rasterizer". Hmm...

So, I rebooted to my old sarge system and sure enough, glxinfo says
"Direct Rendering: Yes" _but_ it also says "OpenGL Renderer: Mesa DRI
Mach64".

Not knowing exactly what this glxinfo renderer stuff really means, I
thought I'd run a few "tests" with some xscreensaver hacks (antinspect,
glmatrix, blocktube, dangerball, GLForestFire.. etc.) as well as glxgears
on both my old sarge system and the ubuntu system (on the same multi-boot
laptop) and compare the results.

Well, the difference was immediately noticeable and also quite
measurable, at least if you trust the -fps options of these programs.
Basically, on my old sarge system I was consistently getting FPS rates
roughly 5+ times higher than on the ubuntu system where dri is supposedly
enabled. True, I'm probably running different versions of these programs,
since the stuff in the sarge repositories must be about 5 years older
than the ubuntu stuff.. but all the same, there does seem to be a
pattern. As an example, at the same color depth and resolution, glxgears
runs just short of 300 FPS on my old debian sarge system; when I ran it
on ubuntu, it did about 56 FPS..!

So, to paraphrase what you said, I thought dri was a matter of 1, you
have it, or 0, you don't.. Not so, the ubuntu packagers seem to have it
"half-enabled". :-)

> my kernel is 2.6.29 on a Debian box. COME ON BOYS LET'S TRY TO SOLVE
> THIS!!

+1

The "ATI Technologies Inc Rage Mobility P/M AGP 2x (rev 64)" is probably
not powerful enough to make much difference for a lot of stuff, even with
dri fully (?) enabled, but on my old "sarge" system I was able to play
(or demo) Heretic II with all textures and visual effects enabled via the
"GL renderer", and Tuxracer was tolerably responsive..
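For what it's worth, and purely from memory (so take the identifiers,
the exact driver name, and whether you load the modules via /etc/modules
or by hand with modprobe as placeholders rather than gospel), the
relevant bits of my old sarge setup looked something like this: agpgart
loaded before mach64, and X pinned to 16-bit color at 1024x768:

    # /etc/modules -- agpgart has to come before the mach64 DRM module.
    # This is what worked for me on a 2.4 kernel; on 2.6 you may need a
    # chipset-specific agp module instead of plain agpgart.
    agpgart
    mach64

    # XF86Config-4 (sarge) / xorg.conf -- the parts that mattered for dri
    Section "Device"
        Identifier "Rage Mobility"     # placeholder name
        Driver     "ati"               # or "mach64", depending on the driver package
    EndSection

    Section "Screen"
        Identifier   "Default Screen"
        Device       "Rage Mobility"
        DefaultDepth 16                # 16-bit color was required on this chip
        SubSection "Display"
            Depth 16
            Modes "1024x768"           # the native 1400x1050 was too much for the card's memory
        EndSubSection
    EndSection

    Section "DRI"
        Mode 0666                      # let non-root users use dri
    EndSection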
So, even though getting this to work is not one of my priorities, I would
really like to understand this stuff a little better, and I _do_ wish
someone knowledgeable could spare a couple of minutes and give us a few
pointers.

Thanks,
CJ
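P.S. In case it helps anyone compare notes, the quick sanity check I'm
talking about above is just something along the lines of:

    glxinfo | grep -i "direct rendering"
    glxinfo | grep -i "opengl renderer"

On sarge the second line comes back with "Mesa DRI Mach64", whereas on
ubuntu it says "Software Rasterizer" even though the first line still
claims direct rendering is enabled.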