From: Robert S. K. <rs...@ds...> - 2005-02-28 15:25:57
Hi,

I have a box on which I run MythTV (a homebrew PVR app). I recently upgraded my X server to use DRI. DRI is definitely enabled: my glxgears count went from ~130 fps to ~500 fps, and glxinfo reports direct rendering enabled.

However, I'm now seeing performance issues under MythTV and in several other, seemingly unrelated areas. In general, when I drag windows around the screen, or when MythTV plays video, I see a spike in my X server's CPU utilization. Compared with before I installed DRI, my X server's %CPU is several times higher than previously reported (30-50%, where it previously never broke 10%).

I'm using FC3 (kernel 2.6.10) with the CVS DRI code. My machine is an Athlon XP 2400 with 512 MB of RAM and an onboard ProSavageDDR chip; this motherboard uses shared system memory for the graphics card.

I've tried several different sets of compiler flags with no apparent effect, and I've looked up and down the config files for X, Mesa, and the DRM for debug flags, compiler optimizations, etc., all to no avail. I've set the AGPSize and AGPMode options in my server config appropriately, and I've also tried both settings of the BCI option for Xv.

I'm at a loss as to what might be causing this loss in performance. When rendering video, MythTV is admittedly a memory-bandwidth hog. It also uses Xv to scale the video (Xorg.0.log reports that hardware Xv scaling is on).

I've seen some discussion recently on the users list about DRM and the savage driver. Is it possible that the stock X server driver enables DRM for areas that the shiny new one doesn't? If not, are there any options in my config files that I should take a look at for ideas?

Thanks in advance,
Rob
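
P.S. For reference, the relevant Device section of my xorg.conf looks roughly like the following. The option values shown here are placeholders to illustrate which options I mean, not a verbatim copy of my settings:

```
Section "Device"
    Identifier  "Savage"
    Driver      "savage"
    # AGP aperture size (MB) and AGP transfer mode --
    # the values 16 and 4 are illustrative, not my exact settings.
    Option      "AGPSize"   "16"
    Option      "AGPMode"   "4"
    # BCI for Xv -- I've tried this both "true" and "false".
    Option      "BCIforXv"  "true"
EndSection
```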