Re: [Secureideas-base-devel] One last valiant effort before break
From: Chris S. <ch...@co...> - 2004-11-24 06:21:29
Kevin Johnson wrote:

>> The first thing that will be blatantly obvious to you is that we spend
>> a lot of time including and defining "stuff". Looking at one of the
>> traces I see that we actually define over 450 items and include (in
>> one way or another) 23 files.
>
> Damn, I knew this, but having it spelled out was a shock!

Well, from my experience with other massive PHP projects (namely Horde +
modules), this isn't really an issue.

>> I'll let you all digest the larger trace file (mainly because it is
>> quite a lot to swallow) but one thing sticks out enormously in that
>> trace: the fact that adodb takes up a huge chunk of the file in doing
>> its operations.
>
> Yes, adodb does take a lot of the trace, but is that because we do
> almost everything from the database?

Possibly. We don't really know how much of it comes down to ADODB's own
inefficiencies. This was one of the reasons I was pushing for PEAR::DB
over ADODB: the PEAR developers generally seem to know what they're
doing, and aren't attempting ASP-style connection creation.

>> Mind you that I run BASE on a Pentium 166 with ~128 MB of RAM and
>> several other server apps running simultaneously, but no matter who
>> you talk to, the fact that it takes 8 seconds to load the homepage is
>> a sacrilege. Sorry, but I've seen graphing software that pulls from
>> _million_-line databases run that fast; I should be able to smoke
>> that graphing software with my _30k_-line database.

Well, it depends. What was the graphing software running on? Did it use
views on the database side of things?

>>  * ADODB. I believe the scripts do way more than we care for them to
>>    do. It supports more than 16 databases, compared to BASE's claimed
>>    3, and I just think that overall it is overkill. We may be
>>    suffering massive performance hits just to make the code a tad bit
>>    easier to read.
>
> I agree. We need to look at the data access layer and maybe what we
> need to do IS build our own abstraction.
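To make the "build our own abstraction" idea concrete, here's a rough
sketch of what a thin, BASE-specific data access layer could look like.
Everything below is hypothetical -- the class and method names are my
invention, not existing BASE or ADODB code -- and it assumes the classic
mysql_* extension:

```php
<?php
// Hypothetical sketch only: one small driver class per backend BASE
// actually supports, instead of ADODB's 16+ database drivers.
class BaseDB_MySQL
{
    var $link;

    // Open a connection and select the BASE database.
    function connect($host, $user, $pass, $db)
    {
        $this->link = mysql_connect($host, $user, $pass);
        return $this->link && mysql_select_db($db, $this->link);
    }

    // Run a query and hand back the raw result resource.
    function query($sql)
    {
        return mysql_query($sql, $this->link);
    }

    // Fetch one row of the result as a numerically indexed array.
    function fetch_row($result)
    {
        return mysql_fetch_row($result);
    }
}
```

Supporting PostgreSQL or MSSQL would then mean writing one more small
class with the same three methods, rather than carrying the whole ADODB
feature set on every page load.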
> We do only support 3 right now, and even with adding Oracle that only
> brings us to 4.

>>  * Includes are out of control. Sorry, but I think the fact that we
>>    include at the _very_ least 15 files on every page is a really
>>    poor decision. Just for kicks I went through the
>>    base_include.inc.php file and commented out include lines one at a
>>    time, refreshing my base_main.php page with each comment. I was
>>    able to get the exact same page with about 80% of those lines
>>    commented out. This tells me that we're including stuff we never
>>    use, which is obviously a waste of resources and a tax on PHP.
>
> The includes were one of the many things we carried over from ACID. I
> agree that we need to audit what gets included and stop using the
> include_everything_file.php

>>  * Defines may be out of control. Usually I'm pretty lax with
>>    constants because, as with this project and other PHP projects,
>>    we use language files which contain all our output, so it's
>>    expected that we would have a lot of defines. The question I'm
>>    wondering, then (and this goes along the same lines as the second
>>    bullet: overkill), is whether defining words that we never use on
>>    the page is giving us a noticeable performance hit. What could we
>>    do to make the load a little lighter? Would it be possible to
>>    declare, say, a variable at the start of every page that holds the
>>    name of the page, and then have the language files encased in a
>>    switch which selects the correct defines to run based on the page
>>    that is being loaded?

This really is irrelevant, and comes down to "how long does it take a
compiled app to process 100 lines we don't use out of a 300-line text
file, and how much RAM does it end up using?" I suggest that instead of
such ponderings we allocate someone to performance testing to actually
see where our bottlenecks are. Without hard data all we can say is "it's
slow somewhere".
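For what it's worth, the per-page switch idea could look something like
the sketch below. This is purely illustrative -- the function name and
the constant/page names are mine, not BASE's actual language-file
layout:

```php
<?php
// Hypothetical sketch: only create the defines the current page needs,
// instead of define()ing an entire language file on every request.
function base_load_lang($page)
{
    // Defines shared by every page.
    define('_ERROR', 'Error');

    switch ($page) {
        case 'base_main':
            define('_TITLE', 'Basic Analysis and Security Engine');
            define('_QUERYDB', 'Query DB');
            break;
        case 'base_stat_alerts':
            define('_ALERTLIST', 'Alert Listing');
            break;
        // ...one case per page, split out of english.lang.php...
    }
}

base_load_lang('base_main');
// _TITLE and _ERROR now exist; _ALERTLIST was never defined, so the
// base_stat_alerts strings cost this page nothing.
```

Whether skipping the unused define() calls actually buys anything
measurable is exactly the kind of question the performance testing
should answer first.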
;) I'll volunteer to play around with it, as I've done webapp stress
testing and performance bottleneck identification before -- in an
in-house on-the-fly translation engine, no less.

> This would cause us to repeat definitions across pages, but shouldn't
> be a problem. Then we maintain a single file that we send to people to
> translate, and when they return it, we break it up over the various
> files. The english.lang.php file already has the start of comments as
> to what page is using what... of course I didn't do that everywhere.

In my experience, the size of the database (flatfile or otherwise) that
the translations are read from isn't as much of an issue as keeping the
translations maintainable.

>> I think ado, the includes, and the defines are the points we need to
>> consider. As that trace clearly points out, ado is wasting a lot of
>> our time with its overabundance of features. Yeah, the includes are a
>> problem and so are the defines, but 7 seconds of loading was all
>> thanks to ado.

Why not just see if switching to something else (home-grown or
otherwise) would yield higher speeds? If ~85% of the problem can be
fixed by altering one piece of functionality, why not try fixing that
first, then move on to the next-highest item on the list of what slows
the app down?

Have a happy holiday. :)

--
Chris Shepherd
Wise man say, chemist who falls into acid, absorbed in his work.