From: Dominique D. <DDe...@lg...> - 2003-08-28 17:03:44
This is what I've seen from applying the patch to one of my projects, with 1569 .cpp, 1944 .h, and 48 .hpp files, so a good-sized project already, totaling 1,178,784 lines of code (including comments).

                     First Full Analysis   Subsequent Analysis
Official CppTasks:   63 seconds +/- 2      24 seconds +/- 1
Patched CppTasks:    61 seconds +/- 0      19 seconds +/- 1

Note that this timing is the full build time, including Ant startup and recursing the directory tree (the project is composed of 30+ libs), as well as deciding that 423 Java files are up to date. Compare this to the full time to build (of which only a few seconds are Java compilation): 21 minutes 28 seconds.

So all in all, I'm seeing very little benefit from this myself. The project I've tested the patch against is very mature, and very few of its headers include unnecessary headers (everything used as a pointer/reference is forward-declared). That's certainly not 1000 times faster :(

I'll thus hold off committing this patch to my version of CppTasks until Curt can look at it more closely.

I hope this helps. --DD

> -----Original Message-----
> From: Curt Arnold [mailto:ca...@ho...]
> Sent: Thursday, August 28, 2003 8:24 AM
> To: Helge Schulz; ant...@li...
> Subject: Re: [Ant-contrib-developers] Patch: cpptasks dependency
> analysis up to 1000 times faster
>
> Helge Schulz wrote:
>
> >Hello ANT-Contrib developers,
> >
> >You will find attached a cpptasks patch which speeds up include
> >dependency analysis by up to 1000 times. The patch fixes a serious
> >flaw in the analysis: it evaluates the same dependency information
> >again and again if the dependency graph isn't a strict tree without
> >cycles.
> >
> Thanks for the patch. I'll review it this weekend along with a whole
> bunch of pending stuff.
>
> One of the goals (possibly not achieved) of the original design was to
> compile the files most likely to fail first.
> For example, if you modified foo.h, then foo.c would compile first,
> then any other files that directly included 'foo.h', then any files
> that included a header file that included 'foo.h', then files that
> included a header that included a header that included 'foo.h', etc.
> Performance might require a two-step algorithm: first determining
> which files need to be recompiled, then trying to determine an
> optimal sequence.
>
> So that these things don't happen in a vacuum, I'll try to add some
> tests using a dependency tree from OpenSHORE
> (http://sf.net/projects/openshore), since that project appears to
> clearly suffer from the current algorithm.
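[Editor's note: the flaw Helge describes -- re-evaluating the same dependency information whenever the include graph has cycles or shared subtrees -- is conventionally fixed by tracking visited nodes so each file is expanded at most once. The following is a minimal sketch with hypothetical class and method names, not the actual CppTasks code.]

```java
import java.util.*;

public class IncludeWalker {
    // directIncludes: file -> headers it directly #includes (may contain cycles)
    private final Map<String, List<String>> directIncludes;

    public IncludeWalker(Map<String, List<String>> directIncludes) {
        this.directIncludes = directIncludes;
    }

    /**
     * All headers transitively reachable from 'file'. The 'seen' set
     * guarantees each node is expanded at most once, so cycles
     * (e.g. a.h <-> b.h) terminate and shared subtrees are not re-walked.
     */
    public Set<String> transitiveIncludes(String file) {
        Set<String> result = new LinkedHashSet<>();
        Set<String> seen = new HashSet<>(List.of(file));
        Deque<String> stack = new ArrayDeque<>(List.of(file));
        while (!stack.isEmpty()) {
            String current = stack.pop();
            for (String inc : directIncludes.getOrDefault(current, List.of())) {
                if (seen.add(inc)) {   // false on repeats, so cycles terminate
                    result.add(inc);
                    stack.push(inc);
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, List<String>> g = new HashMap<>();
        g.put("a.cpp", List.of("a.h", "b.h"));
        g.put("a.h", List.of("b.h"));
        g.put("b.h", List.of("a.h")); // cycle: a.h <-> b.h
        IncludeWalker w = new IncludeWalker(g);
        System.out.println(w.transitiveIncludes("a.cpp")); // prints [a.h, b.h]
    }
}
```

Without the 'seen' check, the same walk would recurse forever on the a.h/b.h cycle, and would re-expand b.h once per path reaching it.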
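[Editor's note: the ordering Curt describes -- foo.c first, then direct includers of foo.h, then includers-of-includers -- amounts to a breadth-first search over the reverse include graph, so files closer to the modified header compile earlier. A rough sketch of that second step, assuming a hypothetical 'includedBy' map rather than anything in cpptasks:]

```java
import java.util.*;

public class CompileOrder {
    /**
     * Order translation units by increasing include distance from a
     * modified header, via BFS over the reverse include graph
     * (includedBy: header -> files that directly include it).
     * Files most likely to fail come out first.
     */
    public static List<String> order(String modified,
                                     Map<String, List<String>> includedBy) {
        List<String> result = new ArrayList<>();
        Set<String> seen = new HashSet<>(List.of(modified));
        Queue<String> queue = new ArrayDeque<>(List.of(modified));
        while (!queue.isEmpty()) {
            String h = queue.remove();
            for (String f : includedBy.getOrDefault(h, List.of())) {
                if (seen.add(f)) {
                    if (f.endsWith(".c") || f.endsWith(".cpp")) {
                        result.add(f);   // a translation unit to recompile
                    } else {
                        queue.add(f);    // a header: keep expanding outward
                    }
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, List<String>> includedBy = new HashMap<>();
        includedBy.put("foo.h", List.of("foo.c", "bar.h"));
        includedBy.put("bar.h", List.of("baz.c"));
        // foo.c includes foo.h directly; baz.c only via bar.h
        System.out.println(order("foo.h", includedBy)); // prints [foo.c, baz.c]
    }
}
```

Because BFS visits each file at its minimum distance, a file that includes foo.h both directly and through another header is still compiled in the earliest (direct) tier, which matches the intent of failing fast.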