[Doxygen-users] Using doxygen with large projects
From: Ray F. C. <rf...@SL...> - 2001-12-17 20:56:17
Hi,

I am attempting to use doxygen to produce docs for a rather large C++ project: about 1000 packages containing some 20,000 individual {.cc,.hh} files in total. This is the bulk of the software used by the international high-energy physics experiment BaBar (www-public.slac.stanford.edu/babar).

However, attempts to run doxygen on our complete code base fail: on a Linux box, doxygen runs for four hours or so in the 'Reading input files' stage, reaches around 900 MB of memory use, then aborts. The Doxyfile has been configured for HTML output only; no HTML is produced in the output directory. I'm a novice at running doxygen, so perhaps someone has suggestions on how to handle large projects? Doxygen runs fine on small subsets of our code, so I think the problem is due to the sheer amount of source code when running on all of it together.

Another problem--assuming we can get doxygen to run on our code--is that all the output HTML files are placed in a single directory. Since each source file produces several .html files, the output from our 20,000 files will approach 50,000 files or more...and our AFS file system can only handle about 31,000 files per directory. It would be extremely nice if doxygen maintained the package structure of the input files in the output directory. Does anyone know how to do this, or a reasonable workaround?

Any suggestions for solving these problems would be appreciated. Thanks much.

--Ray Cowan
  BaBar Collaboration docmaster
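One possible direction, sketched here as a suggestion rather than a tested recipe: Doxygen's tag-file mechanism (the GENERATE_TAGFILE and TAGFILES options in the configuration reference) allows each package to be documented in a separate doxygen run with its own output directory, which bounds both memory use and the per-directory file count. The package names below (PackageA, PackageB, PackageC) and the docs/ layout are hypothetical; whether this scales well to 1000 packages is untested.

```
# Hypothetical Doxyfile for one package ("PackageA"); run doxygen once per package.
INPUT            = PackageA
GENERATE_HTML    = YES
# Each package writes into its own output directory, mirroring the source tree
# and staying well under the AFS per-directory file limit:
HTML_OUTPUT      = docs/PackageA/html
# Emit a tag file describing this package's symbols:
GENERATE_TAGFILE = docs/PackageA/PackageA.tag
# Resolve references to other packages via their tag files
# (syntax is tagfile=relative-location-of-that-package's-HTML):
TAGFILES         = docs/PackageB/PackageB.tag=../../PackageB/html \
                   docs/PackageC/PackageC.tag=../../PackageC/html
```

A shell loop over the package list could drive the per-package runs; memory then scales with the largest single package rather than the whole code base, and the tag files keep cross-package links working.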