From: jlh <jl...@gm...> - 2008-05-17 22:55:20
Dirk Stöcker <valgrind <at> dstoecker.de> writes:

> Usually the constructors and destructors of the classes, and other
> allocating functions (and lots of others), also have an #ifdef DEBUG'ed
> printf() inside, and a little Perl script to test whether the
> constructions and destructions match is written very quickly.

I do that a lot: I have a debug() function that is defined as either
printf() or (void)0. But I don't use it for memory allocation logging.

> And I always have an #ifdef DEBUG_OLD, which contains all the old debug
> statements added for special debugging purposes (I never remove them,
> except when they conflict with new code). Turning this on usually makes
> the software 1-10 times slower, but sometimes it is very, very helpful
> (I once got 1GB of debug data in 10 minutes).
>
> I really cannot understand why nowadays everyone prefers debuggers and
> forgets the good old printf.

I don't see how filling the program with lots of printf()s and then
writing a Perl script that analyzes their output is different from using
a debugger. Doing that *is* using a debugger: the one you just wrote
yourself. And when good debuggers are already available, why spend time
reinventing the wheel?

In fact, mtrace works exactly like this: the program generates lots of
output, and mtrace is a Perl script that analyzes it and tells you at
which lines memory was allocated but never released.

And valgrind is a really great tool. It points its finger at a line of
code and says "this malloc()ed memory has not been free()d anywhere".
Getting that kind of comfort is not trivial with self-written printf()s
plus Perl.

jlh