From: Nelle V. <nel...@gm...> - 2014-06-04 16:20:11
> Our standard test has gotten out of control. The most serious problem is
> that running the full test suite now fails on a Linux VM with 4 GB -- it
> runs out of memory. Half-way through the set, it is already using more
> than 2 GB. That's ridiculous. Running nosetests separately on each test
> module keeps the max reported by top to 1.6 GB, and the max reported by
> report_memory to 0.5 GB; still quite a bit, but tolerable. (I don't know
> why there is this factor of 3 between top and report_memory.) This scheme
> of running the test modules one at a time also speeds the suite up by a
> factor of 2; I don't understand why.
>
> The script I used for the module-at-a-time test is attached. It is a
> modification of matplotlib.tests().
>
> Are there any nosetests experts out there with ideas about how to
> streamline the standard test routine?

This issue is probably worth mentioning on other mailing lists whose users
rely on nosetests heavily. I'm thinking of scikit-learn in particular,
which also uses nosetests extensively. The scipy-users list might be a good
place to exchange experience.

N

> Eric
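[The attached script is not reproduced in the archive. As a point of
reference for the other lists, a minimal sketch of the module-at-a-time
idea could look like the following. It assumes matplotlib's
default_test_modules list (the one matplotlib.test() iterates over) and
that the nosetests command is on the PATH; it is only an illustration of
the approach, not Eric's attached script.]

    # Sketch (not the attached script): run each matplotlib test module
    # in its own nosetests process, so memory is released when the
    # process exits instead of accumulating across the whole suite.
    import subprocess
    import sys

    import matplotlib


    def run_modules_separately(extra_args=()):
        """Run every default test module in a separate nosetests process."""
        failures = []
        for module in matplotlib.default_test_modules:
            cmd = ['nosetests', module] + list(extra_args)
            print('Running %s' % module)
            if subprocess.call(cmd) != 0:
                failures.append(module)
        return failures


    if __name__ == '__main__':
        failed = run_modules_separately(sys.argv[1:])
        if failed:
            print('Failing modules:\n  %s' % '\n  '.join(failed))
        sys.exit(len(failed) != 0)

[Because each module runs in a fresh process, whatever that module leaks
is reclaimed by the OS before the next one starts, which would explain why
the per-module runs stay within the memory Eric reports.]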