From: Fernando P. <Fer...@co...> - 2005-04-06 21:25:17
D Brown wrote:
> I'm writing batch plotting code in the "traditional" sloppy
> way: typing scripts at the command line in ipython until I get
> what I want, then stringing the lines together in a file. Then
> I put the whole thing in a loop and use run from ipython. Now
> I can batch process a bunch of .csv files into nice plots.
>
> I have two problems though:
>
> 1. Memory use in python increases by about 5-10 MB/sec during
> processing. I have pylab.ioff() in the loop and put a
> pylab.close('all') in the loop to try to close the figure and
> release memory. Now processing 24 files results in ~190 MB of
> memory use, and when I run again it keeps increasing. I'm
> basically drawing the fig with several subplots and labels and
> then using savefig() to save it to a .png file. Is there some
> way to release memory explicitly? I'm using WinXP and the
> TkAgg backend. I've tried things like gca() and clf() at the
> beginning of the script to try to reuse the canvas, but it's
> not clear if it helps. Sometimes if I wait long enough the
> memory use goes down, so I suspect it's not a memory leak but
> a garbage collection problem. Unfortunately the wait can be
> very long. General as well as specific tips are welcome.

You can try to do:

    import gc
    gc.collect()

This will force the garbage collector to kick in, which sometimes
may help.

Best,

f
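For context, here is a minimal sketch of why gc.collect() can help in a
case like this. Python's reference counting cannot free objects that
reference each other in a cycle; only the cyclic garbage collector can.
The Node class below is purely illustrative (it is not part of pylab),
standing in for figure/canvas objects that keep each other alive:

```python
import gc

class Node:
    """Toy object used to build a reference cycle -- a stand-in for
    figure/canvas objects that can keep each other alive."""
    def __init__(self):
        self.ref = None

def make_cycle():
    # a and b reference each other, so reference counting alone
    # never frees them once this function returns
    a, b = Node(), Node()
    a.ref, b.ref = b, a

gc.disable()              # suppress automatic collection for the demo
for _ in range(100):
    make_cycle()

freed = gc.collect()      # force a full collection pass, as suggested
gc.enable()
print("unreachable objects collected:", freed)
```

In the batch-plotting loop, the analogous move would be to call
pylab.close('all') and then gc.collect() at the end of each iteration,
so cyclic garbage from each figure is reclaimed before the next file
is processed rather than whenever the collector happens to run.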