From: D B. <db...@ya...> - 2005-04-06 19:20:58
|
I'm writing batch plotting code in the "traditional" sloppy way: typing scripts from the command line in ipython until I get what I want, then stringing the lines together in a file. Then I put the whole thing in a loop and use run from ipython. Now I can batch process a bunch of .csv files into nice plots.

I have two problems though:

1. Memory use in python increases by about 5-10 MB/sec during processing. I have pylab.ioff() in the loop and put a pylab.close('all') in the loop to try to close the figure and release memory, yet processing 24 files results in ~190 MB of memory use, and when I run again it keeps increasing. I'm basically drawing the figure with several subplots and labels and then using savefig() to save it to a .png file. Is there some way to release memory explicitly? I'm using WinXP and the TkAgg backend. I've tried things like gca() and clf() at the beginning of the script to try to reuse the canvas, but it's not clear if it helps. Sometimes if I wait long enough the memory use goes down, so I suspect it's not a memory leak but a garbage-collection problem. Unfortunately the wait can be very long. General as well as specific tips are welcome.

2. If I use show() or ion(), the new plot window pops up wherever it likes, usually on top of my other windows. Is there a way to control this better? This may be a FAQ but I didn't see it. In environments like IDL there is a "window" command for this. Curious what Matlab does and if there is a platform-independent pylab equivalent.

-- David
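For later readers, here is a minimal sketch of the batch job described above, written against the modern pyplot API (the file names, the CSV layout, and the `plot_one` helper are illustrative assumptions, not the poster's actual script). Selecting the non-interactive Agg backend sidesteps the GUI window entirely, which also avoids the window-placement issue for batch runs, and closing each figure explicitly lets its memory be reclaimed between files.

```python
# Hypothetical sketch of the batch loop: Agg backend, one figure per
# file, explicit close after saving so figure memory can be reclaimed.
import glob

import matplotlib
matplotlib.use("Agg")  # pick a non-GUI backend before pyplot is imported
import matplotlib.pyplot as plt
import numpy as np

def plot_one(csv_path):
    """Plot one two-column CSV file (assumed layout) and save it as a PNG."""
    data = np.loadtxt(csv_path, delimiter=",")
    fig, (ax1, ax2) = plt.subplots(2, 1)   # several subplots, as in the post
    ax1.plot(data[:, 0])
    ax2.plot(data[:, 1])
    ax1.set_title(csv_path)
    fig.savefig(csv_path.replace(".csv", ".png"))
    plt.close(fig)                         # release the figure explicitly

if __name__ == "__main__":
    for path in glob.glob("*.csv"):
        plot_one(path)
```

With no figure window ever created, the only per-file state is the figure object itself, and `plt.close(fig)` drops pyplot's reference to it.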
From: Fernando P. <Fer...@co...> - 2005-04-06 21:25:17
|
D Brown wrote:

> I'm writing batch plotting code in the "traditional" sloppy
> way by typing scripts from the command line in ipython
> until I get what I want and then stringing the lines
> together in a file. Then I put the whole thing in a loop
> and use run from ipython. Now I can batch process a bunch
> of .csv files into nice plots.
>
> I have two problems though:
>
> 1. Memory use in python increases by about 5-10 MB/sec
> during processing. I have pylab.ioff() in the loop and put
> a pylab.close('all') in the loop to try to close the figure
> and release memory. Now 24 files processed results in
> ~190 MB memory use. When I run again it keeps increasing.
> I'm basically drawing the figure with several subplots and
> labels and then using savefig() to save it to a .png file.
> Is there some way to release memory explicitly? I'm using
> WinXP and the TkAgg backend. I've tried things like gca() and
> clf() at the beginning of the script to try to reuse the
> canvas, but it's not clear if it helps. Sometimes if I wait
> long enough the memory use goes down, so I suspect
> it's not a memory leak but a garbage-collection problem.
> Unfortunately the wait can be very long. General as well
> as specific tips are welcome.

You can try to do:

    import gc
    gc.collect()

This will force the garbage collector to kick in, which sometimes may help.

Best,

f
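Fernando's suggestion can be dropped straight into the batch loop. A sketch (`process_all` and `make_plot` are hypothetical names standing in for the poster's loop and per-file plotting function); note that `gc.collect()` returns the number of unreachable objects it found, which gives a rough view of how much cyclic garbage each iteration leaves behind:

```python
# Force a full garbage collection after each file is processed.
import gc

def process_all(paths, make_plot):
    """Run make_plot on every path, collecting cyclic garbage each time."""
    counts = []
    for path in paths:
        make_plot(path)              # the per-file plotting work
        counts.append(gc.collect())  # full collection pass; returns a count
    return counts
```

If the counts stay large, cyclic garbage is being created faster than the automatic collector reclaims it, which would match the "memory eventually goes down if I wait" symptom.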
From: John H. <jdh...@ac...> - 2005-04-06 21:29:49
|
>>>>> "Fernando" == Fernando Perez <Fer...@co...> writes:

    Fernando> You can try to do:
    Fernando>
    Fernando> import gc
    Fernando> gc.collect()
    Fernando>
    Fernando> This will force the garbage collector to kick in, which
    Fernando> sometimes may help.

I'm not seeing any leak on linux with my standard memory leak test, e.g.

    unit/memleak_hawaii3.py

in CVS. Have you read over

    http://matplotlib.sourceforge.net/faq.html#LEAKS

Unfortunately, the standard idiom I use for testing memory usage is platform specific. A complete script which exposes the leak would help...

JDH
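John notes that his idiom for measuring memory usage is platform specific. One hedged sketch of a cross-platform-ish check (the `report_rss` helper is an illustration, not the idiom John uses): read resident set size from `/proc/self/statm` on Linux, fall back to the `resource` module on other Unixes, and give up elsewhere.

```python
# Report resident memory between loop iterations to watch for growth.
import os

def report_rss():
    """Return resident set size in kilobytes, or None if unavailable."""
    try:
        with open("/proc/self/statm") as f:          # Linux only
            pages = int(f.read().split()[1])         # second field: resident pages
        return pages * os.sysconf("SC_PAGESIZE") // 1024
    except (OSError, ValueError, AttributeError):
        pass
    try:
        import resource                              # Unix only
        # ru_maxrss is kilobytes on Linux, bytes on macOS
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    except ImportError:
        return None
```

Printing `report_rss()` once per file in the batch loop would show whether memory climbs steadily (a leak) or plateaus after collections (the garbage-collection lag the poster suspects).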
From: Fernando P. <Fer...@co...> - 2005-04-06 21:46:52
|
John Hunter wrote:

>>>>>> "Fernando" == Fernando Perez <Fer...@co...> writes:
>
> Fernando> You can try to do:
>
> Fernando> import gc
> Fernando> gc.collect()
>
> Fernando> This will force the garbage collector to kick in, which
> Fernando> sometimes may help.
>
> I'm not seeing any leak on linux with my standard memory leak test, e.g.
>
> unit/memleak_hawaii3.py
>
> in CVS.

Oh, I don't think it's necessarily a leak. Simply a ton of unreachable stuff lying around, which a call to gc.collect() may help with.

Best,

f
From: John H. <jdh...@ac...> - 2005-04-06 23:03:46
|
>>>>> "Fernando" == Fernando Perez <Fer...@co...> writes:

    Fernando> Oh, I don't think it's necessarily a leak. Simply a ton
    Fernando> of unreachable stuff lying around, which a call to
    Fernando> gc.collect() may help with.

I hope you're right -- matplotlib uses a lot of cyclic references. But if he's using the pylab interface to manage the figure windows, the destroy event of the window manager already triggers a call to gc.collect in the _pylab_helpers module.

JDH
From: Fernando P. <Fer...@co...> - 2005-04-06 23:48:41
|
John Hunter wrote:

> Fernando> Oh, I don't think it's necessarily a leak. Simply a ton
> Fernando> of unreachable stuff lying around, which a call to
> Fernando> gc.collect() may help with.
>
> I hope you're right -- matplotlib uses a lot of cyclic references.
> But if he's using the pylab interface to manage the figure windows,
> the destroy event of the window manager already triggers a call to
> gc.collect in the _pylab_helpers module.

Never mind then. If gc.collect is already called reasonably often, an extra manual call will do zilch. Sorry.

Best,

f