From: Jason M. <jas...@vp...> - 2010-10-29 14:52:59
Thanks Beracah, I've not got time right now to track this one down, but it is
good to know that somebody else has confirmed this as a bug.

Cheers,
Jason.

-----Original Message-----
From: Beracah Yankama [mailto:be...@MI...]
Sent: 29 October 2010 15:39
To: Jason Morgan
Cc: vis...@li...
Subject: Re: [Visualpython-users] Possible memory leak

Hah! Now I'm not the only one who noticed this! I was graphing a lot of data
too, with lots of curves (a graph cluster growing to thousands of nodes and
edges, which are dynamically positioned), but after about 200 edges had been
created and repositioned, an out-of-memory error was thrown. I had tracked it
down to VPython. I had written (in case it is helpful to you):

"""
I made a test script that only creates objects in the window, then deletes
them, over and over, and I found the long-term memory use/growth to be
linearly related to the .pos attribute. For instance, a curve with 20 points
leads to unreleased memory that grows 10x as fast as a curve with only 2
points. Similarly, if I assign the .pos attribute as a numpy ndarray
directly, the leak grows twice as fast as if I assign the individual xyz
values.
"""

B

On 10/29/2010 9:16 AM, Jason Morgan wrote:
> Hi,
>
> I am using Visual Python to graph a lot of data in real time (I tried
> matplotlib, but it's just too slow).
>
> Anyway, it works incredibly well (thanks for a great little library, by
> the way: it's not exactly publicised just how fast and easy to use it
> is!!).
>
> I have live data on 15 separate windows, all being updated 125 times per
> second!!
>
> However, there is a problem. After about ten minutes I get a MemoryError
> from numpy.
>
> The code works by using one curve per display entity. Each curve has 300
> x/y points. The y points in every graph are updated each frame; the x
> points are unchanged (0..299).
> The error comes from the code that shifts the data along as it adds new
> data. I've tried various methods, including manual iteration, list slices
> and numpy.roll, but all have the same memory effect. I first tried the old
> 3.x VPython and got the same results from Numeric under Python 2.5. I
> upgraded to Python 2.7 with the latest numpy and the latest VPython and
> got the same result.
>
> <code>
> for c in curves:
>     n = curves.index(c)
>     b = np.roll(c.y, -1)
>     b[-1] = new_value[n]
>     c.y = b
> </code>
>
> The MemoryError is always raised at one of the last three lines, or within
> numpy's libraries. I've tried c.get_y() and c.set_y() with the same
> results.
>
> I've checked all the other buffers in the rest of the code and none are
> getting larger, nor is c.y or the temporary variable b in the example
> above.
>
> My conclusion is that this can only be something to do with the internal
> workings of either numpy (and Numeric) or VPython.
>
> I tried the following in IDLE:
>
> <code>
> import numpy as np
> count = 0
> a = np.arange(10)
> while True:
>     count = count + 1
>     b = np.roll(a, -1)
>     b[-1] = count
>     a = b
> </code>
>
> And let this run for several minutes (42+ million cycles) without a crash.
>
> So, what am I doing wrong (or is there a memory bug in VPython)?
>
> Cheers,
> Jason.
>
> _______________________________________________
> Visualpython-users mailing list
> Vis...@li...
> https://lists.sourceforge.net/lists/listinfo/visualpython-users
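[Editor's sketch, not part of the thread.] Two details of the quoted update
loop can be demonstrated with plain numpy: np.roll always returns a freshly
allocated array (never a view), so the loop rebinds c.y to a brand-new
300-element array per curve per frame; and the shift itself can be done
inside the existing buffer instead. Given Beracah's observation that the
leak tracks assignments to the pos/y attributes, avoiding the per-frame
rebinding of c.y might sidestep the growth, though whether VPython still
copies on attribute access is exactly the open question. `curves_y` and
`new_value` below are hypothetical stand-ins for the c.y arrays and the
incoming samples.

```python
import numpy as np

# np.roll always allocates a new array, so Jason's loop creates (and
# discards) one new array per curve per frame.
a = np.arange(300)
b = np.roll(a, -1)
print(np.shares_memory(a, b))  # False: b is a fresh allocation

# In-place variant: shift within the existing buffer rather than
# rebinding c.y to a new array object each frame.
curves_y = [np.zeros(300) for _ in range(3)]  # stand-ins for c.y
new_value = [1.0, 2.0, 3.0]                   # stand-in samples

for n, y in enumerate(curves_y):  # enumerate avoids the O(n) curves.index(c)
    y[:-1] = y[1:]                # shift left in place
    y[-1] = new_value[n]          # append the new sample at the end

print(curves_y[0][-1])  # 1.0
```

Note that with the real library one would still have to get the update back
into the curve; whether `c.y[:-1] = c.y[1:]` mutates VPython's internal
buffer or a copy depends on VPython's attribute implementation, so treat the
in-place form as something to test, not a confirmed fix.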