From: Jason M. <jas...@vp...> - 2010-10-29 13:35:30
Hi,

I am using Visual Python to graph a lot of data in real time (I tried matplotlib but it's just too slow). It works incredibly well - thanks for a great little library, by the way; it's not exactly publicised just how fast and easy to use it is!! I have live data on 15 separate windows, all being updated 125 times per second!!

However, there is a problem: after about ten minutes I get a MemoryError from numpy. The code uses one curve per display entity. Each curve has 300 x/y points. The y points in every graph are updated each frame; the x points are unchanged (0..299). The error comes from the line of code that shifts the data along as it adds new data. I've tried various methods, including manual iteration, list slices and numpy.roll, but all have the same memory effect. I first tried the old 3.x VPython and got the same results from Numeric under Python 2.5. I upgraded to Python 2.7 with the latest numpy and the latest VPython and got the same result.

<code>
for c in curves:
    n = curves.index(c)
    b = np.roll(c.y, -1)
    b[-1] = new_value[n]
    c.y = b
</code>

The MemoryError is always raised at one of the last three lines, or within numpy's libraries. I've tried c.get_y() and c.set_y() with the same results. I've checked all the other buffers in the rest of the code and none of them are growing, nor is c.y or the temporary variable b in the example above. My conclusion is that this can only be something to do with the internal workings of either numpy (and Numeric) or VPython.

I tried the following in IDLE:

<code>
import numpy as np

count = 0
a = np.arange(10)
while True:
    count = count + 1
    b = np.roll(a, -1)
    b[-1] = count
    a = b
</code>

and let it run for several minutes (42+ million cycles) without a crash.

So, what am I doing wrong (or is there a memory bug in VPython)?

Cheers,
Jason.
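[Editor's note: one way to sidestep per-frame temporaries entirely is to shift the buffer in place with numpy slice assignment instead of np.roll, so no new array is allocated on each update. This is a minimal sketch, not VPython API: `y` here is a plain numpy array standing in for one curve's y-values, and `push_sample` is a hypothetical helper name.]

<code>
import numpy as np

# Hypothetical stand-in for one curve's y-buffer: 300 samples, as in the post.
y = np.zeros(300)

def push_sample(buf, value):
    """Shift the buffer left by one slot in place and append the new sample.

    Unlike np.roll, this allocates no temporary array, so nothing new is
    created (or left for the garbage collector) on each frame.
    """
    buf[:-1] = buf[1:]   # shift existing samples one slot to the left
    buf[-1] = value      # overwrite the freed last slot
    return buf

# Feed in a few samples to exercise the shift.
for v in (1.0, 2.0, 3.0):
    push_sample(y, v)
</code>

Whether this helps depends on whether the curve object exposes its y-values as a mutable numpy array that can be updated in place; if it only accepts whole-array assignment, the per-frame allocation remains.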