Hi
I'm new to programming in Python, and I hope this is the right place to ask.
I've created a cellular automaton program in Python using NumPy arrays.
After each cycle/iteration, the memory used to examine and modify the
array according to the transition rules is never freed. I've tried
calling "del" on every variable possible, but that hasn't helped. I've
read through the forums for hints on what to do, but nothing has worked
so far. I even tried the "python memory verification" (beta) program,
which did point to numpy.dtype and numpy.ndarray as growing object
counts before the whole computer crashed. I can supply the code if
needed. I'm desperate because this is part of my thesis, and if I can't
get this fixed, I'll have to switch to another programming language.
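
A minimal diagnostic sketch one could run between iterations to see which object types are actually accumulating (this is illustrative, not code from my program, and collections.Counter needs Python 2.7+/3.x, newer than my installed version):

```python
import gc
from collections import Counter  # requires Python 2.7+ / 3.x

gc.collect()  # run a full collection first so only live objects remain
# Tally live objects by type name; whatever keeps growing between
# iterations is the likely source of the retained memory.
counts = Counter(type(o).__name__ for o in gc.get_objects())
for name, n in counts.most_common(5):
    print(name, n)
```

Comparing the top counts before and after one cycle should show whether numpy.ndarray objects (or something else) are piling up.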
Update: I posted this message on the comp.lang.python forum, and the
response was to check reference counts with sys.getrefcount(obj). After
doing this, I see that the iteration counters used to count occurrences
and the nested loop counters (ii & jj), as seen in the code example
below, are the culprits, with the worst ones at over 1M references:
for ii in xrange(0, 40):
    for jj in xrange(0, 20):
        try:
            nc = y[a+ii, b+jj]
        except IndexError:
            nc = 0
        if nc == "1" or nc == "5":
            news = news + 1
            if news == 100:
                break
        else:
            pass
        y[a+ii, b+jj] = 4
    else:
        pass
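
For reference, a minimal sketch of how sys.getrefcount behaves (written for a modern Python 3, not my 2.4 setup). Note that CPython caches small integers, so loop counters like ii and jj all share one interned int object whose refcount is process-wide and huge, which may explain the 1M+ counts without being a leak in itself:

```python
import sys

fresh = []                      # a brand-new object: referenced only by our
                                # name plus the temporary call argument
print(sys.getrefcount(fresh))   # typically 2

# Small integers are cached/interned by CPython, so every loop counter
# holding the value 1 shares the same object -- its refcount is enormous
# and is not, by itself, evidence of a memory leak.
print(sys.getrefcount(1))
```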
The version of Python I'm using is 2.4.3, and the version of NumPy is 0.9.8.
Thanks in advance,
Sonja