Hi,
I've used SQLObject on Ubuntu with SQLite and found a strange behavior: it consumes machine memory where it shouldn't.
I wrote a script which creates 2 tables, mostly StringCol columns.
I then iterate over a list of names, inserting them into just one table by creating a new object from the class for my table. After inserting the values I even `del` the object, but the memory usage still increases.
<code snip>
while True:
    line = sys.stdin.readline()
    if not line:
        break
    tbl = Table2(col1="a",
                 col2="b",
                 col3="c",
                 ...
                 )
    del tbl
<code snip>
Am I getting something wrong? The data does get inserted:
sqlite> SELECT count(*) FROM table2;
117760
but after these records the machine bogs down swapping, due to ~350M memory usage on a 512M machine that is running other things too.
thanks for any help
kind regards
Michael lang
Logged In: YES
user_id=4799
Originator: NO
The rows you draw from the table are put into the cache, and are not removed by "del tbl". How many rows have you selected? The cache is culled after every 100 selected rows. You can change the culling frequency, or clear the cache yourself: Table2._connection.cache.clear(). Does that help?
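To illustrate the mechanism described above, here is a minimal, self-contained sketch (not SQLObject's actual code; `RowCache` and `Row` are hypothetical stand-ins) of why `del tbl` frees nothing: the per-connection cache keeps its own reference to every row object, so only clearing the cache makes the objects collectable.

```python
class RowCache:
    """Toy model of a per-connection row cache: it holds a strong
    reference to every row object, so `del` on a local name frees nothing."""
    def __init__(self):
        self._rows = {}

    def put(self, id_, row):
        self._rows[id_] = row

    def clear(self):
        # Drop all references so the rows can be garbage-collected.
        self._rows.clear()

    def __len__(self):
        return len(self._rows)


cache = RowCache()


class Row:
    def __init__(self, id_):
        self.id = id_
        cache.put(id_, self)  # the cache now also references this object


for i in range(1000):
    row = Row(i)
    del row  # the local name goes away; the object stays in the cache

assert len(cache) == 1000  # all rows are still alive
cache.clear()
assert len(cache) == 0     # now they are collectable
```

This is why clearing the cache periodically in the insert loop (or lowering the culling frequency) bounds the memory usage, while `del` alone does not.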
Logged In: YES
user_id=1216160
Originator: YES
Hi olec,
yes, that helps and saves my memory :) Maybe you could apply an automatic cleanup on object destruction, like the patch attached below.
thanks for your fast response.
kind regards
Michael Lang
--- /usr/lib/python2.4/site-packages/sqlobject/main.py 2004-10-20 22:28:18.000000000 +0200
+++ /tmp/main.py 2006-12-12 11:38:03.000000000 +0100
@@ -331,6 +331,10 @@
     # aren't using integer IDs
     _idType = int

+    def __del__(self):
+        # cleanup cache before wiping object
+        self._connection.cache.clear()
+
     def get(cls, id, connection=None, selectResults=None):
         assert id is not None, 'None is not a possible id for %s' % cls.__name
Logged In: YES
user_id=4799
Originator: NO
No, an SQLObject instance should not clear the entire cache - that is the application's task.