From: Ryan H. <rh...@sh...> - 2004-11-30 17:26:25
* Martin d'Anjou <Mar...@s2...> [2004-11-29 21:01]:
> Hi,
>
> I think I have found a memory leak. I have a loop that inserts hundreds of

It is not a memory leak. I thought the same thing, but found out that
SQLObject caches object creation. Hence, Python's memory footprint grows
with each insert.

In main.py:

    def _SO_finishCreate(self, id=None):
        <snip>
        ...
        cache.created(id, self.__class__, self)

then in cache.py:

    def created(self, id, obj):
        if self.doCache:
            self.cache[id] = obj
        else:
            self.expiredCache[id] = ref(obj)

You can flush the cache as follows:

    table._connection.cache.clear()

In your example code:

    def fill_table(size):
        for i in xrange(int(size)):
            p = Persons(name='Joe'+`i`)
            Persons._connection.cache.clear()

or you can delay the flush until all inserts are done:

    def fill_table(size):
        for i in xrange(int(size)):
            p = Persons(name='Joe'+`i`)
        Persons._connection.cache.clear()

On a related note, you will notice that in addition to caching the object,
SQLObject issues a SELECT immediately after the insert to fetch the values
it needs to build the instance in _init(). This degrades MySQL insert
performance. It would be nice to be able to toggle the caching as well as
the _init() fetch.

Ryan Harper
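
Clearing on every insert works but is a bit heavy-handed, and clearing only
after the whole loop keeps every object alive until the end. A middle ground
is to flush every N inserts. This is an untested sketch that uses only the
cache.clear() call shown above; the flush_every batch size is arbitrary:

    def fill_table(size, flush_every=1000):
        # flush_every is an arbitrary batch size; tune it to taste
        for i in xrange(int(size)):
            p = Persons(name='Joe'+`i`)
            if (i + 1) % flush_every == 0:
                Persons._connection.cache.clear()
        Persons._connection.cache.clear()   # flush whatever is left over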
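
If you want to confirm that it is the cache holding the objects rather than
a real leak, you can count live instances before and after clearing it. This
is just a sketch using the stdlib gc module; count_instances() is my own
helper, not part of SQLObject, and it assumes the unmodified fill_table()
from Martin's original mail (no cache.clear() calls inside it):

    import gc

    def count_instances(cls):
        # my helper, not SQLObject API: count reachable instances of cls
        return len([o for o in gc.get_objects() if isinstance(o, cls)])

    fill_table(1000)                        # original loop, no cache.clear()
    print count_instances(Persons)          # roughly 1000: held by the cache
    Persons._connection.cache.clear()
    gc.collect()
    print count_instances(Persons)          # should drop back toward zero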