From: Alex T. <fl...@ax...> - 2002-06-12 15:14:29
Hello,

I apologize if this is a FAQ -- I didn't find anything in the mailing list archives. Does anybody have any hints on how to control memory allocator/garbage collector behaviour? Consider the following bit of code:

    # md5 fodder to make it a sensible bit of code :))
    import md5

    m = md5.new()
    f = open('foo', 'rb')
    b = f.read(2048)
    while b != '':
        m.update(b)
        b = f.read(2048)
    print m.hexdigest()

The interesting part here is the 'b = f.read()' call in the loop. I tried different approaches, but this bit of code invariably tends to run out of memory fairly quickly on moderately sized files, say 130Mb. Increasing the block size doesn't help either. The same bit of code run under py2.2 works even on windoze, where memory is always an issue.

Any hints, ways to kickstart garbage collection cycles, anything -- highly appreciated.

Regards,
Alex
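[Editor's note: the post above asks how to kick-start garbage collection cycles around a chunked-read digest loop. Below is a minimal sketch of the same loop on modern Python, where the `md5` module has been replaced by `hashlib`; the explicit `gc.collect()` call is the standard-library way to force a collection cycle. This is an illustrative sketch, not the original poster's code -- the file name and helper function are made up for the example, and in a loop like this the chunks should be freed by reference counting without any explicit collection.]

```python
import gc
import hashlib
import os
import tempfile

def digest_file(path, blocksize=2048):
    """Compute the MD5 hex digest of a file, reading it in fixed-size chunks."""
    m = hashlib.md5()
    with open(path, 'rb') as f:
        while True:
            b = f.read(blocksize)
            if not b:          # empty bytes object signals end of file
                break
            m.update(b)
    gc.collect()               # force a full garbage collection cycle
    return m.hexdigest()

# Demo on a small temporary file (stand-in for the poster's 'foo').
fd, path = tempfile.mkstemp()
with os.fdopen(fd, 'wb') as f:
    f.write(b'x' * 10000)
print(digest_file(path))
os.remove(path)
```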