There is a potential memory leak in org.semanticdesktop.aperture.crawler.web.WebCrawler.processQueue(): the created DataObject is not disposed in all cases.
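For illustration, here is a minimal sketch of the leak pattern and its fix. The types, names (DataObject stand-in, Handler, MAX_BYTE_SIZE) and the size check are assumptions made for this sketch, not the actual WebCrawler.processQueue() code:

    // Illustrative sketch only: simplified stand-ins for Aperture's DataObject
    // and crawler handler; the maxByteSize check is an assumption.
    public class DisposeOnSkipSketch {

        /** Hypothetical stand-in for org.semanticdesktop.aperture.accessor.DataObject. */
        interface DataObject {
            long getSize();
            void dispose(); // releases underlying resources (streams, models, ...)
        }

        /** Hypothetical stand-in for the crawler handler callback. */
        interface Handler {
            void objectNew(DataObject object);
        }

        private static final long MAX_BYTE_SIZE = 1024 * 1024;

        // Leaky variant: objects over the size limit are neither reported nor disposed.
        static void processLeaky(DataObject object, Handler handler) {
            if (object.getSize() <= MAX_BYTE_SIZE) {
                handler.objectNew(object);
            }
            // else: object silently dropped, so its resources are never released
        }

        // Fixed variant: the skipped object is disposed so its resources are released.
        static void processFixed(DataObject object, Handler handler) {
            if (object.getSize() <= MAX_BYTE_SIZE) {
                handler.objectNew(object);
            }
            else {
                object.dispose();
            }
        }
    }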
The attached patch fixes the issue.
Patch against SVN trunk that fixes the described issue.
Committed in r2102. Thanks for noticing that. My commit message:
"in WebCrawler - the data objects that exceeded the maximum size limit weren't reported to the crawlerhandler (as specified), but they weren't disposed either - this lead to a little resources leak. Detected by Christian Spurk, who also submitted a patch, which I hereby commit. :)"
I'm closing this issue.