There is a potential memory leak in org.semanticdesktop.aperture.crawler.web.WebCrawler.processQueue(): the created DataObject is not disposed in all cases.
The attached patch fixes the issue.
Attached: a patch against SVN trunk that fixes the described issue.
Committed in r2102. Thanks for noticing that. My commit message:
"in WebCrawler - the data objects that exceeded the maximum size limit weren't reported to the crawlerhandler (as specified), but they weren't disposed either - this lead to a little resources leak. Detected by Christian Spurk, who also submitted a patch, which I hereby commit. :)"
I'm closing this issue.