#109 Potential Memory Leak in WebCrawler

Milestone: 1.4.0 - bugs
Status: closed-fixed
Owner: Antoni Mylka
Labels: crawlers (23)
Priority: 5
Updated: 2009-10-15
Created: 2009-10-15
Creator: Anonymous
Private: No

There is a potential memory leak in org.semanticdesktop.aperture.crawler.web.WebCrawler.processQueue(): the created DataObject is not disposed in all cases.

The attached patch fixes the issue.
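
To make the leaked code path concrete, here is a minimal sketch of the general shape of the bug and the fix. It is not the actual processQueue() implementation or the attached patch: DataObject, Handler and maxSize below are stand-ins (only the dispose() call and the size-limit behaviour come from this report and the commit message in the discussion). The point is simply that any code path which stops processing a created DataObject must also dispose it.

    public class WebCrawlerLeakSketch {

        /** Stand-in for the Aperture DataObject; only dispose() comes from the report. */
        interface DataObject {
            long getSize();
            void dispose(); // releases underlying streams and other resources
        }

        /** Stand-in for the CrawlerHandler callback mentioned in the commit message. */
        interface Handler {
            void objectNew(DataObject object);
        }

        private final long maxSize = 1024 * 1024; // hypothetical size limit
        private Handler handler;

        void processQueueEntry(DataObject object) {
            if (object.getSize() > maxSize) {
                // As specified, oversized objects are not reported to the handler.
                // Before the fix this branch simply skipped the object, so its
                // resources were never released; disposing it here closes the leak.
                object.dispose();
                return;
            }
            // Normal path: the handler receives the object and becomes
            // responsible for releasing it when done.
            handler.objectNew(object);
        }
    }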

Discussion

  • Attachment: patch against SVN trunk which fixes the described issue
  • Antoni Mylka
    2009-10-15

    Committed in r2102. Thanks for noticing that. My commit message:

    "in WebCrawler - the data objects that exceeded the maximum size limit weren't reported to the crawlerhandler (as specified), but they weren't disposed either - this lead to a little resources leak. Detected by Christian Spurk, who also submitted a patch, which I hereby commit. :)"

    I'm closing this issue.

     
  • Antoni Mylka
    2009-10-15

    • assigned_to: nobody --> mylka
    • status: open --> closed-fixed