"I increased the ulimit to 4096 and watched the number of open files used by the process while crawling. As long as the crawler is running, the number of open files keeps growing and becomes rather large. After it has fetched 3455 files, processed 3455 files, and parsed 3208 files, there is still no decrease in open files. The count is now 3375, and then it crashes again with Error (java.io.EOFException). Presuming the file handles are closed properly, isn't there a way to limit the number of open files?
I then increased the limit to 8192, but it keeps crashing at around 3375 open files with Error (java.io.EOFException)."
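Since the open-file count climbs steadily and never drops, the usual suspect is a stream that is opened per fetched document but only closed on the happy path. A minimal sketch of the fix, assuming a hypothetical per-file read step in the crawler (the helper name and paths are illustrative, not from the crawler's actual code): try-with-resources guarantees the descriptor is released even when parsing throws.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class HandleLeakSketch {
    // Hypothetical per-document read step. The try-with-resources block
    // closes the InputStream (and its file descriptor) when the block
    // exits, whether normally or via an exception such as EOFException.
    static long countBytes(Path p) throws IOException {
        try (InputStream in = Files.newInputStream(p)) {
            long n = 0;
            while (in.read() != -1) n++;
            return n;
        } // in.close() runs here in every case, releasing the descriptor
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("crawl", ".dat");
        Files.write(tmp, new byte[]{1, 2, 3});
        System.out.println(countBytes(tmp)); // prints 3
        Files.delete(tmp);
    }
}
```

If every open site in the crawler is wrapped this way and the count still climbs toward the ulimit, the leak is elsewhere (e.g. sockets or unclosed segment readers); on Linux, listing `/proc/<pid>/fd` shows which descriptors are actually held.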