Hi,
when crawling a specific site with about 100 pages, the crawler stops on Debian, while it runs through completely on Windows. There is no multi-processing involved, no error in the log, and no error message. Could you recommend a way to figure out why it stops?
Thanks!
Lars
Hi Lars,
do you have some more information?
When does it stop? Does it stop at a regular point, or does it just die?
Does your script maybe hit the PHP time limit (or something like that)?
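Since the PHP time limit is one suspect, a quick way to rule out PHP's own runtime limits while debugging is to loosen the relevant php.ini directives (a sketch; the values and the log path are placeholders, adjust them to your setup):

```
; Debugging settings to rule out PHP runtime limits
max_execution_time = 0          ; no time limit (the CLI default is already 0)
memory_limit = 512M             ; raise if the crawler may run out of memory
log_errors = On                 ; write errors to a file even if nothing is printed
error_log = /tmp/php_errors.log ; placeholder path, pick one your user can write to
display_errors = On             ; also show errors on the console while debugging
```

If the crawler then finishes on Debian, one of these limits was killing it silently; narrow the values back down afterwards to find which one.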
The script has three limits: a time limit, a memory limit, and a URL limit.