dlc-users Mailing List for Dead Link Check (DLC)
Brought to you by: martial
From: <dlc...@li...> - 2001-12-03 13:01:53

Hi, I've never "make"d a Perl file before. I'm running ActivePerl at home (Win 98) and at work (Win 95), under DOS/Windows. I've downloaded the DLC Dead Link Checker from http://dlc.sourceforge.net/. Where and what do I do now?

PS: if replying, please do it on-list or at my home and work email - gor...@ya... (I'm practicing/testing it at home, but my main purpose is to use it at work.)

G.
NZ Pacific time.

=====
G.
New Zealand chat & discussion group; email new...@ya... for details.

From: <dlc...@li...> - 2001-05-15 23:04:02

I have already managed to find a solution to the problem. My options now look like this:

deadlinkcheck -Verb -indicator -timeCache 0 -Timeout 150 -noCache -detOutput -codeConversion -HTMLoutput urllist.txt >checker.htm

As for the Web sites that were a problem earlier, giving the URLs wouldn't have told you much. I checked them by clicking on the links in the resulting output, and the pages opened immediately. Anyway, the output I get with the changed timeout settings is much more reliable.

> Sir,
>
> Without an example of a troubling web site I cannot run a more
> precise debugging process. Still, I would recommend adding the
> "userRedirect" option to your command line, since it forces
> content checking of the web page (it seems to help some HTTP
> servers too).

From: <dlc...@li...> - 2001-05-14 18:57:05

----------
I am using dlc to check for bad links in a database of special-interest URLs. It seems designed for just this task. I run it with the following options:

deadlinkcheck -Verb -Timeout 100 -noCache -detOutput -codeConversion -HTMLoutput urllist.txt >checker.htm

I had hoped that raising the timeout value to 100 would keep me from getting too many "Entries with code 5xx". But as it goes through the list, it seems to assign this error to the majority of sites without even waiting the specified timeout interval. What is going on here? About 70% of my checks are returning this error. If I have such a high failure rate, the program won't be of much use to me.
----------

Sir,

Without an example of a troubling web site I cannot run a more precise debugging process. Still, I would recommend adding the "userRedirect" option to your command line, since it forces content checking of the web page (it seems to help some HTTP servers too).

--
Martial MICHEL
E-mail    : ma...@us...
Home page : http://www.loria.fr/~michel/
PBM       : http://pbm.sourceforge.net/
DLC       : http://dlc.sourceforge.net/

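The "userRedirect" tip in the reply above boils down to retrying with a full page fetch when a lightweight probe fails: some HTTP servers answer a bare HEAD request with a 5xx error but serve a normal GET just fine, so forcing content checking rescues those links. Below is a minimal sketch of that fallback logic, written in Python rather than DLC's Perl; the function name and the pluggable `fetch` callable are illustrative, not DLC's actual API.

```python
def check_url(url, fetch):
    """Return the final HTTP status code for url.

    fetch(url, method) -> int status code. First try a cheap HEAD
    probe; if the server answers with a 5xx, retry with a full GET
    (i.e. force content checking) before reporting a dead link.
    """
    status = fetch(url, "HEAD")
    if 500 <= status < 600:
        status = fetch(url, "GET")
    return status


# Stub standing in for the network: a server that rejects HEAD
# probes with 503 but serves GET normally, as some servers do.
def stub_fetch(url, method):
    return 503 if method == "HEAD" else 200


print(check_url("http://example.org/", stub_fetch))  # 200
```

With a server like the stub, a HEAD-only checker would file the link under "Entries with code 5xx" even though the page loads fine in a browser, which matches the symptom described in the original report.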
From: <dlc...@li...> - 2001-05-13 19:42:25

I am using dlc to check for bad links in a database of special-interest URLs. It seems designed for just this task. I run it with the following options:

deadlinkcheck -Verb -Timeout 100 -noCache -detOutput -codeConversion -HTMLoutput urllist.txt >checker.htm

I had hoped that raising the timeout value to 100 would keep me from getting too many "Entries with code 5xx". But as it goes through the list, it seems to assign this error to the majority of sites without even waiting the specified timeout interval. What is going on here? About 70% of my checks are returning this error. If I have such a high failure rate, the program won't be of much use to me.
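The behavior described above (5xx errors reported well before the timeout elapses) is consistent with how HTTP checking generally works: a 5xx code is an actual response from the server, which can arrive within milliseconds, while a timeout setting only bounds how long the checker waits when no response comes at all. Raising the timeout therefore cannot reduce 5xx entries. A small hypothetical classifier (not DLC's code) makes the distinction explicit:

```python
def classify(status=None, timed_out=False):
    """Map one link-check outcome to a report bucket.

    A timeout means the server never answered within the limit;
    a 5xx means it answered quickly, but with a server error, so
    the two failure modes are controlled by different things.
    """
    if timed_out:
        return "timeout"
    if status is None:
        return "no response"
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if 400 <= status < 500:
        return "client error (4xx)"
    return "server error (5xx)"


print(classify(503))             # server error (5xx)
print(classify(timed_out=True))  # timeout
```

In these terms, the 70% failure rate above is a pile of fast "server error (5xx)" outcomes, not slow "timeout" ones, which is why the checker never waits out the full interval before recording them.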