From: Hank F. <st...@so...> - 2011-02-24 02:04:06

Hi!

I tried to use the online bug report link, but it didn't work.

I just installed the new version of LinkChecker, and the View Parent Source feature, which I consider of great importance, is grayed out and unusable.

Please let me know when this is fixed.

Thanks,

Hank Friedman
From: Bastian K. <ca...@us...> - 2011-02-08 20:26:44

Hello Dan,

this was a bug in LinkChecker and has been fixed in the git source repository. The fix will be included in the next release.

Regards,
Bastian

On Tuesday, February 01, 2011 06:07:31 pm Dan Kowal wrote:
> Hello Bastian,
>
> Another LinkChecker error that gets picked up after each run is when it
> runs across a link being referenced in a map element as follows:
>
> usemap="#map name"
>
> We use this across our website, but for a particular set of pages (see
> link below) LinkChecker identifies it as a 404 error:
>
> http://www.ngdc.noaa.gov/nndc/struts/results?nd=suppress&eq_0=8&t=101365&s=4&d=3&d=5&d=6
>
> It appears to work fine... do you know what's missing?
>
> thanks,
>
> Dan
From: Dan K. <Dan...@no...> - 2011-02-01 17:07:39

Hello Bastian,

Another LinkChecker error that gets picked up after each run is when it runs across a link being referenced in a map element as follows:

usemap="#map name"

We use this across our website, but for a particular set of pages (see link below) LinkChecker identifies it as a 404 error:

http://www.ngdc.noaa.gov/nndc/struts/results?nd=suppress&eq_0=8&t=101365&s=4&d=3&d=5&d=6

It appears to work fine... do you know what's missing?

Thanks,

Dan

--
Dan Kowal
IT Specialist (Data Management)
National Geophysical Data Center/NOAA
(303) 497-6118
Dan...@no...
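The fix Bastian mentions above could look roughly like the following sketch (an illustration built only on this thread's description, not LinkChecker's actual code): a `usemap="#name"` attribute is a reference to a `<map name="...">` element in the same document, so it should be resolved locally instead of being requested as a URL (which is what produced the 404s).

```python
# Sketch (assumption, not LinkChecker internals): resolve usemap="#name"
# references against <map name="..."> elements in the same page, instead of
# treating the fragment as an external link target.
from html.parser import HTMLParser

class UsemapChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.map_names = set()   # names declared by <map name="...">
        self.usemap_refs = []    # names referenced via usemap="#..."

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "map" and "name" in attrs:
            self.map_names.add(attrs["name"])
        if "usemap" in attrs:
            # usemap values look like "#map name"; strip the leading '#'
            self.usemap_refs.append(attrs["usemap"].lstrip("#"))

def unresolved_usemaps(html):
    """Return usemap references with no matching <map> in the same page."""
    checker = UsemapChecker()
    checker.feed(html)
    return [ref for ref in checker.usemap_refs if ref not in checker.map_names]
```

Note that the name in this thread ("map name") contains a space, which is legal for `<map name>` in HTML 4.01, so a checker cannot simply treat the value as a URL fragment.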
From: Bastian K. <ca...@us...> - 2011-01-29 11:33:19

On Friday, January 28, 2011 09:41:27 pm Dan Kowal wrote:
> In the error report, LinkChecker will report problem urls like:
> data/WMM2010/WMM2010_Report.pdf
> icons/WMM2010/WMM2010_F_MERC.png
>
> Does the problem have to do with the use of relative urls in the web
> page? Are there any work-arounds?

Remove the trailing space in the URLs. Instead of

<a href="data/xyz.pdf ">

write

<a href="data/xyz.pdf">

Regards,
Bastian
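Finding every padded URL by hand is tedious on a large site. A small standard-library sketch (hypothetical helper names, assuming the cause Bastian identifies) can flag `href` and `src` attributes that carry stray leading or trailing whitespace:

```python
# Sketch (assumption): scan a page for href/src attribute values with stray
# leading or trailing whitespace, which turn into broken relative URLs such
# as "data/WMM2010/WMM2010_Report.pdf " when a checker resolves them.
from html.parser import HTMLParser

class WhitespaceHrefFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspects = []   # (tag, raw attribute value) pairs

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value is not None and value != value.strip():
                self.suspects.append((tag, value))

def find_padded_urls(html):
    """Return (tag, value) for link attributes padded with whitespace."""
    finder = WhitespaceHrefFinder()
    finder.feed(html)
    return finder.suspects
```

Running this over the offending pages would list exactly the anchors that need the trailing space removed.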
From: Dan K. <Dan...@no...> - 2011-01-28 20:41:34
|
Hello Bastian, I'm trying to track something down with false positives I'm getting from Linkchecker 5.3. Here's a page that Linkchecker is checking: http://www.ngdc.noaa.gov/geomag/WMM/image.shtml The images you see here and the underlying links to data -- primarily pdf files -- are reporting as 404s. The data files reside here: http://wwwzenith.ngdc.noaa.gov/geomag/WMM/data/ In the error report, Linkchecker will report problem urls like: data/WMM2010/WMM2010_Report.pdf icons/WMM2010/WMM2010_F_MERC.png Does the problem have to do with the use of relative urls in the web page? Are there any work-arounds? thanks, Dan -- Dan Kowal IT Specialist (Data Management) National Geophysical Data Center/NOAA (303) 497-6118 Dan...@no... |
From: Bastian K. <ca...@us...> - 2011-01-09 16:54:50

Hi Michael,

On Sunday 09 January 2011 06:42:26 Michael Lueck wrote:
> First off, what is with the Ubuntu version numbering of said package. To me
> it appears the Ubuntu package lags one major version number behind the
> builds from the official project site.

Overview for Ubuntu versions of LinkChecker:

http://packages.ubuntu.com/search?keywords=linkchecker

It lags behind official releases, but not by much. Usually the latest version is packaged within 2-4 weeks, which is quite good.

> external sites. Soon I saw it jump to the first external site URL in one
> of the links and start checking that site for dead links. I am not really
> interested in checking external sites for dead links... rather that this
> one site actually complete without crashing the tool!
>
> Suggestions please?

- Try upgrading to the latest version (if you're still using 5.1).
- Try the --ignore-url option.
- Send me the URL you are checking, or at least the command-line options and config files you are using. You can send it directly to me (ca...@us...) if you don't want it to be public.

Regards,
Bastian
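The effect of the `--ignore-url` suggestion above can be sketched as a simple filter (an illustrative assumption about the logic, not LinkChecker's implementation): skip any candidate URL that matches an ignore pattern or lives on a different host than the start URL, so the crawl stays on the requested domain.

```python
# Sketch (assumption, not LinkChecker internals): decide whether a crawler
# should recurse into a URL, mirroring what --ignore-url achieves with
# explicit regular expressions plus a same-host restriction.
import re
from urllib.parse import urlparse

def should_check(url, start_url, ignore_patterns=()):
    """Return True if `url` should be checked when crawling from `start_url`."""
    # Skip anything matching a user-supplied ignore pattern.
    if any(re.search(pattern, url) for pattern in ignore_patterns):
        return False
    # Stay on the starting host; external links are not followed.
    return urlparse(url).netloc == urlparse(start_url).netloc
```

With a rule like this, the external-site URLs Michael saw the GUI jump into would be filtered out before they ever entered the crawl queue.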
From: Michael L. <ml...@lu...> - 2011-01-09 12:55:25
|
Greetings, I am cross posting between this list and the UbuntuForum site seeking an answer: "How to keep linkchecker-gui focused on the domain requested?" http://ubuntuforums.org/showthread.php?p=10334621 Greetings, I have a couple of questions about linkchecker-gui. First off, what is with the Ubuntu version numbering of said package. To me it appears the Ubuntu package lags one major version number behind the builds from the official project site. Lucid has 5.1, and according to this list: http://sourceforge.net/projects/linkchecker/files/ that seems to match up with the official 6.1 version, perhaps? <><><><> Anyway, I am trying to run this against a rather large web site that has countless pages linking to external sites. It ran for like 16 or so hours against said site, then crashed. I tried running it against a much smaller site, with a few links to external sites. Soon I saw it jump to the first external site URL in one of the links and start checking that site for dead links. I am not really interested in checking external sites for dead links... rather that this one site actually complete without crashing the tool! Suggestions please? TIA! -- Michael Lueck Lueck Data Systems http://www.lueckdatasystems.com/ |
From: Stefan K. <ste...@gm...> - 2010-10-15 12:43:44

Hello,

I would love to get a report that lists all META tags per page in a site. Looking at the FAQ, it sounds like this could be done with a custom class. Unfortunately, I am a Python newbie.

How hard would it be to do? Any pointers?

Regards,
Stefan
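Plugging into LinkChecker's custom-class mechanism is beyond a short reply, but the core of what Stefan asks for -- collecting the META tags of a page -- takes only a few lines of standard-library Python (a starting-point sketch with hypothetical names, not the FAQ's custom-output class):

```python
# Sketch (assumption): gather all <meta> tags from a page's HTML, as a
# starting point for a per-page META report.
from html.parser import HTMLParser

class MetaCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.metas = []   # one dict of attributes per <meta> tag

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            self.metas.append(dict(attrs))

def collect_meta_tags(html):
    """Return a list of attribute dicts, one per <meta> tag in `html`."""
    collector = MetaCollector()
    collector.feed(html)
    return collector.metas
```

Fetching each page's HTML (for example with `urllib.request`) and feeding it through `collect_meta_tags` would produce the per-page listing; the harder part is wiring that into LinkChecker's own crawl and output.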
From: Davide M. <dm...@op...> - 2010-09-14 09:37:39

Hi to all,

I'm a newbie user of LinkChecker and I need some assistance. I have to test the reserved area of a PHP forum using the command-line options:

-u username
-p password

I'm sure that the credentials I'm using are correct, but the site redirects me to the login page every time.

Thanks in advance for your answers.

--
Davide Melfi
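One possible explanation (an assumption, not a confirmed diagnosis of Davide's site): `-u`/`-p` supply HTTP-level authentication, while many PHP forums instead use a cookie-based login form, which ignores those credentials and redirects to the login page. A quick way to tell the two apart is to look at the server's unauthenticated response; the classification logic can be sketched as:

```python
# Sketch (assumption): distinguish HTTP authentication (where -u/-p can help)
# from a cookie-based form login (where they cannot) by inspecting the
# server's response to an unauthenticated request.
def classify_auth(status, headers):
    """Classify a protected page from its status code and response headers."""
    if status == 401 and "WWW-Authenticate" in headers:
        return "http-auth"    # the server asks for HTTP credentials
    location = headers.get("Location", "")
    if status in (301, 302, 303, 307) and "login" in location.lower():
        return "form-login"   # the server redirects to a login form
    return "unknown"
```

If the reserved area turns out to be form-login, HTTP credentials alone will never get past the redirect, which would match the behavior Davide describes.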