From: Malcolm R. <mal...@cs...> - 2001-01-12 04:05:25
On Thu, Jan 11, 2001 at 09:50:36AM +0100, Arno Hollosi wrote:

> > Just for kicks, I added a "Site Map" function to the Wiki, which takes a
> > given starting page and does a breadth-first search of the entire Wiki
> > using that page as the root. The result is remarkably readable (for my
> > Wiki, at least). Have a look at:
> >
> > http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php?map=NomicWiki
>
> Looks nice :o) Post the code please.

The file "sitemap.php" is attached. It goes in the lib directory, with an
appropriate change to index.php to make it accessible. I haven't done a lot
of PHP programming yet, so if my style is not too good, please tell me how
I could make it better.

> > 1) I want to render it more like a standard Wiki page. In particular, I
> >    want to make all the links active. Is there an appropriate function
> >    to output the HTML code for a link?
>
> In stdlib.php there are two functions called LinkExistingWikiWord() and
> LinkUnknownWikiWord(). I guess this is what you're looking for.
> Also, you can check the existence of a page with IsWikiPage().

This looks like exactly what I want.

> The problem with sitemaps like these is that they are quite expensive to
> compute. If your wiki is small and has few visitors it doesn't matter
> much. But my public wiki at senseis.xmp.net gets mirrored via wget at
> least twice a week. And those !&@(? mirror every link they get, even the
> edit page and similar stuff. They would also mirror the sitemap for
> every page.

I'm not clear on this. Are you assuming that every page would have a
site-map link with itself at the root, or would this still be a problem if
every page had an identical site-map link with the front page at the root?
I never seriously intended the sitemap function to go on every page. I just
made it flexible so that you could put the root wherever you wanted it.

> > 2) I am using ExtractWikiPageLinks to get a list of outgoing links from
> >    a page, but this also returns things I don't want: outside links,
> >    disabled links, and text of the form '[[...]'. What is the best way
> >    to filter these out?
>
> I think there's a bug in 1.1.9 which causes this behaviour. Try the
> latest nightly build instead. Or replace it with the following function.

Ta.

Malcolm

-- 
Malcolm Ryan - mal...@cs... - http://www.cse.unsw.edu.au/~malcolmr/
AI Dept, CSE, UNSW, Australia, Phone: +61 2 9385-3988 Fax: +61 2 9385-1814
   "Just as we are. At a total loss.
    Jesus is for losers. Broken at the foot of the cross."
                                      - Steve Taylor
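
The sitemap.php attachment itself is not reproduced above, but a
breadth-first traversal along the lines Malcolm describes might look
roughly like the sketch below. This is a sketch only: FetchPageLinks()
and RenderLink() are placeholder names, not PhpWiki API or the attached
code.

    <?php
    // Breadth-first walk of the wiki, starting from $root.
    // FetchPageLinks() and RenderLink() are hypothetical helpers.
    function GenerateSiteMap($root) {
        $queue   = array($root);        // pages still to visit
        $visited = array($root => 0);   // page name => depth from root

        while (count($queue) > 0) {
            $page  = array_shift($queue);
            $depth = $visited[$page];

            // Indent each entry by its distance from the root page.
            echo str_repeat("&nbsp;&nbsp;", $depth)
                 . RenderLink($page) . "<br>\n";

            foreach (FetchPageLinks($page) as $link) {
                if (!isset($visited[$link])) {   // enqueue unseen pages once
                    $visited[$link] = $depth + 1;
                    $queue[] = $link;
                }
            }
        }
    }
    ?>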
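Rendering each entry as a live link could then lean on the stdlib.php
functions Arno points to. The argument lists below are assumptions (they
vary between PhpWiki versions), so check stdlib.php in your tree before
copying this.

    <?php
    // Assumed signatures; verify against stdlib.php in your version.
    function RenderLink($pagename) {
        global $dbi;   // the wiki's page database handle
        if (IsWikiPage($dbi, $pagename))
            return LinkExistingWikiWord($pagename);   // normal page link
        else
            return LinkUnknownWikiWord($pagename);    // "create page" link
    }
    ?>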
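If upgrading past the 1.1.9 bug isn't an option, the filtering asked
about in (2) could also be done by hand. A rough sketch, assuming
ExtractWikiPageLinks() yields an array of link strings:

    <?php
    // Keep only plain wiki page links: drop outside URLs, disabled
    // links ("!WikiWord") and escaped bracket text ("[[...").
    function FilterWikiLinks($links) {
        $result = array();
        foreach ($links as $link) {
            if (preg_match('/^[a-z][a-z0-9+.-]*:\/\//i', $link))
                continue;               // outside link (http://, ftp://, ...)
            if (substr($link, 0, 1) == '!')
                continue;               // disabled link
            if (substr($link, 0, 2) == '[[')
                continue;               // escaped '[[...' text
            $result[] = $link;
        }
        return $result;
    }
    ?>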