From: Malcolm R. <mal...@cs...> - 2001-01-11 01:22:10
I have recently started using PhpWiki to run a NomicWiki at
http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php. It is working
quite well.

Just for kicks, I added a "Site Map" function to the Wiki, which takes a
given starting page and does a breadth-first search of the entire Wiki
using that page as the root. The result is remarkably readable (for my
Wiki, at least). Have a look at:

http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php?map=NomicWiki

I wonder whether this might be a useful addition to the PhpWiki source.
Generating the site map is easy, but there are still a few problems:

1) I want to render it more like a standard Wiki page. In particular, I
   want to make all the links active. Is there an appropriate function
   to output the HTML code for a link?

2) I am using ExtractWikiPageLinks to get a list of outgoing links from
   a page, but this also returns things I don't want: outside links,
   disabled links, and text of the form '[[...]'. What is the best way
   to filter these out?

Malcolm

--
Malcolm Ryan - mal...@cs... - http://www.cse.unsw.edu.au/~malcolmr/
AI Dept, CSE, UNSW, Australia, Phone: +61 2 9385-3988 Fax: +61 2 9385-1814

"Just as we are. At a total loss. Jesus is for losers.
 Broken at the foot of the cross." - Steve Taylor
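[For readers who want the gist before the attachment appears later in the
thread: the traversal is a textbook breadth-first search over
ExtractWikiPageLinks. A minimal sketch, not the attached sitemap.php;
RetrievePage() and its return shape follow PhpWiki conventions but are
assumptions here.]

    // Minimal BFS sketch, not the attached sitemap.php. Assumes
    // ExtractWikiPageLinks($content) returns a map of outgoing wiki links
    // and RetrievePage($dbi, $name) returns a hash whose 'content' entry
    // is an array of lines; both are assumptions, adjust to your version.
    function ComputeSiteMap($dbi, $root) {
        $queue = array(array($root, 0));   // (pagename, depth) pairs
        $seen  = array($root => 1);
        $map   = array();
        while (count($queue) > 0) {
            list($name, $depth) = array_shift($queue);
            $map[] = array($name, $depth); // emit pages in BFS order
            $page = RetrievePage($dbi, $name);
            if (!is_array($page))
                continue;                  // page doesn't exist yet
            $links = ExtractWikiPageLinks($page['content']);
            reset($links);
            while (list($link, $dummy) = each($links)) {
                if (!isset($seen[$link])) {
                    $seen[$link] = 1;
                    $queue[] = array($link, $depth + 1);
                }
            }
        }
        return $map;
    }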
From: Arno H. <aho...@xm...> - 2001-01-11 08:50:21
Malcolm,

> Just for kicks, I added a "Site Map" function to the Wiki, which takes a
> given starting page and does a breadth-first search of the entire Wiki
> using that page as the root. The result is remarkably readable (for my
> Wiki, at least). Have a look at:
>
> http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php?map=NomicWiki

Looks nice :o) Post the code please.

> 1) I want to render it more like a standard Wiki page. In particular, I
>    want to make all the links active. Is there an appropriate function
>    to output the HTML code for a link?

In stdlib.php there are two functions called LinkExistingWikiWord() and
LinkUnknownWikiWord(). I guess this is what you're looking for.
Also, you can check the existence of a page with IsWikiPage().

The problem with sitemaps like these is that they are quite expensive to
compute. If your wiki is small and has few visitors it doesn't matter
much. But my public wiki at senseis.xmp.net gets mirrored via wget at
least twice a week. And those !&@(? mirror every link they get, even the
edit page and similar stuff. They also would mirror the sitemap for every
page. So basically each mirroring would be a small DoS attack on your
wiki. That's why Ward's c2.com wiki uses the referer header to get rid of
robots.

> 2) I am using ExtractWikiPageLinks to get a list of outgoing links from
>    a page, but this also returns things I don't want: outside links,
>    disabled links, and text of the form '[[...]'. What is the best way
>    to filter these out?

I think there's a bug in 1.1.9 which causes this behaviour. Try the
latest nightly build instead. Or replace it with the following function:

    function ExtractWikiPageLinks($content)
    {
        global $WikiNameRegexp;

        $wikilinks = array();
        $numlines = count($content);
        for ($l = 0; $l < $numlines; $l++) {
            // remove escaped '['
            $line = str_replace('[[', ' ', $content[$l]);

            // bracket links (only type wiki-* is of interest)
            $numBracketLinks = preg_match_all("/\[\s*([^\]|]+\|)?\s*(.+?)\s*\]/",
                                              $line, $brktlinks);
            for ($i = 0; $i < $numBracketLinks; $i++) {
                $link = ParseAndLink($brktlinks[0][$i]);
                if (preg_match("#^wiki#", $link['type']))
                    $wikilinks[$brktlinks[2][$i]] = 1;

                $brktlink = preg_quote($brktlinks[0][$i]);
                $line = preg_replace("|$brktlink|", '', $line);
            }

            // BumpyText old-style wiki links
            if (preg_match_all("/!?$WikiNameRegexp/", $line, $link)) {
                for ($i = 0; isset($link[0][$i]); $i++) {
                    if ($link[0][$i][0] <> '!')
                        $wikilinks[$link[0][$i]] = 1;
                }
            }
        }
        return $wikilinks;
    }

This should return only wiki internal links.

/Arno
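[A minimal usage sketch of the three stdlib.php functions Arno names, for
turning a sitemap entry into an active link. $dbi is the database handle
index.php sets up; the wrapper name is hypothetical.]

    // Hypothetical wrapper: render one page name as an active link
    // using the stdlib.php helpers mentioned above.
    function RenderSiteMapEntry($dbi, $pagename) {
        if (IsWikiPage($dbi, $pagename))
            return LinkExistingWikiWord($pagename);  // normal page link
        else
            return LinkUnknownWikiWord($pagename);   // "create page" link
    }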
From: Pablo R. R. <pr...@cl...> - 2001-01-11 09:09:48
Hi Arno,

I'm finishing the phpNuke & myPHPortal integration with phpWiki; you can
see it at http://proca.nexen.net in the main menu (Wiki test2).

Which code should I use for this: the latest release or the latest CVS?
I ask because I see that some changes have been made.

Pablo Roca (pr...@cl...)
La Coruna - Espana
myPHPortal Team
http://sourceforge.net/projects/myphportal
From: Pablo R. R. <pr...@cl...> - 2001-01-11 09:23:01
Can the phpWiki team make this change? Rename the tables to:

   wiki
   wikiarchive
   wikilinks
   wikihitcount
   wikiscore

Some developers use phpWiki together with other products, and the table
names can clash with ours.

I have already made these changes to my phpWiki, but I would like them
to become the standard.

Thanks.

Pablo Roca (pr...@cl...)
La Coruna - Espana
myPHPortal Team
http://sourceforge.net/projects/myphportal
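[For anyone making the same change by hand before it becomes standard, a
migration sketch. The old table names used here are assumptions - check
the schema file shipped with your version before running anything.]

    // Hypothetical MySQL migration sketch. The old names ("archive",
    // "hitcount") are assumptions; verify against your schema first.
    mysql_connect("localhost", "wikiuser", "password")
        or die("connect failed: " . mysql_error());
    mysql_select_db("wikidb");
    $renames = array("archive"  => "wikiarchive",
                     "hitcount" => "wikihitcount");
    while (list($old, $new) = each($renames)) {
        mysql_query("ALTER TABLE $old RENAME $new")
            or die("rename of $old failed: " . mysql_error());
    }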
From: Steve W. <sw...@wc...> - 2001-01-11 16:18:12
This is a good suggestion. We take similar care with function names and
file names, and table names should be no different.

~swain

On Thu, 11 Jan 2001, Pablo Roca Rozas wrote:

> Can the phpWiki team make this change? Rename the tables to:
>
>    wiki
>    wikiarchive
>    wikilinks
>    wikihitcount
>    wikiscore
>
> Some developers use phpWiki together with other products, and the table
> names can clash with ours.
>
> I have already made these changes to my phpWiki, but I would like them
> to become the standard.
>
> Thanks.
>
> Pablo Roca (pr...@cl...)
> La Coruna - Espana
> myPHPortal Team
> http://sourceforge.net/projects/myphportal

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain
From: Pablo R. R. <pr...@cl...> - 2001-01-11 16:34:51
Ok, thanks Steve.

Pablo Roca (pr...@cl...)
La Coruna - Espana
myPHPortal Team
http://sourceforge.net/projects/myphportal

> -----Original Message-----
> From: php...@li...
> [mailto:php...@li...] On behalf of Steve Wainstead
> Sent: Thursday, 11 January 2001 17:19
> To: php...@li...
> Subject: Re: [Phpwiki-talk] table names suggest
>
> This is a good suggestion. We take similar care with function names and
> file names, and table names should be no different.
>
> ~swain
From: Arno H. <aho...@xm...> - 2001-01-11 09:26:54
> Which code should I use for this: the latest release or the latest CVS?
> I ask because I see that some changes have been made.

I suggest you use the nightly tarball. We are very close to 1.2.0 and
1.1.9 had some serious bugs.

/Arno
From: Pablo R. R. <pr...@cl...> - 2001-01-11 09:40:14
Thanks Arno.

Is there an expected release date for 1.2.0?

Pablo Roca (pr...@cl...)
La Coruna - Espana
myPHPortal Team
http://sourceforge.net/projects/myphportal

> -----Original Message-----
> From: php...@li...
> [mailto:php...@li...] On behalf of Arno Hollosi
> Sent: Thursday, 11 January 2001 10:27
> To: php...@li...
> Subject: Re: [Phpwiki-talk] tarball or latest 1.1.9?
>
> I suggest you use the nightly tarball. We are very close to 1.2.0 and
> 1.1.9 had some serious bugs.
>
> /Arno
From: Pablo R. R. <pr...@cl...> - 2001-01-11 09:15:42
> The problem with sitemaps like these is that they are quite expensive
> to compute. If your wiki is small and has few visitors it doesn't
> matter much.

Yes, agreed. I also don't like a SiteMap because with a huge Wiki it
would be very time consuming.

Pablo Roca (pr...@cl...)
La Coruna - Espana
myPHPortal Team
http://sourceforge.net/projects/myphportal
From: Malcolm R. <mal...@cs...> - 2001-01-12 04:05:25
Attachments:
sitemap.php
On Thu, Jan 11, 2001 at 09:50:36AM +0100, Arno Hollosi wrote:

> > Just for kicks, I added a "Site Map" function to the Wiki, which takes a
> > given starting page and does a breadth-first search of the entire Wiki
> > using that page as the root. The result is remarkably readable (for my
> > Wiki, at least). Have a look at:
> >
> > http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php?map=NomicWiki
>
> Looks nice :o) Post the code please.

The file "sitemap.php" is attached. It goes in the lib directory, with an
appropriate change to index.php to make it accessible. I haven't done a
lot of PHP programming yet, so if my style is not too good, please tell
me how I could make it better.

> > 1) I want to render it more like a standard Wiki page. In particular, I
> >    want to make all the links active. Is there an appropriate function
> >    to output the HTML code for a link?
>
> In stdlib.php there are two functions called LinkExistingWikiWord() and
> LinkUnknownWikiWord(). I guess this is what you're looking for.
> Also, you can check the existence of a page with IsWikiPage().

This looks like exactly what I want.

> The problem with sitemaps like these is that they are quite expensive to
> compute. If your wiki is small and has few visitors it doesn't matter
> much. But my public wiki at senseis.xmp.net gets mirrored via wget at
> least twice a week. And those !&@(? mirror every link they get, even the
> edit page and similar stuff. They also would mirror the sitemap for
> every page.

I'm not clear on this. Are you assuming that every page has a site-map
link with itself at the root, or would this still be a problem if every
page had an identical site-map link with the front page at the root? I
never seriously intended the sitemap function to go on every page. I just
made it flexible so that you could put the root wherever you wanted it.

> > 2) I am using ExtractWikiPageLinks to get a list of outgoing links from
> >    a page, but this also returns things I don't want: outside links,
> >    disabled links, and text of the form '[[...]'. What is the best way
> >    to filter these out?
>
> I think there's a bug in 1.1.9 which causes this behaviour. Try the
> latest nightly build instead. Or replace it with the following function

Ta.

Malcolm

--
Malcolm Ryan - mal...@cs... - http://www.cse.unsw.edu.au/~malcolmr/
AI Dept, CSE, UNSW, Australia, Phone: +61 2 9385-3988 Fax: +61 2 9385-1814

"Just as we are. At a total loss. Jesus is for losers.
 Broken at the foot of the cross." - Steve Taylor
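[The "appropriate change to index.php" could look something like the
following. This is hypothetical: PhpWiki 1.x dispatches on
register_globals variables, so a ?map=PageName request shows up as $map,
but the exact dispatch chain varies by version.]

    // Hypothetical dispatch hook for index.php: route ?map=PageName
    // requests to the attached sitemap.php. Relies on register_globals,
    // which was the default in 2001-era PHP.
    if (isset($map)) {
        include "lib/sitemap.php";   // renders the map rooted at $map
        exit;
    }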
From: Steve W. <sw...@wc...> - 2001-01-11 15:46:34
On Thu, 11 Jan 2001, Malcolm Ryan wrote:

> I have recently started using PhpWiki to run a NomicWiki at
> http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php. It is
> working quite well.
>
> Just for kicks, I added a "Site Map" function to the Wiki, which takes a
> given starting page and does a breadth-first search of the entire Wiki
> using that page as the root. The result is remarkably readable (for my
> Wiki, at least). Have a look at:
>
> http://ccls1.cse.unsw.edu.au/~malcolmr/nomicwiki/index.php?map=NomicWiki

This is really cool. I would like to include it in PhpWiki!

> Generating the site map is easy, but there are still a few problems:
>
> 1) I want to render it more like a standard Wiki page. In particular, I
>    want to make all the links active. Is there an appropriate function
>    to output the HTML code for a link?

There are functions in stdlib.php for linking words:

    LinkExistingWikiWord($wikiword)
    LinkUnknownWikiWord($wikiword)
    LinkURL($url, $linktext)
    LinkImage($url, $alt)

Will these do?

> 2) I am using ExtractWikiPageLinks to get a list of outgoing links from
>    a page, but this also returns things I don't want: outside links,
>    disabled links, and text of the form '[[...]'. What is the best way
>    to filter these out?

Hmm. It would be bad to make you write filtering code, because that
creates a dependency... when something about the linking scheme changes,
you (or someone) would have to change the site map code as well. But at
the moment I don't see an answer other than refactoring the code somehow.

Perhaps, in C style, we could introduce flags to ExtractWikiPageLinks to
return all links, only external, only BumpyText, only brackets, etc. The
function is only called once in any given DB library, so the change
wouldn't be unmanageable.

~swain

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain
From: Thomas K. <ka...@ph...> - 2001-01-11 16:32:20
Hi,

> Perhaps, in C style, we could introduce flags to ExtractWikiPageLinks to
> return all links, only external, only BumpyText, only brackets, etc. The
> function is only called once in any given DB library, so the change
> wouldn't be unmanageable.

You could add an optional argument to the ExtractWikiPageLinks function
with a default value that reproduces the current behaviour. Then there is
no need to change the other libraries.

Thomas

p.s.: I made some extensions to phpwiki and am trying to do some more.
Things I am very interested in are:

- InterWiki: I implemented a NameSpace::WikiTag scheme to do this.
- How can a single wiki site be multilingual? There should be support for
  switching the language of the wiki as well as for multilingual
  wikiObjects.
- An object-oriented wiki approach, to support more structuring.

I'm thinking about these ideas on www.openobject.de , unfortunately in
German.
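[A sketch of what that optional argument could look like. The parameter
name, its values, and the filtering shown are hypothetical, and the
BumpyText pass is omitted for brevity.]

    // Hypothetical sketch of the optional-argument idea: the default
    // value reproduces the current behaviour (wiki links only), so the
    // existing calls in the DB libraries keep working unchanged.
    // BumpyText handling omitted for brevity.
    function ExtractWikiPageLinks($content, $wanted = 'wiki') {
        $links = array();
        $numlines = count($content);
        for ($l = 0; $l < $numlines; $l++) {
            if (!preg_match_all("/\[\s*([^\]|]+\|)?\s*(.+?)\s*\]/",
                                $content[$l], $m))
                continue;
            for ($i = 0; isset($m[0][$i]); $i++) {
                $link = ParseAndLink($m[0][$i]);   // classifies the link
                if ($wanted == 'all'
                    || preg_match("#^$wanted#", $link['type']))
                    $links[$m[2][$i]] = 1;
            }
        }
        return $links;   // old callers still get only wiki links
    }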
From: Malcolm R. <mal...@cs...> - 2001-01-12 04:05:19
On Thu, Jan 11, 2001 at 09:50:36AM +0100, Arno Hollosi wrote:

> > 1) I want to render it more like a standard Wiki page. In particular, I
> >    want to make all the links active. Is there an appropriate function
> >    to output the HTML code for a link?
>
> In stdlib.php there are two functions called LinkExistingWikiWord() and
> LinkUnknownWikiWord(). I guess this is what you're looking for.
> Also, you can check the existence of a page with IsWikiPage().

Okay, I have updated the sitemap code to use these functions, so now all
the entries are links. The fix to ExtractWikiPageLinks has also made
things neater. The only problem that apparently remains is the use of the
[1], [2], [3] style references, which appear as "1", "2", "3" in the
sitemap.

Malcolm

--
Malcolm Ryan - mal...@cs... - http://www.cse.unsw.edu.au/~malcolmr/
AI Dept, CSE, UNSW, Australia, Phone: +61 2 9385-3988 Fax: +61 2 9385-1814

"Just as we are. At a total loss. Jesus is for losers.
 Broken at the foot of the cross." - Steve Taylor
From: Arno H. <aho...@xm...> - 2001-01-15 12:31:13
Malcolm,

> Okay, I have updated the sitemap code to use these functions, so now
> all the entries are links. The fix to ExtractWikiPageLinks has also
> made things neater. The only problem that apparently remains is the use
> of the [1], [2], [3] style references, which appear as "1", "2", "3" in
> the sitemap.

Thanks for spotting the problem. I've corrected ExtractWikiPageLinks.
Insert the following before the final "else":

    } elseif (preg_match("#^\d+$#", $URL)) {
        $link['type'] = "reference-$linktype";
        $link['link'] = $URL;

/Arno
From: Arno H. <aho...@xm...> - 2001-01-15 12:36:43
> Thanks for spotting the problem. I've corrected ExtractWikiPageLinks.
> Insert the following before the final "else":
>
>     } elseif (preg_match("#^\d+$#", $URL)) {
>         $link['type'] = "reference-$linktype";
>         $link['link'] = $URL;

Actually, I've corrected the function ParseAndLink(), which is called by
ExtractWikiPageLinks. Insert the elseif there.

/Arno
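[A tiny self-contained check of what the new elseif branch matches. The
helper function is hypothetical, purely to illustrate the regex.]

    // Hypothetical helper: classify a link target the way the patched
    // ParseAndLink() branch does -- purely numeric targets are footnote
    // references; everything else falls through to the existing handling.
    function IsFootnoteReference($URL) {
        return preg_match("#^\d+$#", $URL) ? true : false;
    }

    print IsFootnoteReference("42")        ? "reference\n" : "other\n"; // reference
    print IsFootnoteReference("FrontPage") ? "reference\n" : "other\n"; // other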