| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2000 |     |     |     |     | 1   | 103 | 105 | 16  | 16  | 78  | 36  | 58  |
| 2001 | 100 | 155 | 84  | 33  | 22  | 77  | 36  | 37  | 183 | 74  | 235 | 165 |
| 2002 | 187 | 183 | 52  | 10  | 15  | 19  | 43  | 90  | 144 | 144 | 171 | 78  |
| 2003 | 113 | 99  | 80  | 44  | 35  | 32  | 34  | 34  | 30  | 57  | 97  | 139 |
| 2004 | 132 | 223 | 300 | 221 | 171 | 286 | 188 | 107 | 97  | 106 | 139 | 125 |
| 2005 | 200 | 116 | 68  | 158 | 70  | 80  | 55  | 52  | 92  | 141 | 86  | 41  |
| 2006 | 35  | 62  | 59  | 52  | 51  | 61  | 30  | 36  | 12  | 4   | 22  | 34  |
| 2007 | 49  | 19  | 37  | 16  | 9   | 38  | 17  | 31  | 16  | 34  | 4   | 8   |
| 2008 | 8   | 16  | 14  | 6   | 4   | 5   | 9   | 36  | 6   | 3   | 3   | 3   |
| 2009 | 14  | 2   | 7   | 16  | 2   | 10  | 1   | 10  | 11  | 4   | 2   |     |
| 2010 | 1   |     | 13  | 11  | 18  | 44  | 7   | 2   | 14  |     | 6   |     |
| 2011 | 2   | 6   | 3   | 2   |     |     |     |     |     |     |     |     |
| 2012 | 11  | 3   | 11  |     |     |     |     |     |     | 1   | 4   |     |
| 2013 |     |     |     | 3   |     |     |     |     |     |     |     |     |
| 2014 |     |     |     |     | 4   |     |     |     |     |     | 8   | 1   |
| 2015 | 3   | 2   |     | 3   | 1   |     | 1   |     |     |     |     | 2   |
| 2016 |     | 4   |     |     |     |     |     |     |     |     |     |     |
| 2017 |     |     |     |     |     |     | 3   |     |     |     |     |     |
| 2018 |     |     |     |     | 3   | 1   |     |     |     |     |     |     |
| 2020 |     |     |     |     | 3   |     | 5   |     |     |     |     |     |
| 2021 |     | 4   |     |     |     |     | 1   | 6   | 3   |     |     |     |
| 2022 | 11  | 2   |     |     |     |     |     |     |     |     |     |     |
| 2023 |     |     |     |     |     |     |     |     |     | 1   | 3   | 3   |
| 2024 | 7   | 2   | 1   |     |     |     |     |     |     |     |     |     |
| 2025 |     |     |     | 1   | 1   |     | 3   |     | 5   |     |     |     |
From: Steve W. <sw...@pa...> - 2000-11-28 17:41:50

---------- Forwarded message ----------
Date: Tue, 28 Nov 2000 17:17:00 +0000 (GMT)
From: Gary Benson <ga...@ee...>
To: sw...@pa...
Subject: PhpWiki

My compliments on a FinePieceOfWork!

From: Steve W. <sw...@pa...> - 2000-11-28 16:28:49

Me?!?! I think the bulk of the code is now written by you and the others,
Arno! I think I'm mostly the secretary these days! (Which I don't mind so
much either.) Because of you and all the other contributors, and all the
people that have downloaded and used PhpWiki, it's taken on a life of its
own. I feel most of the credit should go to all of you and the users,
honestly.

cheers
sw

On Tue, 28 Nov 2000, Arno Hollosi wrote:

> > PhpWiki will be mentioned in a PHP book.. a "For Dummies" book no less!
>
> Congratulations, Steve :o)
> Just goes to show that you have truly made an excellent application.
>
> Btw, my wiki project called Sensei's Library is online now.
> It has already gained substantial contributions. Some SL features not
> found in PhpWiki: pagetype, keywords, guided tours, diff archive (admin
> can revive older versions). http://senseis.xmp.net/
>
> /Arno

http://wcsb.org/~swain/ | "In a calendar year, America's entire
     * * * * * *        | recorded music industry has revenues
     * * * * * *        | roughly equal to one month's sales by
     * * * * * *        | IBM." --Philip Greenspun

From: Arno H. <aho...@in...> - 2000-11-28 16:23:27

> PhpWiki will be mentioned in a PHP book.. a "For Dummies" book no less!

Congratulations, Steve :o)
Just goes to show that you have truly made an excellent application.

Btw, my wiki project called Sensei's Library is online now. It has already
gained substantial contributions. Some SL features not found in PhpWiki:
pagetype, keywords, guided tours, diff archive (admin can revive older
versions). http://senseis.xmp.net/

/Arno

From: Steve W. <sw...@pa...> - 2000-11-28 15:35:58

PhpWiki will be mentioned in a PHP book.. a "For Dummies" book no less!

sw

http://wcsb.org/~swain/ | "In a calendar year, America's entire
     * * * * * *        | recorded music industry has revenues
     * * * * * *        | roughly equal to one month's sales by
     * * * * * *        | IBM." --Philip Greenspun

---------- Forwarded message ----------
Date: Mon, 27 Nov 2000 19:39:10 -0500
From: Ralph Roberts <ra...@ab...>
To: sw...@pa...
Subject: PHPWiki

Dear Steve:

I downloaded and installed your PHPWiki today on the Alexander Books
intranet. It works quite nicely--my congratulations on an elegant job of
programming. I intend to mention it favorably in the book I'm now writing,
PHP4 FOR DUMMIES.

Keep up the good work!

--Ralph

------------------------------
Ralph Roberts, CEO
Alexander Books / Creativity, Inc.
65 Macedonia Road
Alexander, NC 28701
1-800-472-0438 voice & fax, tollfree U.S. & Canada
1-828-255-8719 voice & fax, overseas
http://abooks.com

!!$5 BLOWOUT!! on STAN VEIT'S HISTORY OF THE PC ... click here>>
http://1-b.net/historypc.html

See Ralph's latest bestsellers at:
http://abooks.com/rebol
http://abooks.com/genealogy
http://autograph-book.com

And my company's online auctions:
http://booksave.com        Wholesale to the Public Book Auction
http://blue-gray.com       Civil War Auction
http://tennesseestars.com  All TENNESSEE THINGS Auction
http://bookauction.net     Buy Books Net Instead of Retail!
http://a-aa.com/gem        GEM & METALWORKERS AUCTION
http://a-aa.com/nc         DOWN HOME IN NORTH CAROLINA AUCTION
http://sigs.net/auction    Sigs.NET Autograph Auction

From: Steve W. <sw...@wc...> - 2000-11-27 14:40:35

Best what? ;-)

sw

On Sun, 26 Nov 2000, Stephan Marzi - MIG wrote:

> STEVE WAINSTEAD IS THE BEST
>
> Regards
> STEVE MARZI
> MIG CEO
> --
> Mit freundlichen Grüßen / Best Regards
>
> Stephan Marzi
> Marzi Internet Gruppe [Group]
> http://www.MarziInternetGroup.com/
> The MIG project alliance.
>
> _______________________________________________
> Phpwiki-talk mailing list
> Php...@li...
> http://lists.sourceforge.net/mailman/listinfo/phpwiki-talk

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

From: Stephan M. - M. <ma...@Ma...> - 2000-11-26 20:10:55

STEVE WAINSTEAD IS THE BEST

Regards
STEVE MARZI
MIG CEO

--
Mit freundlichen Grüßen / Best Regards

Stephan Marzi
Marzi Internet Gruppe [Group]
http://www.MarziInternetGroup.com/
The MIG project alliance.

From: Steve W. <sw...@wc...> - 2000-11-22 15:59:23

On Wed, 22 Nov 2000, Arno Hollosi wrote:

> p.s. did you see the search patch I added for mySQL?

I did! I haven't tested it out yet, though. I read through the code to see
how I was going to implement it on the other databases :-)

sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

From: Arno H. <aho...@in...> - 2000-11-22 00:05:06

> This is a bit off, but I have been thinking about the problem on
> Sourceforge with the query:
> I was reading the docs for Postgresql this weekend and it supports UNION,
> and INTERSECT.

Yes, union or subselects are among the things that mySQL is really
lacking. When it comes down to more sophisticated stuff, mySQL is
basically an SQL frontend to a filesystem.

> I think subselects are more efficient, if the database knows how to
> optimize for them.

I have two SQL books by SQL guru Joe Celko, and it seems he favours
subselects over joins as well. Anyway, right now there's no other way for
mySQL. But your thinking is not too far off.

/Arno

p.s. did you see the search patch I added for mySQL?

From: Steve W. <sw...@wc...> - 2000-11-21 23:07:02

Hi Arno,

This is a bit off, but I have been thinking about the problem on
Sourceforge with the query:

    select distinct hitcount.pagename, hitcount.hits
    from wikilinks, hitcount
    where (wikilinks.frompage=hitcount.pagename
           and wikilinks.topage='AddingPages')
       or (wikilinks.topage=hitcount.pagename
           and wikilinks.frompage='AddingPages')
    order by hitcount.hits desc, hitcount.pagename;

which returns that error about the size being too large. It's not a
problem, since you can set the variable in MySQL, but I've been trying to
think of a way around the Cartesian join, if one indeed occurs.

I was reading the docs for Postgresql this weekend and it supports UNION
and INTERSECT. I think UNION might have worked in this case, but
unfortunately MySQL doesn't support it. We could have done:

    select topage from wikilinks where frompage='AddingPages'
    union
    select frompage from wikilinks where topage='AddingPages'

to get all the page names that link to AddingPages. That could be an inner
select, and then we do:

    select pagename, hits
    from hitcount
    where pagename in (
        select topage from wikilinks where frompage='AddingPages'
        union
        select frompage from wikilinks where topage='AddingPages'
    )
    order by hits desc, pagename;

I think subselects are more efficient, if the database knows how to
optimize for them.

Just thinking out loud,
sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

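
Since the MySQL of that era supported neither UNION nor subselects, the usual workaround was to run the two halves separately and merge them in application code. Below is a minimal PHP sketch of that workaround, assuming the wikilinks/hitcount schema quoted above; the function name and the PHP 4-era mysql_* calls are illustrative, not PhpWiki's actual code.

```php
<?php
// Sketch only: emulate
//   SELECT ... WHERE pagename IN (<union of two link queries>)
// on a MySQL that supports neither UNION nor subselects.
function linked_page_hits($dbh, $page)
{
    $name  = mysql_escape_string($page);
    $pages = array();

    // Half 1: pages that $page links to.
    $res = mysql_query("SELECT topage FROM wikilinks WHERE frompage='$name'", $dbh);
    while ($row = mysql_fetch_row($res))
        $pages[$row[0]] = 1;            // hash keys deduplicate, like UNION

    // Half 2: pages that link to $page.
    $res = mysql_query("SELECT frompage FROM wikilinks WHERE topage='$name'", $dbh);
    while ($row = mysql_fetch_row($res))
        $pages[$row[0]] = 1;

    if (count($pages) == 0)
        return array();

    // Rank the combined set by hit count, as the subselect version would.
    $quoted = array();
    foreach (array_keys($pages) as $p)
        $quoted[] = "'" . mysql_escape_string($p) . "'";
    $res = mysql_query("SELECT pagename, hits FROM hitcount"
                       . " WHERE pagename IN (" . implode(',', $quoted) . ")"
                       . " ORDER BY hits DESC, pagename", $dbh);

    $ranked = array();
    while ($row = mysql_fetch_assoc($res))
        $ranked[] = $row;
    return $ranked;
}
?>
```

Three small queries replace the single self-join, which also sidesteps the oversized intermediate result the original query produced.
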
From: Steve W. <sw...@wc...> - 2000-11-17 02:47:10

On Mon, 6 Nov 2000, Arno Hollosi wrote:

> Hi all,
>
> a friend of mine had an interesting idea about how to build a real
> multilingual wiki. What do you think about it? Would something like
> this work?
>
> ------------------
>
> It could be very nice if the Wiki concept was enhanced with
> multi-language abilities, so people can add their translations to some
> pages. But the problem is that if one of the translations is modified,
> the others must be too...
> Maybe I have an idea: create a database of translations of lines
> (the original ones, not the displayed ones) and not of files.
> So when a page is computed, the translation is done line by line,
> with the untranslated parts left ``as is''.
> So the text file is stored in one or more languages.
> All the possible translations of lines are stored.
> The translation to a language will translate the text file into the
> target language; untranslatable lines will stay untranslated.

The mSQL implementation is partly this way now... lines from the page are
stored in individual rows of a page table. (This is because mSQL cannot
search text blob types.) So it wouldn't be hard to hack in; probably just
add a new column indicating what language the line is in, defaulting to
the local language.

> There is only one thing to add in the user interface:
> the ``translate page'' command.

This might be the tricky part to implement, though...

> It could be usable, just an example:
> - Somebody writes an English page
>   (the text file is in English)
> - Somebody views it as French (but in English on the screen,
>   because it is untranslatable)
> - He translates the page to write it in French WITHOUT
>   changing the line breaking (with a form with many entries).

If I follow, at runtime a form is generated with a TEXTAREA for every line
in the page, along with a second TEXTAREA for translating the line.

> (The text file could be in English or French, it doesn't matter.)
> - The English page is changed: for the French reader there are some
>   lines of English in the French text.
>   (The text file is in English, because that is the language it was
>   edited in.)
> - A Spanish translation is done.
>   (The text file could be in English or Spanish. It could also be in
>   French with some lines in English.)
> - The French page is edited by adding some lines.
>   (The text file is in French and English.)
> - The Spanish and English readers see some text in French.
> - The French page is edited to remove some lines.
>   The lines vanish in the other languages.
>   (The text file is in French and English.)
> The point is that the translator does it line by line and that lines
> are not too short.
> A plus is that translation is done for all the files;
> it is not linked to one file.

It's interesting to think that all language versions of the pages are
stored together, but in the long run keeping all the versions in sync
would be hard. However, if the Wiki were not too big, and the focus were
on only two languages, it might not be that hard.

sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

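
The per-line storage Steve describes suggests a simple data model for this scheme. Here is a minimal PHP sketch, under stated assumptions: page_lines mirrors the per-row mSQL layout mentioned above, and line_translations is a hypothetical table keyed on (original line, target language); both table names are made up for illustration.

```php
<?php
// Sketch only: render a page line by line, substituting stored
// translations and leaving untranslated lines as-is. The tables
// page_lines and line_translations are hypothetical.
function render_page_in($dbh, $pagename, $lang)
{
    $out  = array();
    $name = mysql_escape_string($pagename);
    $res  = mysql_query("SELECT lineno, content FROM page_lines"
                        . " WHERE pagename='$name' ORDER BY lineno", $dbh);
    while ($row = mysql_fetch_assoc($res)) {
        $line = mysql_escape_string($row['content']);
        $t = mysql_query("SELECT translated FROM line_translations"
                         . " WHERE original='$line'"
                         . " AND lang='" . mysql_escape_string($lang) . "'", $dbh);
        if ($t && ($trow = mysql_fetch_row($t)))
            $out[] = $trow[0];          // translated line
        else
            $out[] = $row['content'];   // no translation yet: fall through
    }
    return $out;
}
?>
```

Note that this exhibits exactly the failure mode Steve points out: nothing keeps translations in sync when an original line is edited; the edited line simply stops matching and falls back to the source language.
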
From: Steve W. <sw...@wc...> - 2000-11-16 22:01:41

Today SF closed all my bug reports without resolving them, though the
MySQL server is still unstable/unusable. I have filed a newer, more
detailed report in case there was some confusion, plus a link to a test
site that uses MySQL on SF so they could see the problems themselves. (The
live site is still running on DBM files.)

sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

---------- Forwarded message ----------
Date: Thu, 16 Nov 2000 13:34:21 -0800
From: no...@so...
To: sw...@wc..., no...@so..., st...@so...
Subject: [Bug #122627] MySQL server still unusable

Bug #122627 was updated on 2000-Nov-16 13:34. Here is a current snapshot
of the bug.

Project: SourceForge
Category: Project Database Server
Status: Open
Resolution: None
Bug Group: PHP Programming
Priority: 5
Summary: MySQL server still unusable

Details: Thanks for closing all my bugs, but the problem was not resolved.
Here is a link to demonstrate the problem:

http://phpwiki.sourceforge.net/test/phpwiki/

I get this error:

    Warning: MySQL Connection Failed: Lost connection to MySQL server
    during query in /home/groups/phpwiki/htdocs/test/phpwiki/lib/mysql.php
    on line 32

    WikiFatalError
    Cannot establish connection to database, giving up.
    MySQL error:

That is not the only error, though; when it does connect to MySQL I get a
missing file error, as previously reported:

    Inserting page AddingPages, version 1 from text file

    WikiFatalError
    Error writing page 'AddingPages'
    MySQL error: Can't find file: './phpwiki/wiki.frm' (errno: 24)

Also, I cannot connect from the command line on orbital:

    [wainstead@orbital lib]$ !mysql
    mysql -h moby.p.sourceforge.net -u phpwiki -pnotshown phpwiki
    ERROR 2013: Lost connection to MySQL server during query

These problems have been going on since last Friday. There were no
problems in the months before. Something is wrong with the MySQL server
installation.

For detailed info, follow this link:
http://sourceforge.net/bugs/?func=detailbug&bug_id=122627&group_id=1

From: Arno H. <aho...@in...> - 2000-11-16 16:01:55

Jeff,

> $pagehash["content"] = preg_split('/[ \t\r]*\n/', chop($content));
>
> The problem seems to be that preg_split returns an empty array when the
> regex does not match anything in the string.

I cannot reproduce this on my machine. My PHP version (4.00) returns an
array with one entry for the first line.

Which PHP version are you using?

/Arno

From: Jeff M. <jef...@sy...> - 2000-11-16 11:17:11

I've just set up phpwiki 1.1.9 and it seems to be working really well,
apart from one thing. When pages are entered on a single line they don't
get saved to the DB properly; instead it just saves an empty string.

I've tracked this down to line 85 in savepage.php, which is the following:

    $pagehash["content"] = preg_split('/[ \t\r]*\n/', chop($content));

The problem seems to be that preg_split returns an empty array when the
regex does not match anything in the string. I've added the following
lines just after this line, and this solves the problem by checking to see
if the array is empty and, if it is, setting it to the unsplit content:

    if (count($pagehash["content"]) < 1) {
        $pagehash["content"][1] = $content;
    }

Do you see this as a reasonable fix, or is there a better approach?

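
For what it's worth, the behaviour Arno reports is what preg_split() documents: when the delimiter pattern never matches, the whole subject string comes back as a single array element. A quick self-contained check (a sketch, not PhpWiki code):

```php
<?php
// When the delimiter pattern never matches, preg_split() returns the
// whole string as element 0, so a one-line page should survive the split.
$content = "A single-line wiki page with no trailing newline";
$lines   = preg_split('/[ \t\r]*\n/', chop($content));

var_dump($lines);
// Expected (and what Arno reports for PHP 4.00):
//   array(1) { [0] => string(48) "A single-line wiki page with no trailing newline" }

// Jeff's guard is harmless either way: it restores the raw content
// if the split ever does come back empty.
if (count($lines) < 1) {
    $lines[] = $content;
}
?>
```
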
From: Steve W. <sw...@wc...> - 2000-11-14 23:44:01

For your viewing pleasure, here is the bug report I've filed. At this
point I've switched the wiki to DBM files, since there seems to be no
immediate hope of resolution. I'm not too discouraged, since we share a
database server with a lot of other development projects, and a lot of bad
code is probably hitting MySQL.

sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

---------- Forwarded message ----------
Date: Tue, 14 Nov 2000 15:39:11 -0800
From: no...@so...
To: sw...@wc..., no...@so...
Subject: [Bug #122321] Web server cannot connect to database

Bug #122321 was updated on 2000-Nov-13 08:51. Here is a current snapshot
of the bug.

Project: SourceForge
Category: Project Database Server
Status: Open
Resolution: None
Bug Group: PHP Programming
Priority: 5
Summary: Web server cannot connect to database

Details: Hi, for the last few days the MySQL server has been mostly
unavailable for my project (phpwiki). We print out the error whenever
there is some database problem, and in this case the web server is simply
not connecting to MySQL.

http://phpwiki.sourceforge.net/phpwiki/

Also we cannot connect from the command line on orbital:

    [wainstead@orbital phpwiki]$ !727
    mysql -h moby.p.sourceforge.net -u phpwiki -pnotshown phpwiki
    ERROR 2013: Lost connection to MySQL server during query

thanks guys... great article in PHPbuilder on MySQL/Postgresql.

Follow-Ups:

Date: 2000-Nov-13 12:08  By: Wainstead
Comment: This is up again. Thanks.
-------------------------------------------------------
Date: 2000-Nov-14 08:09  By: Wainstead
Comment: This continues to be a major problem.
-------------------------------------------------------
Date: 2000-Nov-14 15:39  By: Wainstead
Comment: Since the database is continually not available, thus making our
site unavailable, I've switched to DBM files for the data store.
-------------------------------------------------------

For detailed info, follow this link:
http://sourceforge.net/bugs/?func=detailbug&bug_id=122321&group_id=1

From: Steve W. <sw...@wc...> - 2000-11-13 20:03:33

After three service requests with the Sourceforge staff, they finally have
shell access to orbital up again; the MySQL server appears stable again;
and I was able to connect with the mysql client and repair the problem
with the wikiscore table. It looks like an internal MySQL file for the
wikiscore table was lost. I dropped the table and recreated it, which
finally solved the errors we were getting.

There is some mention of the problems they've had recently with MySQL in
an article comparing MySQL to Postgresql: http://www.phpbuilder.com/

cheers
sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

From: Markus G. <mg...@bi...> - 2000-11-13 14:14:00

Hello Steve,

I discussed the German translation process with Arno, and we decided that
I will help Arno translate the full phpwiki pages. I hope to get most of
the things translated this week.

Btw, the problems with the German wiki are solved, thanks to Arno and you.

- Markus

--
BITPlan GmbH smart solutions - Meerbuscher Str. 58-60 - 40670
Meerbusch-Osterath
Tel +49 2159 5236-0 - Fax +49 2159 5236-10 - mg...@bi... -
http://www.bitplan.de

Home Office: Markus Guske - Erftstrasse 17 - D-41564 Kaarst
Tel +49 173 946 1880 - Fax +49 2131 769-195 - mg...@gu...

Möchten auch Sie zum Team gehören? [Would you like to join the team, too?]
http://www.bitplan.com/de/Jobs.html

From: Arno H. <aho...@in...> - 2000-11-13 11:31:48

> Arno is indeed working on a German translation. You can see the progress
> either in the CVS repository or pull a nightly build off Sourceforge.
> I'm sure he'd like some help, if he hasn't finished yet.

I am not too eager to translate those pages myself, so any help is greatly
appreciated. As Steve said: check the CVS for the latest version.

/Arno

From: Steve W. <sw...@wc...> - 2000-11-12 19:22:38

Hi Markus,

Arno is indeed working on a German translation. You can see the progress
either in the CVS repository
(http://cvs.sourceforge.net/cgi-bin/cvsweb.cgi/phpwiki/locale/de/?cvsroot=phpwiki&sortby=date)
or pull a nightly build off Sourceforge
(http://phpwiki.sourceforge.net/nightly/phpwiki.nightly.tar.gz). I'm sure
he'd like some help, if he hasn't finished yet.

thx
sw

On Sun, 12 Nov 2000, Markus Guske wrote:

> Hello,
>
> is anybody working on a full German translation for the phpwiki initial
> pages?
>
> Thanks in advance,
>
> - Markus
> --
> BITPlan GmbH smart solutions - Meerbuscher Str. 58-60 - 40670
> Meerbusch-Osterath
> Tel +49 2159 5236-0 - Fax +49 2159 5236-10 - mg...@bi... -
> http://www.bitplan.de
>
> Home Office: Markus Guske - Erftstrasse 17 - D-41564 Kaarst
> Tel +49 173 946 1880 - Fax +49 2131 769-195 - mg...@gu...
>
> Möchten auch Sie zum Team gehören? http://www.bitplan.com/de/Jobs.html
>
> _______________________________________________
> Phpwiki-talk mailing list
> Php...@li...
> http://lists.sourceforge.net/mailman/listinfo/phpwiki-talk

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

From: <Mar...@t-...> - 2000-11-12 18:42:10

Hello,

is anybody working on a full German translation for the phpwiki initial
pages?

Thanks in advance,

- Markus

--
BITPlan GmbH smart solutions - Meerbuscher Str. 58-60 - 40670
Meerbusch-Osterath
Tel +49 2159 5236-0 - Fax +49 2159 5236-10 - mg...@bi... -
http://www.bitplan.de

Home Office: Markus Guske - Erftstrasse 17 - D-41564 Kaarst
Tel +49 173 946 1880 - Fax +49 2131 769-195 - mg...@gu...

Möchten auch Sie zum Team gehören? [Would you like to join the team, too?]
http://www.bitplan.com/de/Jobs.html

From: Steve W. <sw...@wc...> - 2000-11-09 04:23:29

On Tue, 7 Nov 2000, Arno Hollosi wrote:

> I think we should rethink the wikiscore table.
>
> While the used metric may be quite useful in a web-like environment,
> within a wiki it is questionable. The problem is that people usually
> sign their contributions with their WikiName, and thus users'
> wiki-homepages get a very high wikiscore. And generally speaking,
> homepages are not that important. So the wikiscore metric fails.
>
> For more info see MeatballWiki
> http://www.usemod.com/cgi-bin/mb.pl?IndexingScheme
> and look at MostReferencedPages / MostLinkedPages / ShortestPathPages

(The following started as a reply but becomes more and more pedantic as it
goes, so I apologize for the tone... but sometimes you work these things
out in your head as you write.)

This relates to the Semantic Web article on the O'Reilly Network
(http://www.xml.com/pub/2000/11/01/semanticweb/index.html?wwwrrr_rss). The
problem is Wiki does not distinguish between pages: all pages are the same
and have equal meaning, more or less. We are trying to make Wiki (the
program, the machine) give meaning to pages created by humans.

So we try graphing problems: how many pages does this page link to? How
many pages link to this one? How many times has this page been edited?
When was this page last edited? How long ago was this page created? How
many times has this page been viewed? How many degrees of separation are
there between this page and that page? How many link paths are there from
this page to that page (if they do not link directly)? What is the
shortest path from this page to that page? What pages have names similar
to this one?

But the machine cannot know ArnoHollosi is a personal page while
WhyWikiWorks is a discussion, DesignPatterns describes an abstract concept
and DesignPatternsBook describes a textbook published in 1994. So these
approaches (and several more listed by Nicolas Roberts) have shortcomings.
On c2.com they introduced CategoryDesignPatterns and
TopicExtremeProgramming (the Category- and Topic- prefixes) to get around
this problem; in the large, then, any Wiki needs a WikiLibrarian, someone
who sorts, labels and classifies information. An Information Architect.
With a Wiki, everyone who edits pages has to be a WikiLibrarian; i.e.,
it's a community effort.

So we come back to another problem (really, an interesting aspect/feature
but also a usability problem) of a WikiWikiWeb: a lot of the organization
of the information is by social contract. People agree to social
conventions like adding a Category- link at the bottom of their page.

I read an article on Lotus Notes recently that reinforced a perception
I've had for the last year, ever since I read Jon Udell's "Practical
Internet Groupware": it's really hard to get people to adopt new social
conventions, or in the case of Notes, learn new ways of doing things. It
took email a relatively long time to make it into the workplace because
people were used to phones, voice mail, faxes, post-it notes and so on.
(Granted, email penetrated the workplace pretty fast for a new technology,
but then most businesses today still don't have email. My corner deli
doesn't.)

So we can perhaps write a set of guidelines for using a Wiki, include it
in pgsrc/, and trust the universe. We can provide a certain number of
clues to the user through hitcount, wikiscore and so on. But I think our
current model is limited to just that (and any groupware system,
ultimately, is too).

I remember reading Steven Levy's "Hackers: Heroes of the Computer
Revolution," and one of the MIT hackers from the 1960s later remarks that
he couldn't believe what they were trying to do on the hardware of the
era; he felt they had been naive about what they could accomplish on a
PDP-11. Perhaps we are looking at the limitations of a Wiki as well.

> ShortestPath seems to be most interesting but too expensive to compute.
> (unless someone comes up with a good incremental algorithm)

If I read the page correctly, it's the Traveling Salesman problem :-)

> Thus I suggest we change the following for 1.2.0:
> * drop wikiscore table
> * related pages are reduced to incoming/outgoing links which are sorted
>   by hitcount.

Now that I've read your code (finally!), and really understand what it's
doing, I think we might want to keep it after all. All the count (the
number in the parentheses) is, is:

    select the incoming links for this page, and rank them by how many
    pages link to them

    select the outgoing links for this page, and rank them by how many
    pages link to them

but then again, I might be confused once again. I confess that ever since
you added these, I have to stop and reason out what it is they are
doing... which leads to your next question:

> Btw, some people find the terms "incoming/outgoing links" confusing. Is
> there a better way to describe these?

They confuse me too :-) Let me see if I get it right:

----
Five pages that link to this one, that are themselves linked to the most
by other pages:

Five pages this links to, ranked by how many pages link to those pages:

Five most popular pages that either link to this one or are linked to by
this one:
----

I think the trouble is we don't know why ErikBagfors links to the PhpWiki
page. We can guess, is all. I like the idea more and more of conventions
like Category- and Topic-. No one can sort info like a human. We can
alternately provide tools to try to help users be good WikiLibrarians, and
then the machine can infer meaning from the metadata they provide. That
has its own pitfalls, like trying to get everyone to use the same keywords
for <META> tags (so then search engines can infer from that too).

Rather than think about how to make PhpWiki compute meaningful numbers
based on the data, I am going to doodle for a while and think about what
information is going to help users find useful information in a Wiki.
That's what this is all about, in the end. Right off, a more sophisticated
search engine comes to mind. The Meatball Wiki is quite interesting.

cheers
sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

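
Arno's proposal (drop wikiscore; show incoming/outgoing links sorted by hitcount) is straightforward to express against the wikilinks/hitcount schema quoted earlier in the thread. A minimal sketch under that assumption; function names are illustrative, not PhpWiki's actual code:

```php
<?php
// Sketch only: "incoming/outgoing links sorted by hitcount", using the
// wikilinks(frompage, topage) and hitcount(pagename, hits) tables
// quoted earlier in the thread.

// Pages that link to $page, most-viewed first.
function incoming_links($dbh, $page, $limit = 5)
{
    return link_list($dbh, 'frompage', 'topage', $page, $limit);
}

// Pages that $page links to, most-viewed first.
function outgoing_links($dbh, $page, $limit = 5)
{
    return link_list($dbh, 'topage', 'frompage', $page, $limit);
}

function link_list($dbh, $select_col, $match_col, $page, $limit)
{
    $name  = mysql_escape_string($page);
    $query = "SELECT wikilinks.$select_col, hitcount.hits"
           . " FROM wikilinks, hitcount"
           . " WHERE wikilinks.$match_col='$name'"
           . " AND hitcount.pagename=wikilinks.$select_col"
           . " ORDER BY hitcount.hits DESC, hitcount.pagename"
           . " LIMIT $limit";
    $res  = mysql_query($query, $dbh);
    $list = array();
    while ($row = mysql_fetch_row($res))
        $list[] = $row[0];   // page name; $row[1] holds its hit count
    return $list;
}
?>
```

Unlike the wikiscore metric, this needs no incremental maintenance: both lists fall out of a single join per direction.
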
From: Arno H. <aho...@in...> - 2000-11-08 16:04:25

Hi there,

I have committed the new admin structure (the original idea for this is
from Jeff). Admins just use admin.php as the entry point to the wiki.
Please test it out and post your findings here. Note that there are
changes in /templates and /pgsrc as well. I'd appreciate it if translators
send me the necessary updates.

If you are using mySQL you can remove pages now. Admin is not yet complete
-- dumpHTML, convert1.0, and rebuildDBM are missing.

There are changes to the template syntax of ###IF### too. I'll update the
template README soon.

/Arno

From: Arno H. <aho...@in...> - 2000-11-07 11:06:52

I think we should rethink the wikiscore table.

While the used metric may be quite useful in a web-like environment,
within a wiki it is questionable. The problem is that people usually sign
their contributions with their WikiName, and thus users' wiki-homepages
get a very high wikiscore. And generally speaking, homepages are not that
important. So the wikiscore metric fails.

For more info see MeatballWiki
http://www.usemod.com/cgi-bin/mb.pl?IndexingScheme
and look at MostReferencedPages / MostLinkedPages / ShortestPathPages.

ShortestPath seems to be the most interesting, but too expensive to
compute (unless someone comes up with a good incremental algorithm).

Thus I suggest we change the following for 1.2.0:

* drop the wikiscore table
* related pages are reduced to incoming/outgoing links, which are sorted
  by hitcount

Btw, some people find the terms "incoming/outgoing links" confusing. Is
there a better way to describe these?

/Arno

From: Sandino A. <sa...@sa...> - 2000-11-07 05:10:46

Steve Wainstead wrote:
>
> What would be infinitely better is if we used some kind of generic DB
> library, like Perl's DBI/DBD.

PEAR DB (or any other db abstraction lib) can be used, but most of the
time the DB portability problem resides in the SQL differences between
databases, so a DBMS-specific driver is always needed.

--
Sandino Araico Sánchez
Diga no a la piratería, use software libre. [Say no to piracy, use free
software.]

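
Sandino's point, that an abstraction layer can unify the calling convention while each backend still supplies its own driver with backend-specific SQL, can be sketched in PHP 4 style. The class and method names below are illustrative, not PhpWiki's actual API (PhpWiki kept per-backend files such as lib/mysql.php, mentioned elsewhere in this thread):

```php
<?php
// Sketch only: the wiki core calls a uniform interface; the
// backend-specific SQL lives inside each driver class.
class WikiDB_mysql
{
    var $dbh;

    function connect($host, $user, $pass, $db)
    {
        $this->dbh = mysql_connect($host, $user, $pass);
        return $this->dbh && mysql_select_db($db, $this->dbh);
    }

    function retrieve_page($pagename)
    {
        // MySQL-specific query; a pgsql or mSQL driver would differ here,
        // e.g. mSQL must store page lines in separate rows (see above).
        $res = mysql_query("SELECT * FROM wiki WHERE pagename='"
                           . mysql_escape_string($pagename) . "'", $this->dbh);
        return $res ? mysql_fetch_assoc($res) : false;
    }
}
?>
```
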
From: Steve W. <sw...@wc...> - 2000-11-07 04:19:07

Hi Arno,

There is a request for more character support right now on
http://phpwiki.sourceforge.net:80/phpwiki/index.php?PhpWikiBrainstorm.
It's something I think you're more qualified for, since I only use the
pithy little American character set. ;-)

I think it's a good idea, but there must be a cleaner way than just
listing all the characters in the regexp.

cheers
sw

...............................ooo0000ooo.................................
Hear FM quality freeform radio through the Internet: http://wcsb.org/
home page: www.wcsb.org/~swain

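
One cleaner way than enumerating every accented letter is to match byte ranges. A sketch, assuming a Latin-1 (ISO-8859-1) locale, where \xC0-\xDE covers the accented capitals and \xDF-\xFF the accented lower-case letters; the pattern PhpWiki actually adopted may differ:

```php
<?php
// Sketch only: extend the WikiName character classes with Latin-1
// byte ranges instead of listing each accented character.
$U = 'A-Z\xc0-\xde';   // upper-case, including Latin-1 accented capitals
$L = 'a-z\xdf-\xff';   // lower-case, including Latin-1 accented letters

// Two or more Capitalized words run together make a WikiName,
// so both HomePage and a name like SchöneSeite should match.
$text = "See HomePage and SchöneSeite for details.";
preg_match_all("/\b(?:[$U][$L]+){2,}\b/", $text, $matches);
print_r($matches[0]);
?>
```
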
From: Arno H. <aho...@in...> - 2000-11-06 11:11:26

Hi all,

a friend of mine had an interesting idea about how to build a real
multilingual wiki. What do you think about it? Would something like this
work?

------------------

It could be very nice if the Wiki concept was enhanced with multi-language
abilities, so people can add their translations to some pages. But the
problem is that if one of the translations is modified, the others must be
too...

Maybe I have an idea: create a database of translations of lines (the
original ones, not the displayed ones) and not of files. So when a page is
computed, the translation is done line by line, with the untranslated
parts left ``as is''. So the text file is stored in one or more languages.
All the possible translations of lines are stored. The translation to a
language will translate the text file into the target language;
untranslatable lines will stay untranslated.

There is only one thing to add in the user interface: the ``translate
page'' command.

It could be usable, just an example:

- Somebody writes an English page.
  (The text file is in English.)
- Somebody views it as French (but sees English on the screen, because it
  is untranslatable).
- He translates the page to write it in French WITHOUT changing the line
  breaking (with a form with many entries).
  (The text file could be in English or French, it doesn't matter.)
- The English page is changed: for the French reader there are some lines
  of English in the French text.
  (The text file is in English, because that is the language it was
  edited in.)
- A Spanish translation is done.
  (The text file could be in English or Spanish. It could also be in
  French with some lines in English.)
- The French page is edited by adding some lines.
  (The text file is in French and English.)
- The Spanish and English readers see some text in French.
- The French page is edited to remove some lines. The lines vanish in the
  other languages.
  (The text file is in French and English.)

The point is that the translator does it line by line and that lines are
not too short. A plus is that translation is done for all the files; it is
not linked to one file.
