From: John K. <jo...@ke...> - 2003-11-10 14:49:27
At 1:05 pm +0100 10/11/03, Oliver Betz wrote:

> > First blogs, and now wikis. There ain't no justice.
>
> Ack. Another reason for some kind of user management/authentication.
>
> I'm also afraid of vandalism, spam, illegal content, etc.
>
> Having some small and not very active wikis for small groups, there
> is no large community to check for changes, and sometimes I don't
> visit the pages for weeks.

I made a mod for 1.3.2 that I use on all my wiki installations. Whenever someone alters a page, the server emails me the new page text, saying who changed it and adding a link to the diff page:

http://phpwiki.sourceforge.net/phpwiki/EmailNotificationHack

I just visited that page and note that some kind persons have updated it for 1.3.3.

I find it a very useful security add-on, especially for my more open wiki sites. My sixth form college site took a beating a few weeks ago from a disgruntled student writing rude words on a dozen or so pages. Because I got immediate notification, I'd started reverting the changes to the first few pages before he'd even finished the later ones! As you can imagine, he gave up fairly promptly, seeing that his clever alterations had all disappeared.

I also find it handy for monitoring submissions from wiki novices who maybe haven't got their heads round either the syntax or the structure of the site - I can quickly amend their changes before anyone sees them.

Perhaps this idea might be included in the 1.4 distribution in some way? It gives people who've not encountered wikis before, and feel a little nervous, a reassuring safety blanket. In practice I find my wikis are rarely abused.

John.
--
-------------------------------------------
01274 581519 / 07944 755613
jo...@ke... / http://www.kershaw.org
AOL johnkershaw / Y! & MSN john_m_kershaw
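For anyone curious about the mechanics before reading the patch, a minimal sketch of such a notification hook follows. All names here are illustrative assumptions, not John's actual code: the hook function, the constants, and the diff URL layout are hypothetical; see the EmailNotificationHack page for the real 1.3.x patch.

<?php
// Sketch of an edit-notification hook (hypothetical names; the real
// patch lives on the EmailNotificationHack page linked above).
define('NOTIFY_EMAIL', 'admin@example.org');        // where alerts go
define('WIKI_BASE_URL', 'http://example.org/wiki'); // this wiki's URL

function notify_page_saved($pagename, $newtext, $author) {
    // Link straight to the diff so the admin can review in one click.
    $diff_url = WIKI_BASE_URL . '/index.php?pagename='
              . rawurlencode($pagename) . '&action=diff';
    $subject = "[Wiki] $pagename edited by $author";
    $body = "Page:   $pagename\n"
          . "Author: $author\n"
          . "Diff:   $diff_url\n\n"
          . $newtext;
    // mail() is PHP's built-in; it needs a working sendmail/SMTP setup.
    mail(NOTIFY_EMAIL, $subject, $body);
}
?>

The hook would be called from wherever the wiki commits a page revision, with the new text and the author taken from the current request.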
From: Sergio T. <ser...@ho...> - 2003-11-18 19:06:05
This brings up a question about backing up a wiki. Perhaps it's more of a question that should be targeted at MySQL (I'm in the process of learning more about MySQL's nuances), but I'm wondering if anyone here has experience with, and can recommend methods for, cloning an entire PhpWiki in a way that is not specific to, say, MySQL. For example, I may want to make backup copies in another database type, and then also be able to recover from that backup copy in case someone starts spamming my wiki.

Thanks for any suggestions...

-Serj
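One database-independent route (anticipating Frank's zip-dump answer below) is to fetch the wiki's own zip dump over HTTP on a schedule. A sketch, assuming the PhpWiki 1.3.x action name ?action=zip&include=all for a full dump - check your own PhpWikiAdministration page for the exact URL, and note the dump may require an admin login depending on configuration:

<?php
// Sketch: pull a database-independent backup by fetching the wiki's
// zip dump over HTTP (run this from cron). URL layout is an assumption
// based on PhpWiki 1.3.x; verify against PhpWikiAdministration.
$wiki     = 'http://example.org/wiki/index.php';
$dump_url = $wiki . '?action=zip&include=all';  // all pages, all revisions

$zip = file_get_contents($dump_url);
if ($zip === false) {
    die("Could not fetch zip dump from $dump_url\n");
}
$file = 'wikidump-' . date('Ymd') . '.zip';
file_put_contents($file, $zip);
echo "Saved " . strlen($zip) . " bytes to $file\n";
?>

Because the dump is just one RFC 822-style text file per page inside a zip, it restores into any backend PhpWiki supports, not only MySQL.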
From: Frank S. <fra...@rn...> - 2003-11-19 09:27:12
> >>> "ser...@ho..." 11/18/03 19:05 >>> > > This brings up a question about backing up a Wiki. Perhaps more of a > question that could be targetted to MySQL (I'm in process of > learning morea > bout MySQL nuances), but I'm wondering if anyone here has had any > experiences and recommended methods for cloning an entire > PHPWiki, in a wayt > hat is not specific to, say, MySQL (for example, I may want > to make backupc > opies in another database type and then also be able to > recover from thatb > ackup copy in case someone starts spamming my Wiki). > > Thanks for any suggestions... The first way that springs to mind is a full zipdump. In case you didn't already know, it creates a zip file containing one file per page. In each file, in RFC 822 format, are all the stored revisions. The last time I tried restoring a wiki in this fashion (1.3.5pre as of June-ish this year) I had to do some hacking of loadsave.php to get rid of the conflict checking (I wanted a _proper_ restore, with all the old revisions intact). Unfortunately I didn't document the changes I made. (I subsequently reverted them - they weren't the sort of thing you'd want to have on a production wiki.) PhpWikiAdministration should demonstrate the correct magic URL to use. What I did (apart from the loadsave.php hacking) was create the new phpwiki location (in my case a mySQL db), start it up, manually kill all the pages automatically created by the pgsrc stuff, and create a single page with the magic URL. Then I pointed the textbox to the location of the zipdump & hit go. A good while later I had my (rather large) wiki living in its new home. One caveat is that you should, for a large wiki, set DEBUG to false in your index.php or you'll run out of memory after 150-odd pages. If memory serves correctly. It might have been the zipdump or the xhtmldump that had that behaviour... frank |
From: Sergio T. <ser...@ho...> - 2003-11-19 23:56:23
Thank you, Frank, for the suggestion. I had totally missed the administrative side of the wiki after I set it up recently. (I must have been living on a different planet: of course I was aware of the administrator name and password when I had to modify index.php, but as a PhpWiki newbie I hadn't realized that I should simply try logging in with the admin username and password stored in index.php - I had thought for some reason that they were used for something else "behind the scenes".) Once logged in, I did indeed see the Admin button, and then found the page containing options such as Zip Dump and Zip Snapshot.

I just recently tried all of the dump and snapshot options, and found something which may already be known, but I hope you don't mind me posting the observation. With the XHTML Zip Snapshot option, I made a few dumps from my wiki, and after receiving the zip file on my machine running Mac OS X 10.2 (Jaguar), I tried to unzip it from the command line and got the following errors and warnings:

-------------------------------------------
% unzip wikihtml.zip
Archive: wikihtml.zip
Created by PhpWiki 1.3.4
warning [wikihtml.zip]: 656 extra bytes at beginning or within zipfile
  (attempting to process anyway)
file #1: bad zipfile offset (local header sig): 656
  (attempting to re-compensate)
 extracting: AddingPages.html
 extracting: AllPages.html
 extracting: AllUsers.html
.
.
.
file #17: bad zipfile offset (local header sig): 133001
  (attempting to re-compensate)
file #17: bad zipfile offset (local header sig): 133001
file #18: bad zipfile offset (local header sig): 143172
file #19: bad zipfile offset (local header sig): 151260
.
.
.
etc., etc.
-------------------------------------------

The version of unzip that ships with Mac OS X 10.2.8 is:

UnZip 5.50 of 17 February 2002, by Info-ZIP

I haven't tried to unzip the same XHTML file on another machine running Windows or Linux. Has anyone else had problems like this? Could it be something that needs to be configured on the server running the wiki?

Also, I noticed that the XHTML Dump is really more like the Zip Snapshot than the Zip Dump, in that only the current version of each wiki page is dumped as an XHTML file.

I probably need to upgrade PhpWiki soon, given the recent changes (maybe 1.3.5 or 1.3.6 has some of these dump and zip issues addressed? Has anyone tried?).

Thanks,

-Serj
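For what it's worth, a plausible (unconfirmed) explanation for the "656 extra bytes at beginning" warning is that something was written to the HTTP response before the zip data: stray PHP notices or whitespace, for example, which would fit Frank's earlier DEBUG caveat. A generic guard for serving a zip cleanly looks like the sketch below - this is illustrative, not PhpWiki's actual code:

<?php
// Generic sketch: serving a zip over HTTP without corrupting it.
// Any bytes emitted before the archive (PHP notices, a BOM, stray
// whitespace) show up as "extra bytes at beginning" in unzip, and
// shift every local header offset by the same amount.
function send_zip($path) {
    while (ob_get_level() > 0) {
        ob_end_clean();             // discard anything already buffered
    }
    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="'
           . basename($path) . '"');
    readfile($path);                // stream the raw zip bytes
    exit;                           // nothing may follow the archive
}
?>

If that is indeed the cause here, turning off DEBUG (so no notices are printed into the response) would explain why some installations produce clean archives and others don't.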