From: Sergio T. <ser...@ho...> - 2003-11-19 23:56:23
Thank you, Frank, for the suggestion. I had totally missed the administrative side of the Wiki after I set it up recently (I must have been living on a different planet, because of course I was aware of the administrator name and password when I had to modify index.php, but as a PhpWiki newbie I hadn't realized I should simply try logging in with the admin username and password stored in index.php; for some reason I thought they were only used for something "behind the scenes"). Once logged in, I did indeed see the Admin button and found the page with options such as Zip Dump and Zip Snapshot.

I recently tried all of the dump and snapshot options and found something interesting. It may already be known, but I hope you don't mind me posting the observation. With regard to the XHTML Zip Snapshot option: I tried it a few different times from my Wiki, and after downloading the zip file to my machine running Mac OS X 10.2 (Jaguar), I tried to unzip it from the command line and got the following errors and warnings:

-------------------------------------------
% unzip wikihtml.zip
Archive:  wikihtml.zip
Created by PhpWiki 1.3.4
warning [wikihtml.zip]:  656 extra bytes at beginning or within zipfile
  (attempting to process anyway)
file #1:  bad zipfile offset (local header sig):  656
  (attempting to re-compensate)
  extracting: AddingPages.html
  extracting: AllPages.html
  extracting: AllUsers.html
  .
  .
  .
file #17:  bad zipfile offset (local header sig):  133001
  (attempting to re-compensate)
file #17:  bad zipfile offset (local header sig):  133001
file #18:  bad zipfile offset (local header sig):  143172
file #19:  bad zipfile offset (local header sig):  151260
  .
  .
  .
etc., etc.
-------------------------------------------

The version of unzip that ships with Mac OS X 10.2.8 is:

UnZip 5.50 of 17 February 2002, by Info-ZIP

I haven't tried unzipping the same XHTML snapshot on another machine, such as one running Windows or Linux. Has anyone else had problems like this? Could it be that something needs to be configured on the server running the Wiki?
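In case it helps anyone else poke at this, here is a minimal diagnostic sketch in Python (my own addition, nothing PhpWiki ships; the script name is made up, and the idea that stray output got written into the archive stream is only a guess). It compares where the zip's central directory says each entry should start with where the "PK\x03\x04" local headers actually sit:

# zip_offsets.py -- a rough diagnostic, not a fix: compare the local-header
# offsets recorded in the central directory with the positions where
# "PK\x03\x04" signatures actually occur in the file.  Assumes the stray
# bytes are extra output written into the archive stream, which is only a
# guess on my part.
import sys
import zipfile

def check(path):
    with open(path, "rb") as f:
        data = f.read()
    # Offsets as recorded in the central directory (what unzip expects).
    # Python's zipfile already compensates for junk *before* the archive,
    # so any mismatch shown here is drift *within* the file.
    with zipfile.ZipFile(path) as zf:
        expected = [(i.filename, i.header_offset) for i in zf.infolist()]
    # Offsets where local file headers really are.  (A signature that
    # happens to occur inside compressed data would throw this off.)
    actual = []
    pos = data.find(b"PK\x03\x04")
    while pos != -1:
        actual.append(pos)
        pos = data.find(b"PK\x03\x04", pos + 4)
    for (name, want), have in zip(expected, actual):
        flag = "" if want == have else "  <-- off by %d" % (have - want)
        print("%-30s expected %8d  found %8d%s" % (name, want, have, flag))

if __name__ == "__main__":
    check(sys.argv[1])

If the mismatch stays at a constant 656 bytes, the junk is only at the front of the file and unzip's re-compensation should cope; if it grows from entry to entry, the extra bytes are scattered through the archive, which would point at the zip writer on the server rather than at the download.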
Also, I noticed that the XHTML Dump behaves more like the Zip Snapshot than like the Zip Dump, in that only the current version of each Wiki page is dumped as an XHTML file.

I probably need to upgrade PhpWiki soon given the recent changes (maybe 1.3.5 or 1.3.6 has some of these dump and zip issues addressed? Has anyone tried?).

Thanks,
-Serj


>From: "Frank Shearar" <fra...@rn...>
>To: <php...@li...>
>Subject: RE: [Phpwiki-talk] Wiki spam, an emerging trend?
>Date: Wed, 19 Nov 2003 09:27:02 -0000
>
> > >>> "ser...@ho..." 11/18/03 19:05 >>>
> >
> > This brings up a question about backing up a Wiki. Perhaps more of a
> > question that could be targeted to MySQL (I'm in the process of
> > learning more about MySQL nuances), but I'm wondering if anyone here
> > has had any experience with, or can recommend methods for, cloning an
> > entire PhpWiki in a way that is not specific to, say, MySQL (for
> > example, I may want to make backup copies in another database type
> > and then also be able to recover from that backup copy in case
> > someone starts spamming my Wiki).
> >
> > Thanks for any suggestions...
>
>The first way that springs to mind is a full zipdump. In case you didn't
>already know, it creates a zip file containing one file per page. In each
>file, in RFC 822 format, are all the stored revisions.
>
>The last time I tried restoring a wiki in this fashion (1.3.5pre, as of
>June-ish this year) I had to do some hacking of loadsave.php to get rid of
>the conflict checking (I wanted a _proper_ restore, with all the old
>revisions intact). Unfortunately I didn't document the changes I made. (I
>subsequently reverted them - they weren't the sort of thing you'd want to
>have on a production wiki.)
>
>PhpWikiAdministration should demonstrate the correct magic URL to use. What
>I did (apart from the loadsave.php hacking) was create the new PhpWiki
>location (in my case a MySQL db), start it up, manually kill all the pages
>automatically created by the pgsrc stuff, and create a single page with
>the magic URL. Then I pointed the textbox at the location of the zipdump
>and hit go. A good while later I had my (rather large) wiki living in its
>new home.
>
>One caveat: for a large wiki you should set DEBUG to false in your
>index.php, or you'll run out of memory after 150-odd pages. If memory
>serves correctly. It might have been the zipdump or the xhtmldump that had
>that behaviour...
>
>frank
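P.S. Frank's description of the full zipdump (one file per page, all stored revisions in RFC 822 format) suggests a cheap way to sanity-check a backup before trusting it. The sketch below is guesswork about the layout: I'm assuming a page with several revisions comes out as a MIME multipart message and that any page/version metadata sits in the Content-Type parameters of each part; the real dump may name things differently.

# dump_summary.py -- peek inside a full zipdump and count revisions per
# page.  The layout is guesswork on my part: assumes each member parses as
# an RFC 822 message and that a page with several revisions comes out as a
# MIME multipart; adjust after looking at a real dump.
import sys
import zipfile
from email import message_from_bytes

def summarize(path):
    with zipfile.ZipFile(path) as zf:
        for name in zf.namelist():
            msg = message_from_bytes(zf.read(name))
            # One sub-part per revision if multipart, otherwise a single
            # revision stored as one flat message.
            parts = msg.get_payload() if msg.is_multipart() else [msg]
            print("%-40s %d revision(s)" % (name, len(parts)))
            for part in parts:
                # Show the Content-Type parameters of each part (skipping
                # the type itself); page/version metadata may live there,
                # but the exact parameter names are a guess.
                for key, value in (part.get_params() or [])[1:]:
                    print("    %s=%s" % (key, value))

if __name__ == "__main__":
    summarize(sys.argv[1])

Comparing those revision counts against what the wiki itself shows would at least confirm that the dump captured full page histories and not just the current versions.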