From: Frank S. <fra...@rn...> - 2003-11-19 09:27:12
> >>> "ser...@ho..." 11/18/03 19:05 >>> > > This brings up a question about backing up a Wiki. Perhaps more of a > question that could be targetted to MySQL (I'm in process of > learning morea > bout MySQL nuances), but I'm wondering if anyone here has had any > experiences and recommended methods for cloning an entire > PHPWiki, in a wayt > hat is not specific to, say, MySQL (for example, I may want > to make backupc > opies in another database type and then also be able to > recover from thatb > ackup copy in case someone starts spamming my Wiki). > > Thanks for any suggestions... The first way that springs to mind is a full zipdump. In case you didn't already know, it creates a zip file containing one file per page. In each file, in RFC 822 format, are all the stored revisions. The last time I tried restoring a wiki in this fashion (1.3.5pre as of June-ish this year) I had to do some hacking of loadsave.php to get rid of the conflict checking (I wanted a _proper_ restore, with all the old revisions intact). Unfortunately I didn't document the changes I made. (I subsequently reverted them - they weren't the sort of thing you'd want to have on a production wiki.) PhpWikiAdministration should demonstrate the correct magic URL to use. What I did (apart from the loadsave.php hacking) was create the new phpwiki location (in my case a mySQL db), start it up, manually kill all the pages automatically created by the pgsrc stuff, and create a single page with the magic URL. Then I pointed the textbox to the location of the zipdump & hit go. A good while later I had my (rather large) wiki living in its new home. One caveat is that you should, for a large wiki, set DEBUG to false in your index.php or you'll run out of memory after 150-odd pages. If memory serves correctly. It might have been the zipdump or the xhtmldump that had that behaviour... frank |