Hi, all -
I just started using PHPWiki and would like to know how to do automated backups. In the hope of making this easier for myself, I chose to use flatfile storage. I have the wiki installed on Linux, and I'm fairly comfortable with zip and ruby, so as long as I understand what I'm doing, I'm fairly confident I can succeed.
However, I noticed that the standard backups PHPWiki provides are not just zip archives of the files in the flatfile store; each document gets a MIME-encoded file containing (I think) each major revision of that document.
Is there some kind of command-line or programmatic interface I can plug into to run the backup from a cron job, for instance? If so, will it be just as self-evident how to do this if I use MySQL for storage instead of flat files?
If nothing else, I suppose I can zip up the data directory and its subdirectories. If I ever need to restore them, I can unzip them directly to the file system. Would that work? Or is there a better way?
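To make that concrete, here's the sort of crontab entry I have in mind (the paths are placeholders for wherever the wiki actually lives, and none of this is tested):
# nightly at 2:00; note the \%, since cron treats a bare % as a newline
0 2 * * * cd /var/www/phpwiki && zip -qr /backup/wikidata-$(date +\%F).zip data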
Thanks for any help.
Keith
P.S. Here's a little information about the structure of the data directory, in case anyone's interested...
The data directory has the following subdirectories:
latest_ver - latest_versions is the only file in this directory; it seems to record the current version number (and some other data unknown to me) for each page. This was the information stored for one document: i:2;s:28:"Running DNC on a Windows Box";
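(If I had to guess, that looks like PHP serialize() output: i:2 would be the integer version number, and s:28:"..." the 28-character page name.)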
links - has one file per document. Here's the hex output of one of them, only 6 bytes long:
od -x Running+DNC+on+a+Windows+Box
0000000 3a61 3a30 7d7b
0000006
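(Those six bytes are the ASCII string a:0:{}, which appears to be a serialized empty PHP array, i.e. no outgoing links recorded for that page.)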
page_data - these files are binary; I suppose they have control information since the actual text is stored elsewhere.
ver_data - appears to contain a separate copy for each version of each file, e.g.:
Running+DNC+on+a+Windows+Box--1
Running+DNC+on+a+Windows+Box--2
Running+DNC+on+a+Windows+Box--3
Thanks,
Keith
Joel -
Thanks for the help. A couple more questions:
> With a flatfile or dba db it would be as easy
> (easier, probably) to just back up the actual
> databases in your nightly backup. That's
> probably also true for the MySQL table files.
In a flat file system, would you suggest backing up the following?
the data directory and its subdirectories
the config.ini
Anything else? Also, I have a separate directory that contains images for the wiki pages; I'd back that up too. That raises another question, I suppose: is there a way to upload an image for use in a wiki page through the wiki software itself? I do it manually now, uploading the file to a directory on the web server and inserting an http link in the wiki page.
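For what it's worth, the one-liner I'd try for that set is something like this (all three paths are guesses on my part):
# archive the data directory, config, and images in one dated tarball
tar -czf /backup/phpwiki-$(date +%F).tar.gz /var/www/phpwiki/data /var/www/phpwiki/config.ini /var/www/phpwiki/images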
Thanks,
Keith
Just as an addendum, I found that I could automate the backup by executing the following non-interactive command (the URL is the argument to the wget command; I mention this in case the page formatting here puts them on separate lines):
wget "http://localhost/phpwiki/index.php/PhpWikiAdministration?action=dumpserial&directory=/tmp/wikidump"
Running it triggers the dump and saves the HTML response in a file in the current directory, which I renamed to x.html and viewed in a browser.
It didn't seem to matter that I wasn't logged in as admin. However, I got an error, which I am posting in a separate thread.
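Putting it together, I'm picturing a small script along these lines (same URL and paths as above; untested):
#!/bin/sh
# hypothetical nightly dump-and-archive script; URL and paths are assumptions
DUMP=/tmp/wikidump
wget -q -O /dev/null "http://localhost/phpwiki/index.php/PhpWikiAdministration?action=dumpserial&directory=$DUMP"
tar -czf /backup/wikidump-$(date +%F).tar.gz -C "$DUMP" .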
You can do a wiki dump directly if you're already logged in as the admin. If your wiki is at www.example.com, then
www.example.com/index.php/PhpWikiAdministration?action=dumpserial&directory=/tmp/wikidump
will do a dump into /tmp/wikidump, which you could have a cron script tar up. Something along those lines would probably work.
With a flatfile or dba db it would be as easy (easier, probably) to just back up the actual databases in your nightly backup. That's probably also true for the MySQL table files.
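For MySQL, though, a logical dump is usually safer than copying the table files out from under a running server; something along these lines (database name and user are placeholders):
# -p prompts for a password; for cron, credentials would go in ~/.my.cnf instead
mysqldump -u wikiuser -p phpwiki > /backup/phpwiki-$(date +%F).sql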