this is my first posting, so: "Hello to all!"
I'd like to run a shell script via a cron job that dumps the wiki as XHTML
and uploads it to my internet provider's webspace via FTP.
That way reading the wiki can happen on "real" webspace, while editing,
which will only be done by about 4-5 people, can happen on my PC at home.
So my question is: how can I do the dump from a shell script?
Is that possible?
Thanks for any hints
Greets from Hamburg, Germany
From: Jim Cheetham <jim@in...> - 2004-08-08 10:02:03
On Aug 8, 2004, at 9:28 PM, rabautz wrote:
> So my question is: how can I do the dump from a shell script?
> Is that possible?
Look at the 'wget' command, which will make a series of requests to a
website, following all the linked pages, and save the results in a
location of your choice.
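A minimal sketch of that wget approach, where the wiki URL and output
directory are placeholders you would replace with your own:

```shell
#!/bin/sh
# Sketch: mirror a wiki with wget. WIKI_URL and OUT_DIR are placeholders.
# --mirror recurses through all linked pages; --convert-links rewrites the
# links so the copy is browsable as static files; --html-extension appends
# .html so plain webspace serves the pages correctly.
WIKI_URL="http://localhost/wiki/"        # placeholder: your wiki's address
OUT_DIR="${TMPDIR:-/tmp}/wiki-dump"      # placeholder: where to store the dump

CMD="wget --mirror --convert-links --html-extension -P $OUT_DIR $WIKI_URL"
echo "$CMD"   # printed here instead of executed; run it once the URL is real
```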
There are other options, such as using 'lynx --dump' if you know the
names of the pages you want.
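If the page names are known, the lynx route and the FTP upload could be wired
into one cron script. Here is a rough sketch only: every URL, page name, host,
and directory below is a placeholder, the upload assumes 'lftp' is installed,
and note that 'lynx -dump' produces rendered plain text rather than XHTML (for
a true XHTML copy the wget route above is closer to what you want):

```shell
#!/bin/sh
# Sketch of the whole cron job: dump known pages, then push them via FTP.
# All names below (URL, pages, host, dirs) are placeholders for illustration.
DRY_RUN=${DRY_RUN:-1}                       # set DRY_RUN=0 to run for real

WIKI_URL="http://localhost/wiki"            # placeholder wiki address
PAGES="FrontPage RecentChanges SandBox"     # placeholder page names
OUT_DIR="${TMPDIR:-/tmp}/wiki-dump"         # local staging directory
FTP_HOST="ftp.example.com"                  # placeholder provider host
FTP_DIR="/htdocs/wiki"                      # placeholder remote directory

run() {             # echo each command; execute it only when not dry-running
    echo "+ $*"
    if [ "$DRY_RUN" = "0" ]; then "$@"; fi
}

mkdir -p "$OUT_DIR"
for page in $PAGES; do                      # one lynx dump per known page
    run sh -c "lynx -dump '$WIKI_URL/$page' > '$OUT_DIR/$page.html'"
done

# lftp's 'mirror -R' pushes the local directory up to the FTP server in one go
run lftp -e "mirror -R $OUT_DIR $FTP_DIR; quit" "$FTP_HOST"
```

Installed in the crontab with something like `0 3 * * * /home/me/dump-wiki.sh`,
this would run nightly at 03:00.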