Hello there,

BackupPC is my brave friend for my notebook backups. Now I want to
use it for my webserver too, but looking for FTP support I only
found some old devel posts and the notice in the roadmap.

A good starting point might be this piece of code:
http://reoback.sourceforge.net/

REOBack already does incremental backups via FTP and already has all
the parse-ls logic (see sub backupMisc {} and sub scanDir {} in the
sources). The code is understandable and the license is GPL.
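
To illustrate what I mean by parse-ls logic, here is a quick
hypothetical sketch of mine (not REOBack's actual code) that pulls
type, size, mtime and name out of one unix-style FTP listing line:

    # Hypothetical sketch, not REOBack's code: parse one unix-style
    # listing line, eg. "-rw-r--r--  1 www www 14312 May 18 01:17 index.html".
    # Real parse-ls code has to cope with several other listing formats.
    sub parse_ls_line {
        my ($line) = @_;
        return undef unless $line =~
            /^([-dl])\S+\s+\d+\s+\S+\s+\S+\s+(\d+)\s+(\w+\s+\d+\s+[\d:]+)\s+(.+)$/;
        return { type => $1, size => $2, mtime => $3, name => $4 };
    }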

Perhaps I'll code it some rainy day, or perhaps someone else will
appreciate this info.

love and light!
aexl

PS: I'm not on the list, so please reply via email only...
PPS: Before coding I'll have to understand why we would have to tar
the files...
> Probably the only choice with wget is to fetch all the files
> (either via ftp or http) into a temporary directory, then run
> tar on that directory and pipe it into BackupPC_tarExtract.



[BackupPC-devel] Re: Backup via FTP
From: Craig Barratt <cbarratt@us...> - 2004-05-18 01:17
Fabrizio Balliano writes:

> I use BackupPC and I think it's a great product. I was thinking
> about something: is it possible to implement an ftp plugin to
> back up, for example, websites on hosting space that has no ssh?
>
> What do you think about this?

That would be great. A few people have asked about ftp and I would
like to include ftp in a future version of BackupPC.

There are two choices:

- you could create a new module called BackupPC::Xfer::FTP that
uses Net::FTP. I haven't really looked at Net::FTP. One question
with ftp is whether you can support incrementals. I think you can.
For example, you could use Net::FTP->ls to list each directory and
get the file sizes and mtimes. That would allow incrementals to be
supported: an incremental only backs up the files whose sizes or
mtimes have changed. You would also need the ls() function to
recurse into directories, and your code would need to be robust to
the different listing formats returned by different clients. (A
rough sketch follows after the second option below.)

- you could create a new module called BackupPC::Xfer::Wget that
uses wget. Wget can do both http and ftp. Certainly backing up
a web site via ftp is better than http, especially when there
is active content and not just static pages. But the benefit
of supporting http is that you could use it to backup config
pages of network hardware (eg: routers etc). So if a router
fails you have a copy of the config screens and settings.
(And the future tripwire feature on the todo list could tell
you if someone messed with the router settings.)
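
Here is a rough, untested sketch of what that incremental scan could
look like with Net::FTP. To sidestep the listing-format problem it
uses Net::FTP's size() and mdtm() calls rather than parsing dir()
output; a real BackupPC::Xfer::FTP module would need a fallback
(parsing dir() listings) for servers that don't support SIZE/MDTM,
and the host, login and share path below are just placeholders:

    use strict;
    use warnings;
    use Net::FTP;

    # Rough, untested sketch: walk an FTP share and collect the files
    # whose size or mtime changed since the last backup (an incremental).
    sub ftpScan {
        my ($ftp, $dir, $prevAttrib, $needBackup) = @_;
        foreach my $name ($ftp->ls($dir)) {
            next if $name =~ m{(^|/)\.\.?$};        # skip "." and ".."
            # some servers return full paths from NLST, others bare names
            my $path = $name =~ m{^/} ? $name : "$dir/$name";
            if ($ftp->cwd($path)) {                 # directory: recurse
                ftpScan($ftp, $path, $prevAttrib, $needBackup);
            } else {                                # file: compare size/mtime
                my $size  = $ftp->size($path);
                my $mtime = $ftp->mdtm($path);
                my $prev  = $prevAttrib->{$path};
                push(@$needBackup, $path)
                    if !defined($prev) || !defined($size) || !defined($mtime)
                        || $prev->{size} != $size || $prev->{mtime} != $mtime;
            }
        }
    }

    my $ftp = Net::FTP->new("host.example.com") or die "can't connect";
    $ftp->login("username", "password")          or die "login failed";
    $ftp->binary();                 # SIZE is only meaningful in binary mode
    my %prevAttrib;                 # size/mtime per path from the last backup
    my @needBackup;
    ftpScan($ftp, "/htdocs", \%prevAttrib, \@needBackup);
    print "$_\n" foreach @needBackup;
    $ftp->quit();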

Probably the only choice with wget is to fetch all the files
(either via ftp or http) into a temporary directory, then run
tar on that directory and pipe it into BackupPC_tarExtract.
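
For illustration, a rough, untested sketch of that pipeline (the
host, share name, install path and the BackupPC_tarExtract arguments
below are only placeholders from memory; check the 2.1.0 sources for
the real calling convention):

    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # Rough sketch: mirror everything into a scratch directory, then
    # stream it as a tar archive into BackupPC_tarExtract.
    my $tmpDir = tempdir(CLEANUP => 1);

    # fetch the whole share; an http:// URL would work the same way
    system("wget", "--mirror", "--no-host-directories", "--quiet",
           "--directory-prefix=$tmpDir",
           "ftp://username:password\@host.example.com/htdocs/") == 0
        or die "wget failed: $?";

    # placeholder path and arguments (client, share name, compress level)
    system("tar -cf - -C $tmpDir ."
         . " | /usr/local/BackupPC/bin/BackupPC_tarExtract"
         . " host.example.com htdocs 3") == 0
        or die "tar | BackupPC_tarExtract failed: $?";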

The advantage of using wget is that you get both http and ftp.
The disadvantage is that you can't support incrementals with
wget, but you can with Net::FTP. Also, people will find wget
harder to configure and run.

Anyhow, if you want to develop a BackupPC::Xfer::FTP or
BackupPC::Xfer::Wget plugin I would be happy to include it in a
future version. Now that partials are supported in 2.1.0beta,
there is some tricky code for cleaning up after an aborted backup
so that a partial file is not kept and other cached attributes
are saved. Look at how this is done for the other Xfer modules
in 2.1.0beta. Please email me if you have questions.

Perhaps long term we will support both FTP and wget.
That gives a rich collection of transport methods: SMB,
rsync, tar, ftp and http.

Craig