On 11/10/2010 02:21 PM, Steve Jenkins wrote:
> I was actually looking for a way to do this today, too. We want to sync a
> number of different directories to the bucket we've created for backups, and
> keep the file structure intact, but also be able to exclude certain subdirectories.
> For example, I want to be able to send the contents of the /etc dir on
> hostname "myserver" to s3://bucketwherewekeepbackups/servers/myserver/etc/,
> but I want to exclude two dirs: /etc/selinux and /etc/webmin. Then we want
> to sync the /home directory on myserver (and all subdirs) to
> s3://bucketwherewekeepbackups/servers/myserver/home/ and so on.
> Currently, it seems the only way to do that is one s3cmd script per
> directory we want to sync (which is how we're currently doing it).
> It would be great to be able to do something like:
> s3cmd sync --skip-existing --delete-removed --keep-local-paths -f
> inputargs.txt s3://s3bucketname/
> Then have inputargs.txt be something like:
> include: /etc/
> exclude: /etc/selinux
> exclude: /etc/webmin
> include: /home
> exclude: /home/bob
> The -f would tell s3cmd which input file to use, and the --keep-local-paths
> option would tell s3cmd to append the paths in the inputargs.txt file to the
> destination bucket.
> When using something like --keep-local-paths, the user would have to be sure
> to be in the correct dir before launching the command (or add a cd to their
> shell script) and be sure to use appropriate paths in their input file.
> From: dani [mailto:email@example.com]
> Sent: Monday, November 08, 2010 8:20 PM
> To: firstname.lastname@example.org
> Subject: Re: [S3tools-general] Argument list in text file
> Yes, but each line is a new shell process and opens and closes the
> connection to the server.
> I'm hoping for something like the wget parameter -i, which reuses open
> connections.
> On Mon, Nov 8, 2010 at 9:19 PM, Jobe Bittman <email@example.com> wrote:
> I generally take my file list and feed it into xargs. Then I cut the file
> into chunks and copy across a few servers to run in parallel.
> cat files | xargs -I XXX echo ./s3cmd cp --acl-public XXX s3://mybucketXXX
> split -l 10000 outfile1
> sh xaa > xaa.out &
> sh xab > xab.out &
> It depends on your file structure. You could easily use a comma separated
> file and use awk to create outfile1
> On Mon, Nov 8, 2010 at 5:26 PM, dani <firstname.lastname@example.org> wrote:
> I haven't found this in the documentation:
> Is there a way to supply the list of files to upload in a text file? I've been
> doing this with a while loop, but each line starts a new process, so it's
> not very efficient.
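For reference, the while loop being described presumably looks something like the sketch below (file names are examples, and the `echo` makes it a dry run). Each iteration forks a fresh s3cmd process, which opens and closes its own connection — the inefficiency the rest of the thread tries to work around.

```shell
# Example list: one path per line.
printf '%s\n' /etc/hosts /etc/fstab > files

# One s3cmd process per file: simple but slow for large lists.
while read -r path; do
    echo s3cmd put "$path" "s3://mybucket$path"   # echo = dry run
done < files > commands.txt

cat commands.txt
```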