I generally take my file list and feed it into xargs to generate the commands. Then I split the resulting command file into chunks and copy them across a few servers to run in parallel.


cat files | xargs -I XXX echo ./s3cmd put --acl-public XXX s3://mybucket/XXX >> outfile1
split -l 10000 outfile1

sh xaa > xaa.out&
sh xab > xab.out&
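If you have more than a couple of chunks, a loop saves typing each `sh x??` line by hand. A minimal sketch, assuming the chunks sit in the current directory and follow split's default `xaa`, `xab`, ... naming (the `/tmp/s3demo` directory and the stand-in `echo` commands are just for illustration):

```shell
# Stand-in setup: create two tiny chunk scripts the way
# `split -l 10000 outfile1` would (real chunks hold s3cmd commands).
mkdir -p /tmp/s3demo && cd /tmp/s3demo
printf 'echo upload-1\n' > xaa
printf 'echo upload-2\n' > xab

# Launch each chunk as a background job, one log file per chunk.
for chunk in x??; do
    sh "$chunk" > "$chunk.out" &
done
wait   # block until every background chunk has finished
```

`wait` at the end makes the script a convenient unit to run over ssh on each server.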

It depends on your file structure. You could easily use a comma-separated file and awk to create outfile1.
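For example, a sketch of the awk approach, assuming a hypothetical two-column CSV layout of `local_path,s3_key` (the `files.csv` name, the columns, and the `mybucket` bucket are all made up; adjust to your actual file):

```shell
mkdir -p /tmp/awkdemo && cd /tmp/awkdemo

# Hypothetical input: column 1 is the local path, column 2 the S3 key.
printf 'photos/a.jpg,backup/a.jpg\nphotos/b.jpg,backup/b.jpg\n' > files.csv

# Emit one s3cmd command per row into outfile1, ready for split.
awk -F, '{ printf "./s3cmd put --acl-public %s s3://mybucket/%s\n", $1, $2 }' \
    files.csv > outfile1
```

outfile1 then feeds straight into the split step above.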

On Mon, Nov 8, 2010 at 5:26 PM, dani <danigot@rogers.com> wrote:
I haven't found this in the documentation:
Is there a way to supply the list of files to upload in a text file? I've been doing this with a while loop, but each line starts a new process, so it's not very efficient.

S3tools-general mailing list