It's not working for me for large files :(   When I try with a small file it works, but when I try with a file over 5G, I get this:

WARNING: Waiting 15 sec...
./myfileBig.txt -> s3://<bucket_name>/test/myfileBig.txt  [1 of 1]
        8192 of 21158596877     0% in    1s     6.31 kB/s  failed
ERROR: Upload of './myfileBig.txt' failed too many times. Skipping that file.

I am using the following version of s3cmd:

s3cmd version 1.0.0

Please help.  Thanks.

On Wed, Jul 11, 2012 at 5:04 PM, Dunie, Rob <> wrote:
As of the 1.0 version of s3cmd, you can upload files larger than 5GB.


From: Something Something []
Sent: Wednesday, July 11, 2012 6:58 PM
Subject: [S3tools-general] Transferring files over 5G to AWS


Is it possible to transfer a file bigger than 5G using s3cmd?  Somewhere I read that I can use command line arguments such as these:

x-amz-storage-class:REDUCED_REDUNDANCY s3_multipart_threshold:1073741824 s3_multipart_min_part_size:104857600

Is that true?  Where can I find more info about this?  I tried s3cmd put --help, but that didn't help.
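If those are config-file options, I'd guess they go into ~/.s3cfg as key = value pairs, something like this (I haven't verified these option names against any s3cmd version; they're just copied from what I read):

```
# ~/.s3cfg -- option names copied from what I read, NOT verified;
# check whether your s3cmd version actually supports them
s3_multipart_threshold = 1073741824      # 1 GiB: files above this would use multipart
s3_multipart_min_part_size = 104857600   # 100 MiB per part
```

The x-amz-storage-class:REDUCED_REDUNDANCY entry looks like an HTTP request header rather than a config option, so it might instead be passed with s3cmd's --add-header option, if that's supported in the version in question.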

Please help.  Thanks.

S3tools-general mailing list