#101 S3cmd fails to upload 5G+ files

Type: Malfunction
Status: closed-fixed
Owner: nobody
Category: s3cmd (119)
Priority: 5
Updated: 2015-02-09
Created: 2011-09-16
Creator: Anonymous
Private: No

OS: CentOS 5.5
kernel: 2.6.18-194.32.1.el5
s3cmd version 1.0.0

S3cmd fails to upload files larger than 5 GB.
It returns the message "WARNING: Upload failed: /file.tar.gz ((32, 'Broken pipe'))" and then retries many times at a lower speed.

Could this be related to the previous 5 GB per-object limit? The current S3 limit per object is 5 TB.

Thanks,
Maxim

Discussion

  • Gabriel PREDA - 2012-12-30

    Amazon says:
    "The total volume of data and number of objects you can store are unlimited. Individual Amazon S3 objects can range in size from 1 byte to 5 TB. The largest object that can be uploaded in a single PUT is 5 GB. For objects larger than 100 MB, customers should consider using the Multipart Upload capability."

    So the limit per object is indeed 5 TB, but a single PUT operation can upload at most 5 GB at a time.

    - Can we have s3cmd at least refuse, for the time being, to upload files bigger than 5 GB?
    - Can we have it use the Multipart Upload capability for big files? (A sketch of that flow follows this list.)
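
    For reference, the Multipart Upload flow described above looks roughly like
    the following. This is a minimal sketch using boto3 rather than s3cmd's own
    HTTP layer, and the file path, bucket, and key names are hypothetical:

        import boto3

        # 100 MB parts: S3 requires every part except the last to be at
        # least 5 MB, and allows at most 10,000 parts per upload.
        PART_SIZE = 100 * 1024 * 1024

        def multipart_upload(path, bucket, key):
            s3 = boto3.client("s3")
            upload_id = s3.create_multipart_upload(
                Bucket=bucket, Key=key)["UploadId"]
            parts = []
            try:
                with open(path, "rb") as f:
                    part_number = 1
                    while True:
                        chunk = f.read(PART_SIZE)
                        if not chunk:
                            break
                        # Each part is its own PUT, so no single request
                        # carries more than PART_SIZE bytes.
                        resp = s3.upload_part(Bucket=bucket, Key=key,
                                              PartNumber=part_number,
                                              UploadId=upload_id, Body=chunk)
                        parts.append({"ETag": resp["ETag"],
                                      "PartNumber": part_number})
                        part_number += 1
                s3.complete_multipart_upload(Bucket=bucket, Key=key,
                                             UploadId=upload_id,
                                             MultipartUpload={"Parts": parts})
            except Exception:
                # Abort so S3 does not keep charging for orphaned parts.
                s3.abort_multipart_upload(Bucket=bucket, Key=key,
                                          UploadId=upload_id)
                raise

        # Hypothetical bucket name; the path matches the original report.
        multipart_upload("/file.tar.gz", "example-bucket", "file.tar.gz")

    (For what it's worth, s3cmd itself gained multipart upload support in later
    releases, with the part size configurable via --multipart-chunk-size-mb,
    which is presumably why this ticket was eventually closed as fixed.)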

     
  • Matt Domsch - 2015-02-09
    • status: open --> closed-fixed
     
