Hi all!

It's been a looong time since the last major s3cmd release. It's been quiet, but the project is not dead.

A lot has happened since s3cmd 1.0.0 - most importantly, development has moved to Git and is now primarily hosted on GitHub (https://github.com/s3tools/s3cmd). This move has sparked an unexpected level of activity among the users(-now-developers) of s3cmd, who began to contribute new features at a much higher rate than I was used to while the code was hosted in SourceForge's SVN. s3cmd 1.1.0 will be a true community effort - thanks folks!

Here are the major changes...

Multipart Upload
It's by far the most requested feature and it's finally here. In this beta release it works only for the [put] command, not for [sync], because there are some technical difficulties still to be sorted out.
Multipart upload is enabled by default and kicks in for files bigger than 15MB. You can set this threshold as low as 5MB (Amazon's limit) with --multipart-chunk-size-mb=5, or to any other value between 5 and 5120 MB.
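For example, something like this should upload a big file in 5 MB parts (the bucket and file names here are just placeholders):
    s3cmd put --multipart-chunk-size-mb=5 bigfile.tar.gz s3://my-bucket/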

Two "management" commands are still missing - [mplist] for listing all unfinished multipart uploads and [mpabort] for aborting those left-behind ones. S3cmd tries to cleanly abort the upload and free up all the resources on the S3 side if it's interrupted for any reason but there still may be occasions where an unfinished upload is left behind.

A number of contributors submitted patches adding this functionality over the past few months - thank you all! Unfortunately none of the submissions was complete and clean enough to make it through as-is. In the end I used Jerome Leclanche's (Adys) work as a starting point and built on that. Thanks Jerome!

CloudFront invalidation
Another often requested feature. s3cmd can now invalidate objects in CloudFront distributions immediately during [sync] with the --cf-invalidate switch. With this switch all uploaded files will be invalidated in CloudFront, which keeps the content delivery network consistent with the primary S3 storage. It takes some time to process an invalidation - use the [s3cmd cfinvalinfo cf://dist-id] command to query its status. See the CloudFront documentation for details.
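For example, something along these lines (the local path, bucket name and distribution ID below are made up for illustration):
    s3cmd sync --cf-invalidate ./website/ s3://my-bucket/
    s3cmd cfinvalinfo cf://E2EXAMPLE1234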

Static WebSite support
Jens Braeuer contributed a patch for creating and deleting "static websites" hosted in S3/CloudFront. See the S3 documentation for details.
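The new commands look roughly like this - check s3cmd --help for the exact names and options, and the bucket name is just a placeholder:
    s3cmd ws-create s3://my-bucket
    s3cmd ws-info s3://my-bucket
    s3cmd ws-delete s3://my-bucket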

Config settings from Environment
Many users are not comfortable with keeping their access and secret keys in the config file. Ori Bar provided a patch that allows any config value to be taken from the process environment. For example, to obtain the access key from the environment, put this in your ~/.s3cfg file:
    access_key=$S3_ACCESS_KEY
The S3_ACCESS_KEY name is just an example - use any name you like, just prefix it with a "$" sign. For instance
    access_key=$ABRACADABRA
is just as valid as the $S3_ACCESS_KEY example above.
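For example, assuming the access_key=$S3_ACCESS_KEY line above is in ~/.s3cfg, set the variable in your shell before running s3cmd (the key value and bucket name below are made up):
    export S3_ACCESS_KEY=AKIAIOSFODNN7EXAMPLE
    s3cmd ls s3://my-bucket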

MIME options made saner
The previous semantics of --default-mime-type and --guess-mime-type were a bit ... illogical. They actually caught me by surprise a couple of times, despite the fact that I was the author of that ill logic ;)
I believe it's a lot cleaner now: --guess-mime-type tries to guess the MIME type from the file extension or, if available, by using the "magic" file (python-magic support contributed by Karsten Sperling). If guessing fails the type is set to binary/octet-stream, or to whatever is set in --default-mime-type. An explicit MIME type, including parameters, can be set with --mime-type="text/plain; charset=utf-8".
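A few example invocations to illustrate (file and bucket names are placeholders):
    # guess from the extension / magic file, fall back to binary/octet-stream
    s3cmd put --guess-mime-type index.html s3://my-bucket/
    # guess, but fall back to text/plain instead of binary/octet-stream
    s3cmd put --guess-mime-type --default-mime-type="text/plain" CHANGELOG s3://my-bucket/
    # set the type explicitly, including parameters
    s3cmd put --mime-type="text/plain; charset=utf-8" readme.txt s3://my-bucket/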

Download multiple files to stdout
Rob Wills contributed support for downloading multiple files to standard output. I'm not quite sure what it's good for, to be honest ;) Anyway, the patch was straightforward, so I accepted it.
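If I read the patch correctly, the invocation looks something like this, with "-" as the destination (the object names are made up):
    s3cmd get s3://my-bucket/part1.log s3://my-bucket/part2.log -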

s3cmd --configure can test access to a specified bucket
Listing buckets requires the S3 ListAllMyBuckets permission, which is typically not available to delegated IAM accounts. With this change, s3cmd --configure accepts an optional bucket URI as a parameter and, if provided, the access check will verify access just to that bucket individually. Contributed by Mike Repass.
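For example (the bucket name is just a placeholder):
    s3cmd --configure s3://my-bucket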

Minor changes
Obviously this release also comes with a number of smaller fixes and improvements - tweaked config values here and there, stability improvements, documentation fixes, etc. Too many to list here individually.

Contribute to s3cmd project on GitHub
All hands are welcome to contribute to the s3cmd project. Whether you can code new features in Python or add/improve documentation, you're welcome to "fork" the https://github.com/s3tools/s3cmd project, make your changes in your private s3cmd workspace and, once ready, submit them for inclusion in the official repository by sending a "Pull Request" from the GitHub web interface. I'm not going into more detail about Git, the GitHub workflow, forks, pull requests, etc - that's well out of the scope of this announcement. Google it, try it, like it and contribute :)

Download s3cmd 1.1.0-beta2
Download the s3cmd 1.1.0-beta2 package from SourceForge:
http://sourceforge.net/projects/s3tools/files/s3cmd/1.1.0-beta2/s3cmd-1.1.0-beta2.tar.gz/download

Alternatively grab the latest source from the official git repository:
git clone git://github.com/s3tools/s3cmd.git

Donate to s3cmd project
The s3cmd script is free software - no licence fees are required to use it. However, if you find s3cmd useful and worth a buck, please consider making a donation. Any amount, small or big, will be greatly appreciated!
Follow this link for a PayPal or Credit Card donation: http://s3tools.org/donate

Etc…
As with any new release I’m keen to hear your feedback, positive or not. Please direct all emails to s3tools-general@lists.sourceforge.net as usual.

Enjoy s3cmd 1.1.0-beta2 :)

Michal