I need to copy several hundred thousand files from one bucket to another. There are some tools, like Bucket Explorer, that let you define the number of "threads" to use, so many files are copied at once instead of one at a time. I tested it out and it is much faster.
Does s3cmd have the capability to do this? If not, any plans for the future?
Hi, at the moment this isn't possible, but it is indeed planned for the future. You can, however, run multiple s3cmd instances against different prefixes in the source bucket, a sort of "manual multithreading".
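To illustrate the "manual multithreading" idea, here is a minimal shell sketch that launches one s3cmd process per key prefix and waits for all of them. The bucket names and the prefix list are hypothetical placeholders; you would substitute prefixes that actually partition your keys reasonably evenly.

```shell
#!/bin/sh
# Hypothetical source and destination buckets -- replace with your own.
SRC="s3://source-bucket"
DST="s3://dest-bucket"

# One backgrounded s3cmd instance per prefix: each copies only the keys
# that start with that prefix, so the copies run in parallel.
for prefix in 0 1 2 3 4 5 6 7 8 9 a b c d e f; do
    s3cmd cp --recursive "${SRC}/${prefix}" "${DST}/${prefix}" &
done

wait  # block until every background copy has finished
```

Running it on a bucket whose keys are spread across the listed prefixes should roughly divide the total copy time by the number of parallel instances, at the cost of more concurrent connections.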