I'm running the following s3cmd command:
./s3cmd put --dry-run --include '*.xml' --server-side-encryption --recursive ~/import/acnj/ s3://syus/production/import/acnj/
I'm expecting only the .xml files to match, but I'm seeing all of the files being included:
upload: /home/ec2-user/import/acnj/2010-12/acnj.csv -> s3://syus/production/import/acnj/2010-12/acnj.csv
upload: /home/ec2-user/import/acnj/2010-12/acnj.xml -> s3://syus/production/import/acnj/2010-12/acnj.xml
upload: /home/ec2-user/import/acnj/2011-01/acnj.csv -> s3://syus/production/import/acnj/2011-01/acnj.csv
upload: /home/ec2-user/import/acnj/2011-01/acnj.xml -> s3://syus/production/import/acnj/2011-01/acnj.xml
Am I missing something? I even tried the --include-from option, adding *.xml to the file, with the same result.
With the current upstream master branch of github.com/s3tools/s3cmd, this works if you also add an --exclude '*' argument (quote the pattern so your shell doesn't expand it). Excludes and includes are processed as follows: all files are included by default; if a file matches an --exclude pattern, it is excluded; if an --include pattern then matches it, it is included again.
Therefore, if you don't exclude anything, the --include will do nothing.
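To make the ordering concrete, here is a minimal Python sketch of that evaluation rule (my own illustration, not s3cmd's actual code), using fnmatch-style globbing:

```python
from fnmatch import fnmatch

def is_selected(path, excludes, includes):
    """Sketch of s3cmd filter semantics as described above:
    everything is included by default; an --exclude match drops a
    file; an --include match brings it back."""
    if not any(fnmatch(path, p) for p in excludes):
        return True  # nothing excluded it, so it stays included
    # An exclude matched; an --include pattern can override it.
    return any(fnmatch(path, p) for p in includes)

# --include '*.xml' alone: nothing is excluded, so the include is a no-op
print(is_selected("2010-12/acnj.csv", [], ["*.xml"]))        # True
# --exclude '*' --include '*.xml': exclude everything, re-include XML
print(is_selected("2010-12/acnj.csv", ["*"], ["*.xml"]))     # False
print(is_selected("2010-12/acnj.xml", ["*"], ["*.xml"]))     # True
```

This is why the original command uploaded everything: with no --exclude, the --include had nothing to override.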
You can explicitly exclude directories on a local-to-remote sync by appending a '/' to the pattern: --exclude home/foo/bar/to-exclude/ will then upload nothing from the to-exclude/ directory or its children.