s3cmd include option doesn't work

  • Prakash

    Prakash - 2014-03-28

    I'm running the following s3cmd:

    ./s3cmd put --dry-run --include '*.xml' --server-side-encryption --recursive ~/import/acnj/ s3://syus/production/import/acnj/

    I'm expecting only the .xml files to match, but I'm seeing all the files being included:

    upload: /home/ec2-user/import/acnj/2010-12/acnj.csv -> s3://syus/production/import/acnj/2010-12/acnj.csv

    upload: /home/ec2-user/import/acnj/2010-12/acnj.xml -> s3://syus/production/import/acnj/2010-12/acnj.xml

    upload: /home/ec2-user/import/acnj/2011-01/acnj.csv -> s3://syus/production/import/acnj/2011-01/acnj.csv

    upload: /home/ec2-user/import/acnj/2011-01/acnj.xml -> s3://syus/production/import/acnj/2011-01/acnj.xml

    Am I missing something? I even tried the --include-from option and added *.xml to a file with the same result.

    Thank you.


  • Matt Domsch

    Matt Domsch - 2014-04-04

    With the current upstream github.com/s3tools/s3cmd master branch, this works if you also add an --exclude '*' argument (quoted, so the shell doesn't expand the glob). Excludes and includes are processed as follows: all files are included by default. If a file matches an --exclude pattern, it is excluded. If it then matches an --include pattern, it is included again.

    Therefore, if you don't exclude anything, --include has no effect.

    You can explicitly exclude directories on sync local2remote by appending a '/' to the pattern: --exclude 'home/foo/bar/to-exclude/' will thus not upload anything in the to-exclude/ directory or its children.
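    The exclude-then-include ordering described above can be sketched in a few lines of Python. This is a simplified illustration of the behavior, not s3cmd's actual code; it uses glob matching via the standard fnmatch module:

    ```python
    import fnmatch

    def should_upload(path, excludes, includes):
        """Mimic s3cmd's filter order (simplified sketch):
        every file is included by default; an --exclude pattern
        drops it; a matching --include pattern brings it back."""
        uploaded = True
        if any(fnmatch.fnmatch(path, pat) for pat in excludes):
            uploaded = False
        if not uploaded and any(fnmatch.fnmatch(path, pat) for pat in includes):
            uploaded = True
        return uploaded

    files = ["2010-12/acnj.csv", "2010-12/acnj.xml"]

    # --include '*.xml' alone: nothing was excluded, so everything uploads
    print([f for f in files if should_upload(f, [], ["*.xml"])])

    # --exclude '*' --include '*.xml': only the .xml file survives
    print([f for f in files if should_upload(f, ["*"], ["*.xml"])])
    ```

    So the original command should behave as intended once written as: ./s3cmd put --dry-run --exclude '*' --include '*.xml' --server-side-encryption --recursive ~/import/acnj/ s3://syus/production/import/acnj/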

