I have over 3 TB of data to sync between a NAS server and an Amazon S3 bucket.
There are many folders and subfolders, which were originally imported via Amazon's Import/Export service.
What I would like to do is write a bash script that will run a sync on one particular folder and then wait for s3cmd to finish before it moves on to the next one.
Has anyone done anything similar, or can anyone offer some tips?
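One thing worth knowing: `s3cmd sync` runs in the foreground and only returns when it finishes, so a plain loop already gives you the "wait before moving on" behavior. Below is a minimal sketch of what that could look like. The mount point `/mnt/nas`, the bucket name `s3://my-bucket`, and the function name `sync_folders` are all placeholders to adjust for your setup, and it assumes s3cmd is installed and configured (`s3cmd --configure`).

```shell
#!/usr/bin/env bash
# Sketch: sync each top-level folder under a NAS mount to S3, one at a time.
# Placeholders: /mnt/nas (NAS mount), s3://my-bucket (bucket name).

sync_folders() {
    local src_root=$1 bucket=$2
    local dir name
    for dir in "$src_root"/*/; do
        [ -d "$dir" ] || continue          # skip if no subdirectories match
        name=$(basename "$dir")
        echo "Syncing $name ..."
        # s3cmd runs in the foreground, so this call blocks until the
        # sync of this folder completes before the loop continues
        s3cmd sync "$dir" "$bucket/$name/" \
            || { echo "sync of $name failed" >&2; return 1; }
    done
}

sync_folders /mnt/nas s3://my-bucket
```

Checking the exit status of each `s3cmd sync` call means the script stops on the first failed folder instead of silently skipping it; with 3 TB of data you may also want to log each folder's start and end time so you can resume from where a run stopped.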