Moving large files from S3 to S3 leaves them in their original folder.
The following command:

s3cmd mv s3://bucket/folder/* s3://bucket/folder2/

fails to move large files: it copies them correctly, but it never deletes the originals. I first noticed this with files of about 800 MB.
I found out that the problem lies in the AWS response after the copy, which differs for large files; I don't know the exact size at which it changes.
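To illustrate the failure mode, here is a small sketch using the regex from s3cmd's stripNameSpace (S3/Utils.py). The response bodies are made up: I'm assuming the large-copy response contains a blank line before the root element (S3 can stream whitespace during a long copy to keep the connection alive), which is enough to make the anchored match fail.

```python
import re

# The pattern used by s3cmd's stripNameSpace (S3/Utils.py).
r = re.compile(r'^(<?[^>]+?>\s?)(<\w+) xmlns=[\'"](http://[^\'"]+)[\'"](.*)',
               re.MULTILINE)

# A normal small-file style response: everything on one line.
ok = ('<?xml version="1.0" encoding="UTF-8"?>'
      '<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
      '<ETag>"abc"</ETag></CopyObjectResult>')

# Hypothetical large-copy response: whitespace streamed during the copy
# leaves blank lines in the body.
big = ('<?xml version="1.0" encoding="UTF-8"?>\n\n'
       '<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
      '<ETag>"abc"</ETag></CopyObjectResult>')

print(bool(r.match(ok)))                     # True: namespace is detected
print(bool(r.match(big)))                    # False: the xmlns never gets stripped
print(bool(r.match(big.replace("\n", ""))))  # True again once newlines are removed
```

When the match fails, the namespace is never stripped, s3cmd can't parse the copy result, and the delete step of the move never runs.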
Tracking down the problem, I found a solution; I'll leave it here in case anyone runs into the same issue.
I solved it by changing the stripNameSpace function in S3/Utils.py:
#!/usr/bin/python
def stripNameSpace(xml):
    xml = xml.replace("\n", "")  # <-- this is the changed line
    r = re.compile('^(<?[^>]+?>\s?)(<\w+) xmlns=[\'"](http://[^\'\"]+)[\'"](.*)', re.MULTILINE)
    xmlns = None
    if r.match(xml):
        xmlns = r.match(xml).groups()
        xml = r.sub("\\1\\2\\4", xml)
    print xml  # debug output, safe to remove
    return xml, xmlns
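For anyone who wants to try the patched logic outside s3cmd, here is a self-contained Python 3 sketch of it (renamed to avoid clashing with the s3cmd original; the sample response is made up):

```python
import re

def strip_name_space(xml):
    """Python 3 sketch of the patched stripNameSpace from S3/Utils.py."""
    xml = xml.replace("\n", "")  # the added line: collapse embedded newlines
    r = re.compile(r'^(<?[^>]+?>\s?)(<\w+) xmlns=[\'"](http://[^\'"]+)[\'"](.*)',
                   re.MULTILINE)
    xmlns = None
    m = r.match(xml)
    if m:
        xmlns = m.groups()
        xml = r.sub(r"\1\2\4", xml)
    return xml, xmlns

# Made-up CopyObject response with a blank line, as a large copy might return.
resp = ('<?xml version="1.0" encoding="UTF-8"?>\n\n'
        '<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
        '<ETag>"abc"</ETag></CopyObjectResult>')

stripped, ns = strip_name_space(resp)
print(stripped)  # the xmlns attribute is gone from the root element
```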
Sorry I can't provide a proper patch, but I don't have time right now.
Hope this helps someone. I'd be glad to help improve s3tools; by the way, great product.