#141 moving large files from s3 to s3 leaves them in their original folder

Type: Malfunction
Status: open
Owner: nobody
Labels: move
Priority: 5
Updated: 2013-11-03
Created: 2013-07-30
Creator: itomas
Private: No

Moving large files from s3 to s3 leaves them in their original folder.

The following command:
s3cmd mv s3://bucket/folder/* s3://bucket/folder2/

fails to move large files: it copies them correctly but never deletes the originals. I observed this with files of about 800 MB.

I found that the problem is in the AWS response after copying, which differs for large files; I don't know the exact size at which it changes.
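For illustration, the mismatch can be reproduced outside s3cmd with the same regular expression that stripNameSpace() uses. The response bodies below are hypothetical stand-ins (written in Python 3); the exact bytes S3 returns may differ:

```python
import re

# The same pattern stripNameSpace() uses to find and strip the xmlns attribute.
r = re.compile(r'^(<?[^>]+?>\s?)(<\w+) xmlns=[\'"](http://[^\'\"]+)[\'"](.*)',
               re.MULTILINE)

# Typical small-copy response: one newline after the XML declaration.
small = ('<?xml version="1.0" encoding="UTF-8"?>\n'
         '<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
         '<ETag>"abc"</ETag></CopyObjectResult>')

# For large copies, the response can contain extra whitespace/newlines
# before the root element.
large = ('<?xml version="1.0" encoding="UTF-8"?>\n\n\n'
         '<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
         '<ETag>"abc"</ETag></CopyObjectResult>')

print(r.match(small) is not None)                    # True
print(r.match(large) is not None)                    # False: \s? consumes only one character
print(r.match(large.replace("\n", "")) is not None)  # True: stripping newlines fixes the match
```

When the match fails, the namespace is never stripped, which presumably makes s3cmd treat the copy result as unparseable and skip the delete step of the move.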

Tracking down the problem, I found a solution; I leave it here in case anyone hits the same issue.

I solved it by changing the stripNameSpace() function in S3/Utils.py:

#!/usr/bin/python
import re

def stripNameSpace(xml):
    xml = xml.replace("\n", "")  # <-- this is the changed line
    r = re.compile('^(<?[^>]+?>\s?)(<\w+) xmlns=[\'"](http://[^\'\"]+)[\'"](.*)', re.MULTILINE)
    xmlns = None
    if r.match(xml):
        xmlns = r.match(xml).groups()[2]
        xml = r.sub("\\1\\2\\4", xml)
        print xml  # debug output
    return xml, xmlns
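For readers on Python 3, here is a self-contained sketch of the patched function (renamed strip_namespace for clarity; the response body is a hypothetical example, not the exact bytes S3 returns), showing that it strips the namespace even from a response containing extra newlines:

```python
import re

def strip_namespace(xml):
    """Python 3 sketch of the patched stripNameSpace() from S3/Utils.py."""
    xml = xml.replace("\n", "")  # the added line: collapse multi-line responses
    r = re.compile(r'^(<?[^>]+?>\s?)(<\w+) xmlns=[\'"](http://[^\'\"]+)[\'"](.*)',
                   re.MULTILINE)
    xmlns = None
    m = r.match(xml)
    if m:
        xmlns = m.groups()[2]
        xml = r.sub(r"\1\2\4", xml)
    return xml, xmlns

# A response with extra newlines before the root element, as can happen
# with large copies (illustrative body).
resp = ('<?xml version="1.0" encoding="UTF-8"?>\n\n'
        '<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
        '<ETag>"abc"</ETag></CopyObjectResult>')

xml, xmlns = strip_namespace(resp)
print(xmlns)           # http://s3.amazonaws.com/doc/2006-03-01/
print('xmlns' in xml)  # False: the namespace attribute was removed
```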

Sorry I can't provide a proper patch, but I don't have time right now.
I hope this helps someone, and that I can help improve s3tools. By the way, great product.

Discussion

  • I am experiencing the same problem: file sizes are over 700 MB; the mv'd file is copied but never deleted from the source bucket.

    Thanks for the patch. I would like to try it, but with an issue of this severity it would be good to get input from the current maintainers, since stripNameSpace() changes are bound to affect other areas of the code.

    Thanks

     
  • Hi,

    The patch suggestion from itomas works on 1.5.0-alpha3, thanks itomas.

    s3cmd mv this that

    now works as expected. Will continue testing.

    Thanks