

Error after upgrade to 1.0.0

Created by Anonymous on 2011-01-15; last updated 2013-02-19

  • Anonymous
    2011-01-15

    My CentOS system updated s3cmd to 1.0.0 on the morning of 1/10/2011.

    My backup method is to upload the cPanel backup files to a "daily" directory in S3.
    Then I do an S3-to-S3 copy of the daily directory to the day-appropriate weekly directory (such as weekly/tuesday), which is retained for the week.
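
    For reference, that workflow amounts to something like the following (the bucket name and local paths here are placeholders, not the poster's actual ones, and exact flags may differ between s3cmd versions):

        s3cmd put /backup/daily/*.tar.gz s3://BUCKET/daily/
        s3cmd cp s3://BUCKET/daily/ s3://BUCKET/weekly/tuesday/

    The second command is the server-side (S3-to-S3) copy that fails below.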

    Every day since the upgrade, I've gotten this error when trying to make the copy. It sometimes starts with _var_cpanel.tar.gz failing (which succeeds after a try or two), but it always fails completely on emdcli.tar.gz:

    INFO: Retrieving list of remote files for s3://xxxxxxxx/daily/ ...
    INFO: Applying --exclude/--include
    INFO: Summary: 53 remote files to copy
    WARNING: Retrying failed request: /dailys/Wednesday/dirs/_var_cpanel.tar.gz (The read operation timed out)
    WARNING: Waiting 3 sec...
    WARNING: Retrying failed request: /dailys/Monday/emdcli.tar.gz (The read operation timed out)
    WARNING: Waiting 3 sec...
    WARNING: Retrying failed request: /dailys/Monday/emdcli.tar.gz (The read operation timed out)
    WARNING: Waiting 6 sec...
    WARNING: Retrying failed request: /dailys/Monday/emdcli.tar.gz (The read operation timed out)
    WARNING: Waiting 9 sec...
    WARNING: Retrying failed request: /dailys/Monday/emdcli.tar.gz (The read operation timed out)
    WARNING: Waiting 12 sec...
    WARNING: Retrying failed request: /dailys/Monday/emdcli.tar.gz (The read operation timed out)
    WARNING: Waiting 15 sec...
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
       An unexpected error has occurred.
     Please report the following lines to:
      s3tools-bugs@lists.sourceforge.net
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    Problem: S3RequestErr: Request failed for: /dailys/Monday/emdcli.tar.gz
    S3cmd:   1.0.0
    Traceback (most recent call last):
     File "/usr/bin/s3cmd", line 2006, in ?
       main()
     File "/usr/bin/s3cmd", line 1950, in main
       cmd_func(args)
     File "/usr/bin/s3cmd", line 614, in cmd_cp
       subcmd_cp_mv(args, s3.object_copy, "copy", "File %(src)s copied to %(dst)s")
     File "/usr/bin/s3cmd", line 607, in subcmd_cp_mv
       response = process_fce(src_uri, dst_uri, extra_headers)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 311, in object_copy
       response = self.send_request(request)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 487, in send_request
       return self.send_request(request, body, retries - 1)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 487, in send_request
       return self.send_request(request, body, retries - 1)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 487, in send_request
       return self.send_request(request, body, retries - 1)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 487, in send_request
       return self.send_request(request, body, retries - 1)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 487, in send_request
       return self.send_request(request, body, retries - 1)
     File "/usr/lib/python2.4/site-packages/S3/S3.py", line 489, in send_request
       raise S3RequestError("Request failed for: %s" % resource['uri'])
    S3RequestError: Request failed for: /dailys/Monday/emdcli.tar.gz
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
       An unexpected error has occurred.
       Please report the above lines to:
      s3tools-bugs@lists.sourceforge.net
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    

    There are no errors during the upload, and no errors when deleting the previous weekly copies for that day before copying over the new ones.

     
  • Michal Ludvig
    2011-01-17

    That's interesting. Are you on a slow link, e.g. ADSL?

    Try putting "socket_timeout=60" into your ~/.s3cfg and let me know if it helps.
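
    (For anyone unsure where that goes: ~/.s3cfg is an INI-style file, and the setting would sit in the [default] section alongside your existing keys, e.g.:

        [default]
        socket_timeout = 60

    Leave the other keys — access_key and so on — as they are.)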

     

  • Anonymous
    2011-01-19

    It's very odd since all the other files up to that point transfer fine.

    This is the typical transfer speed when uploading.
    (209098623 bytes in 164.6 seconds, 1240.79 kB/s)
    I don't think it's a speed problem.

    Are previous versions of S3cmd available?

     
  • Mike
    2011-01-29

    No help on this? s3cmd has not worked since the update. It has to do with the copy command: sync works fine, but copy fails.

    I did some more tests today, and it's not file related. It always fails on the same file, so I deleted that file, and sure enough, it failed on the next file in the list. I tried using the S3 browser and was able to copy and delete the files the way s3cmd does. I tried copying a different directory (with the same list of files), and it again fails every time at the same file count, regardless of the file.

    I like your tool but it's unusable right now.

     
  • Mike
    2011-01-29

    When I said S3 browser, I meant the web-based file browser from Amazon.

    Additionally, I downloaded the previous version, renamed it s3cmd9, and modified my script to use it instead, but that didn't work at all. I suppose installation is a little more involved than just the s3cmd file.

     
  • Michal Ludvig
    2011-02-03

    Has anyone tried putting "socket_timeout=60" into the ~/.s3cfg file? Or an even larger value, 300 for example. Did it make any difference?

     
  • ludvigm, I had this same problem and changing the timeout worked for me, thanks bud!