Not actually fixed in 1.5.0-rc1, but at least that version now catches the error and prints a warning with an explanation.

https://github.com/s3tools/s3cmd/commit/11b176bbd2e317fe95041e250ca5a2300f3823c9

+    except MemoryError:
+        msg = """
+MemoryError! You have exceeded the amount of memory available for this process.
+This usually occurs when syncing >750,000 files on a 32-bit python instance.
+The solutions to this are:
+1) sync several smaller subtrees; or
+2) use a 64-bit python on a 64-bit OS with >8GB RAM
+        """
+        sys.stderr.write(msg)
+        sys.exit(1)
+
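For anyone stuck on an earlier version, workaround 1) from that message is easy to script. A minimal sketch, assuming the bucket's keys split cleanly on a handful of top-level prefixes (the PREFIXES list here is hypothetical; substitute your bucket's actual layout):

#!/usr/bin/env python
# Sketch: split one huge remote-to-remote sync into several smaller ones,
# so no single s3cmd process has to hold the whole key list in memory.
import subprocess

SRC = "s3://<bucket name>"
DST = "s3://<other bucket name>"
# Hypothetical top-level prefixes -- replace with your bucket's real layout.
PREFIXES = ["a/", "b/", "c/", "logs/"]

for prefix in PREFIXES:
    # Each invocation builds a file list for only one subtree.
    subprocess.check_call([
        "s3cmd", "sync", "--skip-existing", "--progress", "--recursive",
        SRC + "/" + prefix, DST + "/" + prefix,
    ])

Each subtree still has to fit in memory on its own, so pick prefixes that keep every partition well under the ~750,000-file mark.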



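The traceback below also shows why the allocation fails where it does: the crash is inside a debug() call, and because the message is built with eager %-formatting, the repr of every key in src_list is assembled into one giant string before debug() even runs, whether or not debug output is enabled. s3cmd's debug() comes from the stdlib logging module (at least in the 1.x tree), so a lazy-formatting variant would at least skip that string at default log levels. A minimal sketch of the difference (the first line is the one quoted in the traceback; the second is an assumption about how it could be written, not a claimed fix):

from logging import debug

# Stand-in for the real dict of every key in the source bucket.
src_list = {"key-%d" % i: None for i in range(5)}

# Eager (as in S3/FileLists.py line 283): the repr of the whole key list
# is built *before* debug() is called, even if logging is silent.
debug("src_list.keys: %s" % src_list.keys())

# Lazy: logging.debug() only formats when the record is actually emitted,
# so the giant repr string is skipped at default log levels.
# (src_list.keys() itself is still evaluated -- this is a sketch, not a fix.)
debug("src_list.keys: %s", src_list.keys())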
On Wed, Jul 23, 2014 at 2:19 PM, Dan Stynchula <dan@inside.com> wrote:
Triggered by the following command:
s3cmd sync --skip-existing --progress --recursive s3://<bucket name> s3://<other bucket name>


Problem: MemoryError:
S3cmd:   1.1.0-beta3

Traceback (most recent call last):
  File "/usr/bin/s3cmd", line 1800, in <module>
    main()
  File "/usr/bin/s3cmd", line 1741, in main
    cmd_func(args)
  File "/usr/bin/s3cmd", line 969, in cmd_sync
    return cmd_sync_remote2remote(args)
  File "/usr/bin/s3cmd", line 600, in cmd_sync_remote2remote
    src_list, dst_list, existing_list = compare_filelists(src_list, dst_list, src_remote = True, dst_remote = True)
  File "/usr/share/s3cmd/S3/FileLists.py", line 283, in compare_filelists
    debug("src_list.keys: %s" % src_list.keys())
MemoryError

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    An unexpected error has occurred.
    Please report the above lines to:
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

