I've run into this with later versions too. 32-bit Python _will_ run out of memory like this if you have >800k files in your trees. You can either split your sync runs into smaller chunks (e.g. sync fewer directories, one after the other — see the sketch below), or use 64-bit Python. 64-bit Python consumes roughly 2x the memory of 32-bit (those per-file objects are mostly pointers, and pointer size dominates the runtime footprint), but if you have a 64-bit system and can spare enough memory or swap, that works.
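For the "smaller chunks" route, here is a minimal sketch of what I mean: one `s3cmd sync` invocation per top-level subdirectory, so each run only has to keep that subtree's file list in memory. The local path and bucket prefix are hypothetical; it assumes the standard `s3cmd sync LOCAL/ s3://bucket/prefix/` form.

```python
# Sketch: split one huge sync into per-directory syncs to cap memory use.
# Paths and bucket name are placeholders -- adjust to your own layout.
import subprocess
from pathlib import Path

LOCAL_ROOT = Path("/data/photos")           # hypothetical local tree
BUCKET_PREFIX = "s3://my-bucket/photos"     # hypothetical destination

for subdir in sorted(p for p in LOCAL_ROOT.iterdir() if p.is_dir()):
    dest = f"{BUCKET_PREFIX}/{subdir.name}/"
    # Trailing slash on the source syncs the *contents* of subdir
    # into the matching prefix (rsync-style semantics).
    subprocess.run(["s3cmd", "sync", f"{subdir}/", dest], check=True)
```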
The s3tools project won't be fixing Python's memory overhead on 64-bit, and we'd be hard-pressed to use less memory per file — if anything I've recently made it worse by storing _more_ metadata about each file, which speeds up runs by eliminating more unnecessary file transfers.