Downloading Sent folder agonizingly slow

Help
rglk
2009-03-16
2013-05-28
  • rglk

    rglk - 2009-03-16

    I'm using GetLive.pl,v 1.48 2009/02/14 with Perl v. 5.10.0 in Arch Linux (kernel 2.6.28).  My .getliverc config file is appended below.

    I wanted to download the messages in the Sent folders of two Hotmail Live accounts of mine.  One account had about 320 messages in that folder (total of around 530 MB), the other 27 messages (total of 33 MB).  Both accounts are old (pre-2000) Hotmail Live accounts that are now called "Windows Live" accounts.  For both accounts the browser GUI settings are: reading pane off, skip the Today page.

    Getting the email messages from the Sent folders was agonizingly slow with GetLive.  It took several minutes per message to "get the email message", even when it was just a one-line message.  The subsequent step of "sending mail to '/usr/bin/procmail'" was very fast, virtually instantaneous.  In one run of GetLive with the account that had fewer messages, though, a set of short messages without attachments was processed much more quickly, taking only a few seconds per message.

    Frequently, GetLive also died with the message "Unable to download email message".  Mostly this happened with messages that had a number of .jpg or .mp3 attachments and that were rather large, e.g. close to the upper size limit of 10 MB.  In these situations, I had to move the offending message to the Inbox folder and restart GetLive to get it past that stopping point.

    How frequently GetLive died while downloading the Sent folders varied widely: at times it died after practically every third or fourth message (when the message was large, with attachments), and at other times 20 or more messages (including some large ones with attachments) would be processed before GetLive exited and had to be restarted.

    Also, some messages that caused it to die were then handled fine after the program was restarted.
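    In the meantime, to avoid babysitting it, something like the following wrapper could automate the restarts.  This is just my own sketch, not part of GetLive: the invocation `perl GetLive.pl` and the retry count of 20 are assumptions to adjust for your setup.

    ```shell
    #!/bin/sh
    # Hypothetical workaround: rerun a command until it exits cleanly,
    # up to a maximum number of attempts, pausing briefly between tries.
    retry_until_success() {
        # $1 = maximum number of attempts; remaining args = command to run
        max=$1; shift
        n=1
        until "$@"; do
            if [ "$n" -ge "$max" ]; then
                echo "giving up after $n attempts" >&2
                return 1
            fi
            n=$((n + 1))
            echo "command failed; restarting (attempt $n of $max)" >&2
            sleep 1
        done
        return 0
    }

    # Assumed usage; adjust the path to GetLive.pl for your installation:
    # retry_until_success 20 perl GetLive.pl
    ```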

    While this extremely slow download of the Sent messages with GetLive was going on, I was able at the same time to log in to Hotmail and browse the Sent and Inbox folders of the respective account and read their messages in the browser interface (Firefox) without any problems or delays.  In other words, the slowness was not caused by network congestion or temporary problems with the Hotmail servers.

    All in all, it took almost a day to download these two Sent folders, and it constantly required manual attention.  The resulting mbox files in /var/mail could be read by Thunderbird and the messages moved into the corresponding Sent folders there with relatively little problem.

    A few days earlier I had downloaded the Inbox folders from these two accounts, and there the processing time per message was quick, except of course for the big messages with many attachments.

    For downloading newly arrived mail from the Inbox folders on the Hotmail server and for sending mail from these two Hotmail accounts I'm presently using Thunderbird, with the proper server settings for POP3 and SMTP.  That works very well.  Microsoft made POP3 and SMTP available to Windows Live Hotmail users for free earlier this year.  But to my knowledge these free features only apply to the Inbox folder; Thunderbird still cannot be used to download the messages in the Sent folder, and one has to use GetLive for that.

    Obviously, there is some serious timing problem with getting the messages from the Sent folder on the Hotmail server when I use GetLive.  What could be the cause and can it be fixed?

    === My .getliverc ===

    Mode=200902
    UserName = xxxxxxxxxx
    Password = xxxxxxxxxx
    Domain = hotmail.com
    Downloaded = /home/user/getlive/fetched
    RetryLimit = 2
    CurlBin = curl -k
    Processor = /usr/bin/procmail   
    Folder = Sent
    MarkRead = No
    Delete = No
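    As an aside, since the CurlBin setting above is passed through as the curl command, I suppose standard curl timeout and retry options could be tried there.  The flags below are curl's own; whether GetLive tolerates them, and whether they help with this slowness, is an assumption on my part:

    ```
    # Hypothetical variant of the CurlBin line; --retry, --connect-timeout
    # and --max-time are standard curl options
    CurlBin = curl -k --retry 3 --connect-timeout 30 --max-time 900
    ```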

     
    • Jos De Laender

      Jos De Laender - 2009-03-18

      I really do not know ....

      Sometimes I also observe a slowness that I cannot explain.

      Maybe, at some point, if the pain is big enough, I might still dive into this issue.

      However, I once started analyzing it, and it looked like an issue of Perl's efficiency in handling large data chunks,
      which would be *very* difficult to correct while staying with Perl.

      Is there, by any chance, high memory and CPU usage at the moment you observe this slowness?

      Jos

       
