Further limitation of memory usage needed
Status: Beta
Brought to you by:
donut
When fetching the headers of large newsgroups, memory usage exceeds 1 GB. That is a huge amount of memory for the older machines that could otherwise serve well as offline newsgroup readers. Why is the complete message cache kept in memory? Couldn't the posts be processed already during the header download, writing each post to file as soon as it is complete (i.e. as soon as all of its parts are available)?
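To illustrate the suggestion, here is a minimal sketch of incremental assembly: parts are tracked per post, and a post is written out and dropped from memory as soon as all of its parts have arrived. The function and argument names are hypothetical, not the application's actual API.

```python
from collections import defaultdict

def assemble_incrementally(parts, write_post):
    """parts: iterable of (post_id, part_no, total_parts, payload) tuples,
    in whatever order they arrive from the server.
    write_post(post_id, data): called once a post is complete.
    Only incomplete posts stay in memory; complete ones are flushed."""
    pending = defaultdict(dict)  # post_id -> {part_no: payload}
    for post_id, part_no, total, payload in parts:
        pending[post_id][part_no] = payload
        if len(pending[post_id]) == total:
            # All parts present: join them in order, write, and free memory.
            data = b"".join(pending[post_id][i] for i in range(1, total + 1))
            write_post(post_id, data)
            del pending[post_id]
    return pending  # whatever is still incomplete when the stream ends
```

With this approach, peak memory is bounded by the incomplete posts in flight rather than by the size of the whole group.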