I've been playing with nget for a bit now and it rocks! I am envisioning building a graphical front-end to it to better scan through articles.
The only problem is that large groups (>1M headers) take very large amounts of memory (duh), so much so that saving the cache and expiring articles takes a _very_ long time. How realistic would it be to move nget's storage to something like SQLite? Matthew? This is something I could help with...
Hi there. Sorry for the delay in replying, I kind of missed this.
I have been a bit wary of using SQL since I don't want to require an external daemon, but something like SQLite looks pretty good.
If you were simply to use it to replace the cache storage, it should be pretty straightforward. That is (almost) all contained in cache.cc/cache.h. (Though coming up with a good/fast schema may be tricky due to the hierarchical nature of nget's data.)
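To make the hierarchy concern concrete, here is one possible shape such a schema could take. This is only a sketch under stated assumptions: the table and column names are hypothetical and not taken from cache.cc, and it uses Python's stdlib sqlite3 purely for illustration (nget itself is C++). The idea is that the server → group → article nesting becomes a composite-key table rather than nested in-memory maps:

```python
import sqlite3

# Hypothetical schema mirroring nget's server/group/article hierarchy.
# All names here are illustrative, not from nget's actual cache code.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE server (
    id      INTEGER PRIMARY KEY,
    address TEXT UNIQUE NOT NULL
);
CREATE TABLE grp (               -- "grp" since GROUP is an SQL keyword
    id   INTEGER PRIMARY KEY,
    name TEXT UNIQUE NOT NULL
);
CREATE TABLE article (
    server_id INTEGER NOT NULL REFERENCES server(id),
    grp_id    INTEGER NOT NULL REFERENCES grp(id),
    number    INTEGER NOT NULL,  -- article number on that server
    messageid TEXT NOT NULL,
    subject   TEXT,
    author    TEXT,
    date      INTEGER,
    bytes     INTEGER,
    PRIMARY KEY (server_id, grp_id, number)
);
-- Indexing message-id helps match the same article across servers.
CREATE INDEX article_mid ON article(messageid);
""")

# One sample row just to show the layout.
conn.execute("INSERT INTO server (address) VALUES ('news.example.com')")
conn.execute("INSERT INTO grp (name) VALUES ('alt.binaries.test')")
conn.execute(
    "INSERT INTO article VALUES (1, 1, 1000, '<a@b>', 'test post', 'joe', 0, 512)")
row_count = conn.execute("SELECT COUNT(*) FROM article").fetchone()[0]
```

Whether a flat composite-key table like this is fast enough for >1M headers would need measuring, but it avoids the worst of the nesting problem.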
If you wanted to go a bit further and actually take advantage of SQL queries and such, that would be a more extensive change, and somewhat problematic given SQL's lack of regexp support, but there could be other ways to use it (e.g., looking up only the data needed for a test, doing sorting, etc.).
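As a hypothetical illustration of that last point (again using Python's sqlite3 and made-up table contents, not nget code): SQLite's GLOB operator plus ORDER BY can cover many filtering/sorting cases without full regexps, so only matching rows ever reach the application:

```python
import sqlite3

# Hypothetical single-table cache, only to show SQL-side filter + sort.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE article (number INTEGER, subject TEXT, bytes INTEGER)")
conn.executemany("INSERT INTO article VALUES (?, ?, ?)", [
    (1, "stuff.part01.rar", 50000),
    (2, "stuff.part02.rar", 50000),
    (3, "unrelated post",   1200),
])
# GLOB gives shell-style wildcards; ORDER BY pushes the sort into SQLite.
matches = conn.execute(
    "SELECT number, subject FROM article "
    "WHERE subject GLOB '*.rar' ORDER BY number DESC").fetchall()
```

For anything GLOB/LIKE can't express, the query could still pre-filter coarsely and leave the real regexp test to the existing C++ code on the (much smaller) result set.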
There are also other possibilities:
Probably the simplest solution would be to split the storage of data for each server/group into multiple cache files, each covering only a certain range of article numbers. Then on update each file could be processed separately, limiting the memory needed, and on searching the current meta-grouping code could be used to join them together on the fly.
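The range-splitting idea above can be sketched roughly as follows. This is a toy model, not nget code: the range size, the dict standing in for per-range cache files, and the function names are all invented for illustration, and the on-the-fly join is shown with a plain sorted merge in place of the real meta-grouping code:

```python
import heapq

RANGE_SIZE = 100_000  # hypothetical article numbers per cache file

def bucket_for(article_number):
    """Which cache file a given article number belongs in."""
    return article_number // RANGE_SIZE

# Simulated per-range cache "files": bucket index -> sorted article numbers.
caches = {}

def update(article_numbers):
    """Process an update one bucket at a time, so only one range's
    worth of headers needs to be in memory at once."""
    pending = {}
    for n in article_numbers:
        pending.setdefault(bucket_for(n), []).append(n)
    for b, nums in pending.items():
        # The real thing would load, merge, and save one cache file here.
        caches[b] = sorted(set(caches.get(b, [])) | set(nums))

def search_all():
    """Join the per-range caches back together on the fly, analogous
    to how the meta-grouping code already joins multiple groups."""
    parts = [caches[b] for b in sorted(caches)]
    return list(heapq.merge(*parts))

update([5, 150_001, 42, 150_000])
update([7])
combined = search_all()
```

Since each bucket's contents are kept sorted, the merge step is streaming and never needs the whole group in memory at once.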
In any case, help would certainly be appreciated as I haven't had much time to devote to this lately.