From: Jacek S. <arn...@gm...> - 2007-11-26 21:45:12
itinerants wrote:
> Having added ...
>
>     fire(QueueManagerListener::AutoSearchingFor(), qi->getTarget());
>
> and logged and analysed what's going on with autoSearching, perhaps now is
> the time for an improvement?
>
> Firstly, there is the problem of the 30 items - if nothing is found, and
> the "queue of 30" isn't full, autosearching simply stops. That's just a
> bug - it's happened to me and the only answer is to quit and relaunch the
> app, something I'm loath to do when I've been waiting 24 hours to get onto
> a given "difficult" download.

Yes, that's a bug if it happens...

> Even aside from that, the fact that a new candidate is found *randomly*
> means that, theoretically, there may be items that you *never* search for -
> they simply don't get randomly picked. Whilst I've never seen that happen
> (which might be hard, seeing as how I'd have to, by definition, wait
> forever), I HAVE seen the same file searched for 3 times before a given
> file *did* get searched for.

The selection is random to avoid searching for several files from the same
rar-set or album or whatever; since the file list is downloaded on a match,
those files get caught anyway...

> However, going further, seeing that quite a lot seems to be happening at
> the moment, I'd propose one of the following 2 improvements to the
> algorithm...
>
> 1) The pretty easy one...
> A) Remove the 30 limit entirely, letting the "recent" list get as large as
> it likes.
> B) When "findAutoSearch" fails to return a search candidate, empty the
> "recents" list. Next time around it'll start with a clean sheet. This
> easily fixes both problems listed above. I've implemented this and it
> works just dandy.

That could potentially use a lot of memory for a large queue (100000
files)... but then again, the queue itself will at that point use a lot of
memory, so maybe it's not so bad...

> 2) The harder, but "better" one...
> Build a list of everything that's eligible.
> Order this list, using...
> Priority (the user nominated it, let it mean something here as well)
> Number of existing sources (I've waited hours for it to pick something
> with 0, yes zero, sources, whilst it gaily autosearched for things I
> already had 60 sources for)
> Crunch through the list.
> When the list is empty, go to step 1.

That won't work for large queues (recreating and ordering the list would be
pretty costly...).

3) Use a mark & sweep kind of algorithm and mark each item with a flag
(the QueueItem already carries a flag field)... when all candidates are
flagged, unflag them all and restart... this would be pretty cheap
computationally and would solve both the 3-in-a-row and the max-30 problems,
since every file gets searched for before the cycle restarts... what's
missing is weight-by-priority, but maybe that could be solved as well (for
example, unflag only Highest the first time, then Highest+High, then
Highest+High+Normal, etc.)...

Now all that's missing is a patch =)

/J
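For the curious, option 3 could be sketched roughly like this. This is a
simplified stand-in, not the real DC++ code: QueueItem here is a toy struct,
FLAG_AUTOSEARCHED is a hypothetical flag bit, and the real findAutoSearch
has a different signature - it only illustrates the mark & sweep idea.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Toy stand-in for DC++'s QueueItem; the real one carries a flag field too.
struct QueueItem {
	enum { FLAG_AUTOSEARCHED = 1 << 0 };	// hypothetical flag bit

	std::string target;
	int flags = 0;

	bool isSet(int f) const { return (flags & f) != 0; }
	void setFlag(int f) { flags |= f; }
	void unsetFlag(int f) { flags &= ~f; }
};

// Mark & sweep candidate selection: each item is searched for exactly once
// per sweep (mark), and when every candidate is flagged, all flags are
// cleared and the cycle restarts (sweep). No item can starve, no 3-in-a-row
// repeats, and no bounded "recent" list is needed.
QueueItem* findAutoSearch(std::vector<QueueItem>& queue) {
	if(queue.empty())
		return nullptr;
	for(int pass = 0; pass < 2; ++pass) {
		for(QueueItem& qi : queue) {
			if(!qi.isSet(QueueItem::FLAG_AUTOSEARCHED)) {
				qi.setFlag(QueueItem::FLAG_AUTOSEARCHED);
				return &qi;
			}
		}
		// Everything was flagged: sweep and retry once.
		for(QueueItem& qi : queue)
			qi.unsetFlag(QueueItem::FLAG_AUTOSEARCHED);
	}
	return nullptr;	// unreachable for a non-empty queue
}
```

The priority-weighting variant mentioned above would just make the sweep
selective: first clear only Highest-priority items, next time Highest+High,
and so on, so higher priorities come around more often per full cycle.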