From: Igor S. <oz...@gr...> - 2003-03-10 18:25:23
Brian,

At this point every client gets the same URL load. We plan to improve on
this design either by giving larger data sets to the "more capable"
clients, or by having the client get a new data set while still crawling
the old one, thus minimizing idle crawl time.

Cheers,
Ozra.

On Wed, 19 Feb 2003, Brian Heckathorne wrote:
>
> My request!
> An option for the number of URLs to grab would be nice. My machine eats
> up 500 URLs quickly.
>
> Brian
> GJCN
>
> _______________________________________________
> Grub-client mailing list
> Gru...@li...
> https://lists.sourceforge.net/lists/listinfo/grub-client
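The second improvement Ozra mentions — fetching the next URL batch while the current one is still being crawled — amounts to double-buffering the work queue. A minimal sketch of that idea, assuming hypothetical `fetch_batch` and `crawl` stand-ins rather than the actual Grub client protocol:

```python
import threading
import queue

def fetch_batch(batch_id):
    # Hypothetical stand-in for requesting a URL batch from the server;
    # the real Grub client protocol differs.
    return [f"http://example.com/{batch_id}/{i}" for i in range(5)]

def crawl(url):
    # Stand-in for crawling a single URL.
    pass

def client_loop(num_batches):
    # A one-slot queue holds the prefetched batch: the prefetcher thread
    # fetches batch N+1 while the main thread is still crawling batch N,
    # so the crawler rarely sits idle waiting on the network.
    batches = queue.Queue(maxsize=1)

    def prefetcher():
        for b in range(num_batches):
            batches.put(fetch_batch(b))  # blocks until the slot is free

    t = threading.Thread(target=prefetcher, daemon=True)
    t.start()

    crawled = 0
    for _ in range(num_batches):
        for url in batches.get():  # next batch is usually already waiting
            crawl(url)
            crawled += 1
    t.join()
    return crawled
```

With a blocking one-slot queue, fetch latency overlaps crawl time instead of adding to it, which is exactly the idle time the reply says the design aims to minimize.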