Re: URL cycling with staggered URLs
From: Robert I. <cor...@gm...> - 2007-07-12 15:13:58
Hi John,

On 6/24/07, Robert Iakobashvili &lt;cor...@gm...&gt; wrote:
> > QUESTION / SUGGESTION / PATCH:
> >
> > What I want:
> > If I have N URLs and many clients, I would like curl-loader to
> > if (process % N) == 0 then start on URL 0
> > if (process % N) == 1 then start on URL 1
> > if (process % N) == 2 then start on URL 2
> > (and then, if I have more processes than URLs, wrap back to URL 0
> > when I reach URL N-1).
> >
> > Why do I want this?
> > This means that if I have a large set of URLs (too big for the server
> > file cache), I can force the server to work hard at loading files from
> > disk and get a more realistic load for my server.
>
> We have the two features in our RoadMap/TODO list:
> 2. An option to download a URL not only once per cycle, but according to
> its "weight" (probability).
> 11. Usage of random time intervals, e.g. 100-200 (from 100 to 200 msec).

Both options are in the latest version, 0.40. See the tags FETCH_PROBABILITY
and FETCH_PROBABILITY_ONCE for the first feature; random timers taken from an
interval are the second feature.

Best wishes.

--
Sincerely,
Robert Iakobashvili,
coroberti %x40 gmail %x2e com
...........................................................
http://curl-loader.sourceforge.net
A web testing and traffic generation tool.
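For reference, the staggered start John asks for is just a modulo mapping from
client index to starting URL. A minimal sketch in C under that reading (the
names pick_start_url, client_index and num_urls are illustrative only, not
curl-loader internals):

    #include <stdio.h>

    /* Illustration of the requested behavior: each client starts its fetch
     * cycle at a different URL, so a URL set larger than the server's file
     * cache keeps the server reading from disk. */
    static int pick_start_url(int client_index, int num_urls)
    {
        /* Clients 0..N-1 start at URLs 0..N-1; client N wraps back to URL 0. */
        return client_index % num_urls;
    }

    int main(void)
    {
        const int num_urls = 3;

        for (int client_index = 0; client_index < 7; client_index++)
            printf("client %d starts at URL %d\n",
                   client_index, pick_start_url(client_index, num_urls));
        return 0;
    }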
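A minimal sketch of a URL-section entry using the two tags named above. The
value semantics (a percentage for FETCH_PROBABILITY, a 0/1 flag for
FETCH_PROBABILITY_ONCE) and the TIMER_AFTER_URL_SLEEP interval syntax are
assumptions about the 0.40 configuration format, not confirmed by this
message:

    ########### URL SECTION ####################
    URL=http://localhost/file-1.bin
    URL_SHORT_NAME="file-1"
    REQUEST_TYPE=GET
    # Assumed: fetch this URL in roughly 30% of cycles.
    FETCH_PROBABILITY=30
    # Assumed: 1 = decide once per client whether to fetch the URL at all,
    # 0 = re-draw the probability on every cycle.
    FETCH_PROBABILITY_ONCE=0
    # Assumed interval syntax: random sleep of 100-200 msec after this URL.
    TIMER_AFTER_URL_SLEEP=100-200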