RE: Best way to mimic the load we get normally.
From: Greg P. <gr...@sl...> - 2009-06-08 19:11:29
>> Could you clarify a bit more, what do you have and what do you wish?
>> Thanks!

If I understand you correctly, you want to know the target load I'm
looking to test? Really, we see about 6,000-10,000 concurrent users
browsing around the site at peak time. I would like to start my testing
at about 1,000 users just clicking through a set of 8 URLs, one page
every few seconds, just to get a sense of the query caching / opcode
caching performance.

When I've run the test with as many as 3,000 clients in the
configuration I sent, I was getting very good response times in the
test; however, when I tried to click around the site in a browser, I was
getting much worse results.

>> If you wish to have some peaks in your load, you may place not a
>> random, but some rigid time in TIMER_AFTER_URL_SLEEP=

I did have this at first; I was just thinking that a small random timer
between requests would be more realistic (and wouldn't hurt my cluster
of web servers as much ;) ). But am I correct in my assumption of what
it's actually doing? Ramping up to 1,000 users will start sending 1,000
requests to each URL in the list when it hits the max clients?

I'm sorry if I'm just not asking the right questions. Thanks for your
quick response and help so far.

Regards,

Gregory Patmore
Systems Architect
Slingo Inc.
411 Hackensack Ave., Hackensack, NJ 07601
(P) 201.489.6727 - (F) 201.489.6728
http://www.slingo.com

________________________________
From: Robert Iakobashvili [mailto:cor...@gm...]
Sent: Monday, June 08, 2009 2:56 PM
To: curl-loader-devel
Subject: Re: Best way to mimic the load we get normally.

Hi Gregory,

On Mon, Jun 8, 2009 at 9:45 PM, Greg Patmore <gr...@sl...> wrote:

> The urls are all configured like so:
>
> URL=http://mydomain.com/firstpage.html
> URL_SHORT_NAME="firstpage"
> REQUEST_TYPE=GET
> TIMER_URL_COMPLETION=0
> TIMER_AFTER_URL_SLEEP=0-5000
> ... etc ...
>
> And the command I'm running the test with is:
>
> ./curl-loader -f ./conf-examples/1k-clients.conf -v -u -t 2
>
> So when I run the test, sometimes it seems that it's sending the
> number of client requests to each URL at the same time, which for 8
> URL entries would equate to around 8,000 requests per second once it
> hits the max. What I'm trying to do is show a load of 1,000 users on
> the site at the same time, clicking to a new page every few seconds.
> Am I doing it wrong?

Could you clarify a bit more, what do you have and what do you wish?
Thanks!

If you wish to have some peaks in your load, you may place not a random,
but some rigid time in TIMER_AFTER_URL_SLEEP=

If you wish to increase the element of randomness in clicking pages,
look at FETCH_PROBABILITY.

> Also, is it more realistic if I use the -r switch?

No.

> Gregory Patmore

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
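
For anyone following the thread, here is a minimal batch-file sketch
along the lines discussed above: 1,000 clients cycling through a set of
pages with a rigid think time between clicks, per Robert's suggestion.
It follows the layout of the conf-examples files shipped with
curl-loader; the batch name, interface, IP range, domain, and page names
below are placeholders, so adjust them to your own setup.

    ########### GENERAL SECTION ################################
    # Placeholder batch name
    BATCH_NAME=1k-fixed-think
    CLIENTS_NUM_MAX=1000
    # Start with 100 clients and add 50 per second up to the max
    CLIENTS_NUM_START=100
    CLIENTS_RAMPUP_INC=50
    # Placeholder NIC and client IP range for the load generator
    INTERFACE=eth0
    NETMASK=16
    IP_ADDR_MIN=192.168.1.1
    IP_ADDR_MAX=192.168.4.255
    # Loop through the URL list until stopped
    CYCLES_NUM=-1
    URLS_NUM=8

    ########### URL SECTION ####################################
    URL=http://mydomain.com/firstpage.html
    URL_SHORT_NAME="firstpage"
    REQUEST_TYPE=GET
    TIMER_URL_COMPLETION=0
    # Rigid 3-second pause instead of the random 0-5000 range
    TIMER_AFTER_URL_SLEEP=3000

    # ... seven more URL entries in the same form ...

With a rigid TIMER_AFTER_URL_SLEEP, each client fetches one page and
then pauses 3 seconds before the next, so 1,000 clients should behave
like 1,000 users each clicking a page every few seconds rather than
8,000 simultaneous requests at the ramp-up peak.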
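And a hypothetical URL-section fragment showing the FETCH_PROBABILITY
tag Robert mentions for adding randomness to which pages get clicked.
If I read the documentation right, it sets the percentage chance that a
client fetches the URL on a given cycle; the URL and the value of 40
below are made-up examples:

    URL=http://mydomain.com/promo.html
    URL_SHORT_NAME="promo"
    REQUEST_TYPE=GET
    # Assumed semantics: fetched by roughly 40% of clients per cycle
    FETCH_PROBABILITY=40
    TIMER_AFTER_URL_SLEEP=0-5000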