curl-loader-devel Mailing List for curl-loader - web application testing (Page 14)
From: Robert I. <cor...@gm...> - 2010-05-26 10:13:06
Hi Sajal,

On Wed, May 26, 2010 at 11:52 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I want to run a test against a web server (nginx) with traffic coming from
> the maximum possible number of clients, each with a different IP address.
> I am running curl-loader on a physical machine with 2 dual-core processors
> (i.e. 4 cores) and 8 GB of memory. The machine has 2 gigabit Ethernet
> cards. I was wondering how to make the most efficient use of these
> resources. I have come up with a couple of options and would need some
> help in choosing the most appropriate one to get the maximum performance.
>
> 1. Use all the resources together and run the curl-loader scripts with
> -t 4 and a total number of clients somewhere around 150 K. Considering
> each client consumes around 40 KB of memory, the theoretical maximum will
> be (8*1024*1024) / 40, somewhere around 210 K. But since we do need some
> spare memory, we choose approx 150 K.

1. Looks good, with the exception that I have doubts regarding VM usage.
2. Ensure that you know how to configure nginx to support 150 K connections;
by default it is 1024. Note that for each IP address you have 64 K ports.
Ensure that the nginx machine can withstand such load. All the above
questions are for the nginx mailing list.

> 2. Divide the resources into different parts using virtualization and run
> one instance of curl-loader on each virtual machine. For example, have two
> virtual machines, each with one dual-core CPU (dedicated/reserved) and
> 4 GB of memory (dedicated/reserved), and run one instance of curl-loader
> on each with -t 2 and approximately 75 K clients per machine.
>
> The first option, I reckon, would only use one Ethernet card, but the
> second option would use both.

Also good, but see the previous option.

> I also want to have a heavy load in terms of data rate or packets per
> second, so what should be the appropriate values for TIME_URL_COMPLETION
> and TIME_AFTER_URL_SLEEP?

Depends on your server's power, file size, etc. Play with it.

> Awaiting the response.
>
> Thanks and Regards
> ----
> Sajal Bhatia
> Research Masters Student
> QUT, Brisbane
> AUSTRALIA

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
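To make the setup discussed above concrete: 8 GB is roughly 8*1024*1024 = 8,388,608 KB, and at about 40 KB per client that gives a ceiling near 210 K clients, so 150 K leaves headroom. The batch-file sketch below shows how such a load is usually described to curl-loader. The tag names follow the sample configurations shipped with curl-loader (the conf-examples directory); the thread writes TIME_URL_COMPLETION / TIME_AFTER_URL_SLEEP while the sample configs as I recall them use TIMER_URL_COMPLETION / TIMER_AFTER_URL_SLEEP, so treat the exact tag names, the interface, and the address range as assumptions to verify against your own version.

    ########### GENERAL SECTION ################################
    # Sketch only -- tag names, interface and address range are assumptions.
    BATCH_NAME=150k-nginx
    CLIENTS_NUM_MAX=150000        # ~40 KB per client, well under the ~210 K ceiling on 8 GB
    CLIENTS_NUM_START=1000
    CLIENTS_RAMPUP_INC=500
    INTERFACE=eth0
    NETMASK=8
    IP_ADDR_MIN=10.0.0.1          # one unique source IP per client; a /16 holds only
    IP_ADDR_MAX=10.2.85.255       # ~65 K addresses, so a wider private block is used here
    CYCLES_NUM=-1
    URLS_NUM=1

    ########### URL SECTION ####################################
    URL=http://<nginx-server>/index.html
    URL_SHORT_NAME="index"
    REQUEST_TYPE=GET
    TIMER_URL_COMPLETION=5000     # ms; tune to server power and file size
    TIMER_AFTER_URL_SLEEP=500     # ms; a smaller sleep raises the request rate

Running with -t 4 on the 4-core box lets each thread drive a share of the clients, and Robert's server-side point still applies: the stock nginx limit of 1024 connections (worker_connections) has to be raised to cover 150 K concurrent connections for the test to be meaningful.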
From: SAJAL B. <s.b...@qu...> - 2010-05-26 08:52:59
Hi,

I want to run a test against a web server (nginx) with traffic coming from the maximum possible number of clients, each with a different IP address. I am running curl-loader on a physical machine with 2 dual-core processors (i.e. 4 cores) and 8 GB of memory. The machine has 2 gigabit Ethernet cards. I was wondering how to make the most efficient use of these resources. I have come up with a couple of options and would need some help in choosing the most appropriate one to get the maximum performance.

1. Use all the resources together and run the curl-loader scripts with -t 4 and a total number of clients somewhere around 150 K. Considering each client consumes around 40 KB of memory, the theoretical maximum will be (8*1024*1024) / 40, somewhere around 210 K. But since we do need some spare memory, we choose approx 150 K.

2. Divide the resources into different parts using virtualization and run one instance of curl-loader on each virtual machine. For example, have two virtual machines, each with one dual-core CPU (dedicated/reserved) and 4 GB of memory (dedicated/reserved), and run one instance of curl-loader on each with -t 2 and approximately 75 K clients per machine.

The first option, I reckon, would only use one Ethernet card, but the second option would use both.

I also want to have a heavy load in terms of data rate or packets per second, so what should be the appropriate values for TIME_URL_COMPLETION and TIME_AFTER_URL_SLEEP?

Awaiting the response.

Thanks and Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA
From: SAJAL B. <s.b...@qu...> - 2010-05-21 05:00:34
Hi,

I have already read this, but I wanted to know if there is any theoretical maximum for my specifications. Secondly, I am not sure why the documentation says that if you have an N-core machine then for multi-threading it should be either -t N or -t 2N. I have done a test on a dual-core machine and specified -t 6 and it worked, so where does -t 2N come from?

Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Friday, 21 May 2010 2:41 PM
To: curl-loader-devel
Subject: Re: Regarding the multi threading option in curl loader

On Fri, May 21, 2010 at 3:27 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I have a couple of questions:
>
> 1. For a machine with one dual-core Xeon processor and 4 GB of memory,
> what is the theoretical maximum for the number of clients (each with a
> different IP address)?
> 2. While running curl-loader in the above environment (2-core processor
> and 4 GB of memory), why can the multi-threading option only be -t 2 or
> -t 4? Can I use something like -t 3 or -t 5?

http://curl-loader.sourceforge.net/doc/faq.html#big-load
http://curl-loader.sourceforge.net/high-load-hw/index.html

Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: Robert I. <cor...@gm...> - 2010-05-21 04:41:42
On Fri, May 21, 2010 at 3:27 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I have a couple of questions:
>
> 1. For a machine with one dual-core Xeon processor and 4 GB of memory,
> what is the theoretical maximum for the number of clients (each with a
> different IP address)?
> 2. While running curl-loader in the above environment (2-core processor
> and 4 GB of memory), why can the multi-threading option only be -t 2 or
> -t 4? Can I use something like -t 3 or -t 5?

http://curl-loader.sourceforge.net/doc/faq.html#big-load
http://curl-loader.sourceforge.net/high-load-hw/index.html

Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-21 00:29:22
Hi,

I have a couple of questions:

1. For a machine with one dual-core Xeon processor and 4 GB of memory, what is the theoretical maximum for the number of clients (each with a different IP address)?
2. While running curl-loader in the above environment (2-core processor and 4 GB of memory), why can the multi-threading option only be -t 2 or -t 4? Can I use something like -t 3 or -t 5?

Thanks and Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Wednesday, 19 May 2010 12:20 AM
To: curl-loader-devel
Subject: Re: Regarding the multi threading option in curl loader

Hi,

On Tue, May 18, 2010 at 2:40 PM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I want to know if one thread of curl-loader can use more than one CPU if
> it is run on a machine with a multi-core CPU?

At a computer with N CPUs (or N cores) run -t N or -t 2*N.
For more details read the FAQs, README and
http://curl-loader.sourceforge.net/high-load-hw/index.html

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: Robert I. <cor...@gm...> - 2010-05-19 05:14:45
On Wed, May 19, 2010 at 3:12 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> Actually my question was: can one thread of curl-loader use more than one
> CPU? I am using a dual-core processor machine, and when I don't specify
> the option -t 2 (since it is dual-core) and monitor the CPU utilization
> for different numbers of virtual clients (1K, 2K, 5K, etc.), the CPU
> utilization does not go beyond 100%. It reaches around 98% with some 20K
> clients, and after that the program just crashes.

1. Monitor the memory usage;
2. Don't use a VM, but a real Linux. Forget about Windows.

> So I am assuming that given my machine's specifications and my
> configuration file this is some sort of upper bound. But what I wanted to
> know is: if with these many clients (20K) it is using almost one full CPU
> (~98%), then by increasing the number of clients any further shouldn't it
> use the other available CPU?

No.
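The terse answer above deserves one clarifying note: a single thread running one event loop can keep at most one core busy, which is why CPU utilization tops out near 100% of one core no matter how many clients are configured, and -t N simply starts N such loops, each driving its own share of the clients. The sketch below illustrates that model only; it is not curl-loader's actual source, and the names in it are made up.

    #include <pthread.h>
    #include <stdio.h>

    #define NUM_THREADS   2        /* e.g. -t 2 on a dual-core machine */
    #define TOTAL_CLIENTS 20000    /* clients are split among the threads */

    /* Each thread runs its own loop over its own slice of clients.
     * One such loop is bound to a single core; only multiple threads
     * (or processes) let the other cores take part in the load. */
    static void *client_loop(void *arg)
    {
        long slice = (long)arg;
        long per_thread = TOTAL_CLIENTS / NUM_THREADS;
        printf("thread %ld drives clients %ld..%ld\n",
               slice, slice * per_thread, (slice + 1) * per_thread - 1);
        /* ... poll sockets and drive the libcurl handles of this slice ... */
        return NULL;
    }

    int main(void)
    {
        pthread_t tids[NUM_THREADS];
        for (long i = 0; i < NUM_THREADS; i++)
            pthread_create(&tids[i], NULL, client_loop, (void *)i);
        for (long i = 0; i < NUM_THREADS; i++)
            pthread_join(tids[i], NULL);
        return 0;
    }

The crash at ~20 K clients in a single loop is therefore more likely a memory or per-process limit than a CPU ceiling, which is what the advice to monitor memory and avoid a VM is pointing at.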
From: SAJAL B. <s.b...@qu...> - 2010-05-19 00:20:52
Hi,

Actually my question was: can one thread of curl-loader use more than one CPU? I am using a dual-core processor machine, and when I don't specify the option -t 2 (since it is dual-core) and monitor the CPU utilization for different numbers of virtual clients (1K, 2K, 5K, etc.), the CPU utilization does not go beyond 100%. It reaches around 98% with some 20K clients, and after that the program just crashes.

So I am assuming that given my machine's specifications and my configuration file this is some sort of upper bound. But what I wanted to know is: if with these many clients (20K) it is using almost one full CPU (~98%), then by increasing the number of clients any further shouldn't it use the other available CPU?

Hope I am clear in my question.

Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Wednesday, 19 May 2010 12:20 AM
To: curl-loader-devel
Subject: Re: Regarding the multi threading option in curl loader

Hi,

On Tue, May 18, 2010 at 2:40 PM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I want to know if one thread of curl-loader can use more than one CPU if
> it is run on a machine with a multi-core CPU?

At a computer with N CPUs (or N cores) run -t N or -t 2*N.
For more details read the FAQs, README and
http://curl-loader.sourceforge.net/high-load-hw/index.html

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: Pranav D. <pra...@gm...> - 2010-05-18 19:55:53
On Mon, May 17, 2010 at 10:28 PM, Robert Iakobashvili <cor...@gm...> wrote:
> Hi Pranav,
>
> On Thu, May 13, 2010 at 12:21 AM, Pranav Desai <pra...@gm...> wrote:
>> Hello,
>>
>> I wanted to get the total response times of the objects individually,
>> since we have large objects in the test cases for which the first-byte
>> response time does not give the whole picture of the performance.
>>
>> I have attached a patch which does that, but I am not sure if I used the
>> right place (load_next_step) to update the stats, so I would appreciate
>> it if you could review it. I can add the docs for it. Maybe someone else
>> might be able to use it.
>>
>> Basically, it updates the stats per URL in load_next_step using:
>>
>> curl_easy_getinfo (cctx->handle, CURLINFO_TOTAL_TIME, &resp_time);
>>
>> and updates the ops log. So someone can parse the ops file to see how a
>> particular URL performed over a period of time, or plot the overall
>> performance of all the objects in a webpage/site, which is indicated by
>> the last operational section in ops.
>>
>> It generates the following output in *.ops:
>>
>> Operations:  Success  Failed  Timed out  Avg Resp. Time(secs)
>> URL0:  1   8  0  0  0  0  0.010  0.012
>> URL1:  1   8  0  0  0  0  0.009  0.007
>> URL2:  1   8  0  0  0  0  0.007  0.007
>> URL3:  1   7  0  0  0  0  8.828  8.882
>> Operations:  Success  Failed  Timed out  Avg Resp. Time(secs)
>> URL0:  2  10  0  0  0  0  0.012  0.012
>> URL1:  2  10  0  0  0  0  0.007  0.007
>> URL2:  2  10  0  0  0  0  0.007  0.007
>> URL3:  1   8  0  0  0  0  8.903  8.885
>>
>> and a log entry in *.log, which I don't think is necessary, but I used
>> it to validate the results so kept it in:
>>
>> 23869 1 3 1 !! END resptime=8.835
>> 25044 1 3 2 !! END resptime=8.935
>> 25892 2 0 1 !! END resptime=0.012
>
> My feeling is that it has a more narrow usage. Let's wait till more people
> ask for that.
>
> Thank you for the proposal and the good will to share!

Sure. Let me know if you want me to change anything when needed.

-- Pranav
From: Robert I. <cor...@gm...> - 2010-05-18 14:20:25
Hi,

On Tue, May 18, 2010 at 2:40 PM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I want to know if one thread of curl-loader can use more than one CPU if
> it is run on a machine with a multi-core CPU?

At a computer with N CPUs (or N cores) run -t N or -t 2*N.
For more details read the FAQs, README and
http://curl-loader.sourceforge.net/high-load-hw/index.html

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-18 11:41:06
Hi,

I want to know if one thread of curl-loader can use more than one CPU if it is run on a machine with a multi-core CPU?

Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA
From: Robert I. <cor...@gm...> - 2010-05-18 05:33:29
Hi Sajal,

On Thu, May 13, 2010 at 9:12 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> Can someone pinpoint the location to insert some sort of random function
> to randomize the IP addresses assigned to each client within the given
> range?
>
> Thanks
> ----
> Sajal Bhatia
> Research Masters Student
> QUT, Brisbane
> AUSTRALIA

First, the IP addresses are created by curl-loader in the kernel at a certain network adapter. When client objects are initialized, there is a process of binding IP addresses to clients. Find that place and take the IP addresses in a random fashion instead of the linear way it is done now.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
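Robert does not name the exact function, so what follows is only a sketch of the idea, with hypothetical names rather than the real curl-loader symbols: instead of binding client k to the k-th address of the configured range, build a randomly permuted index over the address table once and bind through it.

    #include <stdlib.h>
    #include <time.h>

    /* Hypothetical helper: produce a random permutation of 0..ip_count-1
     * (Fisher-Yates shuffle) so clients bind the batch's IP addresses in
     * random rather than linear order. */
    static void shuffle_ip_indices(size_t *idx, size_t ip_count)
    {
        srand((unsigned)time(NULL));
        for (size_t i = 0; i < ip_count; i++)
            idx[i] = i;
        for (size_t i = ip_count; i > 1; i--) {
            size_t j = (size_t)rand() % i;   /* pick from the not-yet-fixed prefix */
            size_t tmp = idx[i - 1];
            idx[i - 1] = idx[j];
            idx[j] = tmp;
        }
    }

At client-initialization time one would then bind client k to ip_table[idx[k]] instead of ip_table[k]; ip_table and the initialization routine are stand-ins for whatever the binding code in the source actually calls them.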
From: Robert I. <cor...@gm...> - 2010-05-18 05:28:32
Hi Pranav,

On Thu, May 13, 2010 at 12:21 AM, Pranav Desai <pra...@gm...> wrote:
> Hello,
>
> I wanted to get the total response times of the objects individually,
> since we have large objects in the test cases for which the first-byte
> response time does not give the whole picture of the performance.
>
> I have attached a patch which does that, but I am not sure if I used the
> right place (load_next_step) to update the stats, so I would appreciate
> it if you could review it. I can add the docs for it. Maybe someone else
> might be able to use it.
>
> Basically, it updates the stats per URL in load_next_step using:
>
> curl_easy_getinfo (cctx->handle, CURLINFO_TOTAL_TIME, &resp_time);
>
> and updates the ops log. So someone can parse the ops file to see how a
> particular URL performed over a period of time, or plot the overall
> performance of all the objects in a webpage/site, which is indicated by
> the last operational section in ops.
>
> It generates the following output in *.ops:
>
> Operations:  Success  Failed  Timed out  Avg Resp. Time(secs)
> URL0:  1   8  0  0  0  0  0.010  0.012
> URL1:  1   8  0  0  0  0  0.009  0.007
> URL2:  1   8  0  0  0  0  0.007  0.007
> URL3:  1   7  0  0  0  0  8.828  8.882
> Operations:  Success  Failed  Timed out  Avg Resp. Time(secs)
> URL0:  2  10  0  0  0  0  0.012  0.012
> URL1:  2  10  0  0  0  0  0.007  0.007
> URL2:  2  10  0  0  0  0  0.007  0.007
> URL3:  1   8  0  0  0  0  8.903  8.885
>
> and a log entry in *.log, which I don't think is necessary, but I used it
> to validate the results so kept it in:
>
> 23869 1 3 1 !! END resptime=8.835
> 25044 1 3 2 !! END resptime=8.935
> 25892 2 0 1 !! END resptime=0.012

My feeling is that it has a more narrow usage. Let's wait till more people ask for that.

Thank you for the proposal and the good will to share!

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-13 06:14:41
Hi,

Can someone pinpoint the location to insert some sort of random function to randomize the IP addresses assigned to each client within the given range?

Thanks
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Wednesday, 12 May 2010 7:00 PM
To: curl-loader-devel
Subject: Re: Regarding the randomization of source IP and content fetched

Hi,

On Wed, May 12, 2010 at 11:55 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> The last question was more like: does the current version of curl-loader
> have the ability to download recursively, or even just download everything
> that is referred to from a remote resource, be it an HTML page or an FTP
> directory listing, similar to the -r parameter in wget?

No, people are writing all the files and URLs explicitly.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: Pranav D. <pra...@gm...> - 2010-05-12 21:21:49
Hello,

I wanted to get the total response times of the objects individually, since we have large objects in the test cases for which the first-byte response time does not give the whole picture of the performance.

I have attached a patch which does that, but I am not sure if I used the right place (load_next_step) to update the stats, so I would appreciate it if you could review it. I can add the docs for it. Maybe someone else might be able to use it.

Basically, it updates the stats per URL in load_next_step using:

curl_easy_getinfo (cctx->handle, CURLINFO_TOTAL_TIME, &resp_time);

and updates the ops log. So someone can parse the ops file to see how a particular URL performed over a period of time, or plot the overall performance of all the objects in a webpage/site, which is indicated by the last operational section in ops.

It generates the following output in *.ops:

Operations:  Success  Failed  Timed out  Avg Resp. Time(secs)
URL0:  1   8  0  0  0  0  0.010  0.012
URL1:  1   8  0  0  0  0  0.009  0.007
URL2:  1   8  0  0  0  0  0.007  0.007
URL3:  1   7  0  0  0  0  8.828  8.882
Operations:  Success  Failed  Timed out  Avg Resp. Time(secs)
URL0:  2  10  0  0  0  0  0.012  0.012
URL1:  2  10  0  0  0  0  0.007  0.007
URL2:  2  10  0  0  0  0  0.007  0.007
URL3:  1   8  0  0  0  0  8.903  8.885

and a log entry in *.log, which I don't think is necessary, but I used it to validate the results so kept it in:

23869 1 3 1 !! END resptime=8.835
25044 1 3 2 !! END resptime=8.935
25892 2 0 1 !! END resptime=0.012

Thanks

-- Pranav
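For readers who have not used the libcurl call the patch is built on, the stand-alone program below shows what CURLINFO_TOTAL_TIME reports: the complete transfer time of a request, as opposed to a first-byte time such as CURLINFO_STARTTRANSFER_TIME (the kind of measurement Pranav says does not give the whole picture for large objects). It is an independent illustration of the API rather than the patch itself; the URL is a placeholder, and the fetched body goes to stdout since no write callback is set.

    #include <stdio.h>
    #include <curl/curl.h>

    /* Fetch one URL and print time-to-first-byte versus total transfer time. */
    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL *handle = curl_easy_init();
        if (!handle)
            return 1;

        curl_easy_setopt(handle, CURLOPT_URL, "http://example.com/index.html");

        CURLcode rc = curl_easy_perform(handle);
        if (rc == CURLE_OK) {
            double total = 0.0, first_byte = 0.0;
            curl_easy_getinfo(handle, CURLINFO_TOTAL_TIME, &total);
            curl_easy_getinfo(handle, CURLINFO_STARTTRANSFER_TIME, &first_byte);
            printf("first byte: %.3f s, total: %.3f s\n", first_byte, total);
        } else {
            fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(rc));
        }

        curl_easy_cleanup(handle);
        curl_global_cleanup();
        return 0;
    }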
From: Pranav D. <pra...@gm...> - 2010-05-12 17:13:16
On Wed, May 12, 2010 at 12:28 AM, Robert Iakobashvili <cor...@gm...> wrote:
> Hi Pranav,
>
> On Tue, May 11, 2010 at 11:53 PM, Pranav Desai <pra...@gm...> wrote:
>>>> I have tested it a bit with my testbed and it seems to work as
>>>> expected, so if you find any problems or if I have missed something
>>>> please let me know.
>>>
>>> Thanks a lot.
>>> Could you, please, provide patches for the man page and the README?
>>
>> Re-attached with the docs.
>
> Thanks, applied. Added you to our THANKS list.

Cool... thanks!
From: SAJAL B. <s.b...@qu...> - 2010-05-12 09:15:05
OK! Will do that. Thanks for your help.

Cheers,
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Wednesday, 12 May 2010 7:12 PM
To: curl-loader-devel
Subject: Re: Regarding the libcurl error

Hi,

On Wed, May 12, 2010 at 12:03 PM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I am running the load with -v and most of the entries in <batch_name>.log
> are giving "About to connect ()" to <Web Server IP address> as the
> message. Only a few (266 out of 1000, to be precise) are reading OK.
> I think if this had been a routing issue, I should not be getting those
> 266 successful requests?

Correct. This means either network or server-side problems (like the listening queue of your server's listen socket being full, or the network being slow - your VM is an issue). Please make a wireshark capture and analyze. Nobody can help you from remote. Note that Apache is not really a strong server unless configured very specially.

[...]

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: Robert I. <cor...@gm...> - 2010-05-12 09:12:31
Hi,

On Wed, May 12, 2010 at 12:03 PM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> I am running the load with -v and most of the entries in <batch_name>.log
> are giving "About to connect ()" to <Web Server IP address> as the
> message. Only a few (266 out of 1000, to be precise) are reading OK.
> I think if this had been a routing issue, I should not be getting those
> 266 successful requests?

Correct. This means either network or server-side problems (like the listening queue of your server's listen socket being full, or the network being slow - your VM is an issue). Please make a wireshark capture and analyze. Nobody can help you from remote. Note that Apache is not really a strong server unless configured very specially.

> [...]

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-12 09:07:23
Hi,

I am running the load with -v and most of the entries in <batch_name>.log are giving "About to connect ()" to <Web Server IP address> as the message. Only a few (266 out of 1000, to be precise) are reading OK. I think if this had been a routing issue, I should not be getting those 266 successful requests?

Thanks and Regards
---
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Wednesday, 12 May 2010 6:58 PM
To: curl-loader-devel
Subject: Re: Regarding the libcurl error

Hi Sajal,

On Wed, May 12, 2010 at 11:46 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> Can anyone explain what the err:1 entry in <batch_name>.ctx means and how
> I can resolve this error? I am getting some 700 of these messages per 1K
> clients (each with a unique IP address) that I am sending requests from.
> I am performing this test against a simple Apache-based web server running
> on Ubuntu Server. Each client is asking for the index.html page from that
> web server.

Run the load with -v and read <batch_name>.log.

Such errors are normally TCP/DNS errors. You should ensure that you have routing from your client addresses to the servers, and DNS resolving.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-12 09:01:55
Thanks!

----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Wednesday, 12 May 2010 7:00 PM
To: curl-loader-devel
Subject: Re: Regarding the randomization of source IP and content fetched

Hi,

On Wed, May 12, 2010 at 11:55 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> The last question was more like: does the current version of curl-loader
> have the ability to download recursively, or even just download everything
> that is referred to from a remote resource, be it an HTML page or an FTP
> directory listing, similar to the -r parameter in wget?

No, people are writing all the files and URLs explicitly.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: Robert I. <cor...@gm...> - 2010-05-12 09:00:10
Hi,

On Wed, May 12, 2010 at 11:55 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> The last question was more like: does the current version of curl-loader
> have the ability to download recursively, or even just download everything
> that is referred to from a remote resource, be it an HTML page or an FTP
> directory listing, similar to the -r parameter in wget?

No, people are writing all the files and URLs explicitly.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
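Since there is no wget-style recursion, emulating a full page fetch means listing the page and each embedded object as its own URL section in the batch file, as the sketch below does for a page with two assets. The tag names mirror curl-loader's sample configurations and the paths are placeholders, so treat both as assumptions to check against your version; the thread also mentions a URL-probability feature restored in the subversion tree, whose exact tag is not named here and is therefore left out of the sketch.

    # Sketch only: a page plus two embedded objects listed explicitly.
    URLS_NUM=3

    URL=http://<server>/index.html
    URL_SHORT_NAME="page"
    REQUEST_TYPE=GET
    TIMER_URL_COMPLETION=5000
    TIMER_AFTER_URL_SLEEP=0

    URL=http://<server>/css/site.css
    URL_SHORT_NAME="css"
    REQUEST_TYPE=GET
    TIMER_URL_COMPLETION=5000
    TIMER_AFTER_URL_SLEEP=0

    URL=http://<server>/img/logo.png
    URL_SHORT_NAME="logo"
    REQUEST_TYPE=GET
    TIMER_URL_COMPLETION=5000
    TIMER_AFTER_URL_SLEEP=1000    # pause before the next cycle of the page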
From: Robert I. <cor...@gm...> - 2010-05-12 08:58:51
Hi Sajal,

On Wed, May 12, 2010 at 11:46 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Hi,
>
> Can anyone explain what the err:1 entry in <batch_name>.ctx means and how
> I can resolve this error? I am getting some 700 of these messages per 1K
> clients (each with a unique IP address) that I am sending requests from.
> I am performing this test against a simple Apache-based web server running
> on Ubuntu Server. Each client is asking for the index.html page from that
> web server.

Run the load with -v and read <batch_name>.log.

Such errors are normally TCP/DNS errors. You should ensure that you have routing from your client addresses to the servers, and DNS resolving.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-12 08:58:16
Hi,

Thanks for your reply.

The last question was more like: does the current version of curl-loader have the ability to download recursively, or even just download everything that is referred to from a remote resource, be it an HTML page or an FTP directory listing, similar to the -r parameter in wget?

Thanks and Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

________________________________
From: Robert Iakobashvili [cor...@gm...]
Sent: Tuesday, 11 May 2010 9:23 PM
To: curl-loader-devel
Subject: Re: Regarding the randomization of source IP and content fetched

Hi Sajal,

On Tue, May 11, 2010 at 8:29 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> * Can we randomize the source IP within a given range? I mean, if we
> specify the IP address range to be from 172.168.1.1/16 to
> 172.168.255.254/16 and have the maximum number of clients be 10 K, can we
> assign random IP addresses to these clients from this range?

Not in the current version, but it is a rather small development.

> * Another question is related to randomizing the fetched content. Can we
> make each of these clients fetch some random pages from a web site?

If you'll take the version from the subversion, it has the URL probability feature restored. Look into the URL-probability.

> * Can we ask any 'n' number of clients from these 10 K clients to
> recursively fetch a given web page 'x' number of times?

There is no support for sub-batches (groups). You can either arrange another client computer for that purpose, or play with the URL-probability tag.

> The main motive is to create a web-based data set as close as possible to
> genuine traffic.

Sure, good questions. Patches are welcomed!

> Thanks and Regards
> ----
> Sajal Bhatia
> Research Masters Student
> QUT, Brisbane
> AUSTRALIA

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you
From: SAJAL B. <s.b...@qu...> - 2010-05-12 08:46:50
Hi,

Can anyone explain what the err:1 entry in <batch_name>.ctx means and how I can resolve this error? I am getting some 700 of these messages per 1K clients (each with a unique IP address) that I am sending requests from. I am performing this test against a simple Apache-based web server running on Ubuntu Server. Each client is asking for the index.html page from that web server.

Thanks and Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA
From: Robert I. <cor...@gm...> - 2010-05-12 07:28:31
Hi Pranav,

On Tue, May 11, 2010 at 11:53 PM, Pranav Desai <pra...@gm...> wrote:
>>> I have tested it a bit with my testbed and it seems to work as expected,
>>> so if you find any problems or if I have missed something please let me
>>> know.
>>
>> Thanks a lot.
>> Could you, please, provide patches for the man page and the README?
>
> Re-attached with the docs.

Thanks, applied. Added you to our THANKS list.

--
Truly,
Robert Iakobashvili, Ph.D.
www.ghotit.com
Assistive technology that understands you