curl-loader-devel Mailing List for curl-loader - web application testing (Page 15)
Status: Alpha
Brought to you by: coroberti
Archive of messages by month (message counts in parentheses; blank means no messages):

Year | Jan  | Feb  | Mar  | Apr  | May  | Jun  | Jul  | Aug  | Sep  | Oct  | Nov  | Dec
-----|------|------|------|------|------|------|------|------|------|------|------|------
2006 |      |      |      |      |      |      |      |      |      |      | (1)  |
2007 |      | (1)  | (7)  | (19) | (25) | (16) | (59) | (29) | (18) | (19) | (7)  | (29)
2008 | (6)  | (18) | (8)  | (27) | (26) | (5)  | (6)  |      | (9)  | (37) | (61) | (17)
2009 | (21) | (25) | (4)  | (2)  | (8)  | (15) | (18) | (23) | (10) | (16) | (14) | (22)
2010 | (23) | (8)  | (18) | (1)  | (34) | (23) | (11) | (1)  | (13) | (10) | (2)  | (8)
2011 |      | (7)  | (24) | (12) | (3)  | (2)  | (2)  |      | (5)  | (20) | (7)  | (11)
2012 | (12) | (5)  | (16) | (3)  |      | (5)  | (12) | (6)  |      |      | (8)  |
2013 | (1)  | (3)  | (5)  | (3)  | (1)  |      | (1)  | (2)  | (9)  |      | (8)  | (4)
2014 | (4)  |      | (1)  |      | (1)  |      |      |      | (4)  |      | (11) | (5)
2015 | (1)  |      | (11) | (3)  | (1)  | (1)  | (4)  | (1)  | (7)  | (4)  | (2)  |
2016 |      | (1)  |      |      |      |      | (1)  |      |      |      |      |
2018 |      |      |      |      | (1)  |      |      |      |      |      |      |
2020 |      |      |      | (1)  |      |      |      |      |      |      |      |
2022 | (1)  |      |      |      |      |      |      |      |      |      |      |
From: Pranav D. <pra...@gm...> - 2010-05-11 20:53:53

On Tue, May 11, 2010 at 10:58 AM, Robert Iakobashvili <cor...@gm...> wrote:
> Thanks a lot.
> Could you, please, provide patches for the man page and the README?

Re-attached with the docs.

Thanks
-- Pranav

From: Robert I. <cor...@gm...> - 2010-05-11 17:58:56

Hi Pranav,

On Tue, May 11, 2010 at 8:41 PM, Pranav Desai <pra...@gm...> wrote:
> I needed to ignore the content length for certain URLs (videos) for my
> test case. Without it curl-loader will report it as an error. I have
> attached a small patch for that.

Thanks a lot.
Could you, please, provide patches for the man page and the README?

-- Truly, Robert Iakobashvili, Ph.D.

From: Pranav D. <pra...@gm...> - 2010-05-11 17:41:36

Hi!

I needed to ignore the content length for certain URLs (videos) for my test
case. Without it curl-loader will report it as an error. I have attached a
small patch for that. I thought it might be useful to someone else. It
basically does this for the specified URL:

    curl_easy_setopt (handle, CURLOPT_IGNORE_CONTENT_LENGTH, 1);

I have tested it a bit with my testbed and it seems to work as expected, so
if you find any problems or if I have missed something please let me know.

Thanks
-- Pranav

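For reference, a minimal sketch of how a per-URL "ignore content length" flag
could be wired to this libcurl option. CURLOPT_IGNORE_CONTENT_LENGTH is a real
libcurl option; the url_context struct, its ignore_content_length field and
the setup_url_handle() helper are illustrative names only, not the actual
patch or the curl-loader sources.

    /* Sketch only: apply the flag when configuring a client's easy handle. */
    #include <curl/curl.h>

    typedef struct url_context
    {
        char* url;
        int   ignore_content_length;   /* set from a per-URL config tag */
    } url_context;

    static int setup_url_handle (CURL* handle, const url_context* uctx)
    {
        if (curl_easy_setopt (handle, CURLOPT_URL, uctx->url) != CURLE_OK)
            return -1;

        if (uctx->ignore_content_length)
        {
            /* Do not treat a missing or mismatching Content-Length as an error. */
            if (curl_easy_setopt (handle, CURLOPT_IGNORE_CONTENT_LENGTH, 1L) != CURLE_OK)
                return -1;
        }

        return 0;
    }
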
From: Robert I. <cor...@gm...> - 2010-05-11 11:23:20

Hi Sajal,

On Tue, May 11, 2010 at 8:29 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> Can we randomize the source IP within a given range?

Not in the current version, but it is a rather small development.

> Can we make each of these clients fetch some random pages from a web site?

If you take the version from subversion, it has the URL probability feature
restored. Look into URL-probability.

> Can we ask any 'n' number of clients from these 10 K clients to
> recursively fetch a given web page 'x' number of times?

There is no support for sub-batches (groups). You can either arrange another
client computer for that purpose, or play with the URL-probability tag.

Sure, good questions. Patches are welcome!

-- Truly, Robert Iakobashvili, Ph.D.

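For context, a minimal sketch of a configuration that weights URLs with the
probability feature mentioned above. The FETCH_PROBABILITY tag is the one
discussed later in this archive; the URLs, values and the assumption that it
takes a per-URL percentage are illustrative only and not verified against a
particular curl-loader release.

    ########### GENERAL SECTION ################################
    BATCH_NAME=url_probability_demo
    CLIENTS_NUM_MAX=100
    INTERFACE=eth0
    NETMASK=24
    IP_ADDR_MIN=192.168.0.10
    IP_ADDR_MAX=192.168.0.110
    CYCLES_NUM=10
    URLS_NUM=2

    ########### URL SECTION ####################################
    URL=http://example.com/index.html
    URL_SHORT_NAME="index"
    REQUEST_TYPE=GET
    FETCH_PROBABILITY=80        # fetched on roughly 80% of cycles (assumed semantics)
    TIMER_AFTER_URL_SLEEP=100

    URL=http://example.com/video.mp4
    URL_SHORT_NAME="video"
    REQUEST_TYPE=GET
    FETCH_PROBABILITY=20        # fetched on roughly 20% of cycles (assumed semantics)
    TIMER_AFTER_URL_SLEEP=100
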
From: SAJAL B. <s.b...@qu...> - 2010-05-11 05:29:43

Hi,

I have some questions:

* Can we randomize the source IP within a given range? I mean, if we specify
  the IP address range to be from 172.168.1.1/16 to 172.168.255.254/16 and
  set the maximum number of clients to 10 K, can we assign random IP
  addresses to these clients from this range?
* Another question is related to randomizing the fetched content. Can we make
  each of these clients fetch some random pages from a web site?
* Can we ask any 'n' number of clients from these 10 K clients to recursively
  fetch a given web page 'x' number of times?

The main motive is to create a web-based data set as close as possible to
genuine traffic.

Thanks and Regards
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

From: Robert I. <cor...@gm...> - 2010-05-10 08:30:54

Hi,

On Mon, May 10, 2010 at 10:54 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> You mentioned that it will need several web-servers against it, can you
> elaborate a little on this, as to why it would require several web-servers
> against it?

I doubt that a single server will withstand such a serious load. Apache is
not the strongest; read our FAQ pages. nginx and some other servers are more
powerful, but this is just advice.

> I want to really stress my static apache2 web server (possibly bring it
> down). Can you roughly suggest the configuration for this in terms of
> number of clients, ramp up, number of threads, etc.?

It depends. Start from some low number and increase. You need good
monitoring of your server.

-- Truly, Robert Iakobashvili, Ph.D.

From: SAJAL B. <s.b...@qu...> - 2010-05-10 07:54:30

Hi,

Thanks for such a prompt response. I know it will be less powerful on a VM
compared to a real machine, but I need to test in a virtual environment
first, to make sure it suits my purpose, before running it on a dedicated
real machine. You mentioned that it will need several web servers against it;
can you elaborate a little on why that would be required?

One more thing I wanted to ask: I want to really stress my static apache2
web server (possibly bring it down). Can you roughly suggest the
configuration for this in terms of number of clients, ramp up, number of
threads, etc.? I understand this question is highly network dependent, but I
just want a rough idea. The specifications of the environment I am working in:

Both source and destination are running on the same physical machine, but as
different VMs. The source VM has 4 virtual CPUs and 4 GB of virtual memory
and is running Fedora 12 (2.3.62). The target machine has 2 virtual CPUs and
512 MB of virtual memory, running Ubuntu 8.04 (server).

Thanks!
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

From: Robert I. <cor...@gm...> - 2010-05-10 07:01:59

Hi Sajal,

On Mon, May 10, 2010 at 9:53 AM, SAJAL BHATIA <s.b...@qu...> wrote:
> I have a question regarding the command line option -t <number of threads>.
> I am using a virtual machine with 4 virtual CPUs.

We never tried it on VMs. Obviously, it will be less powerful than on a real
Linux machine.

> If I mention -t 4 on the command line while executing curl-loader and I
> have specified 20K clients (within a specified address range) in my
> configuration file, will each of these 4 threads have 20K clients, or will
> the 20K clients be distributed between the 4 threads (5K per thread)?

It will be 5K users at each thread. I do not believe that a VM could work
with 20K users. You will also need a lot of memory, and you will need
several web servers against it.

Please read the FAQ below:
http://curl-loader.sourceforge.net/high-load-hw/index.html

-- Truly, Robert Iakobashvili, Ph.D.

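For reference, a minimal sketch of the invocation being discussed. The
configuration file name is a hypothetical example; the split follows the
answer above: with 20,000 clients in the configuration and -t 4 on the
command line, each of the 4 threads handles 20000 / 4 = 5000 clients.

    # Illustrative command line only; 20k-clients.conf is a made-up name.
    ./curl-loader -f 20k-clients.conf -t 4
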
From: SAJAL B. <s.b...@qu...> - 2010-05-10 06:53:33

Hi,

I have a question regarding the command line option -t <number of threads>.
I am using a virtual machine with 4 virtual CPUs. If I mention -t 4 on the
command line while executing curl-loader, and I have specified 20K clients
(within a specified address range) in my configuration file, will each of
these 4 threads have 20K clients, or will the 20K clients be distributed
between the 4 threads (5K per thread)? If the latter, will the distribution
be the first 5K to the first thread, the next 5K to the second, and so forth?

Thanks
----
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA

From: Gann, L. <leo...@si...> - 2010-04-19 11:47:16

Hello,

I want to test BASIC authentication (RFC 2617) with variable user credentials
in the HTTP header against a single web page. So far I have only been able to
test a kind of form-based authentication with variable user credentials
(with FORM_USAGE_TYPE=RECORDS_FROM_FILE) successfully.

That is, I would like to be able to test the following:

    WEB_AUTH_METHOD=BASIC
    WEB_AUTH_CREDENTIALS="RECORDS_FROM_FILE"   (user and password read from a file)

Thanks,
Leo Gann.

From: Robert I. <cor...@gm...> - 2010-03-27 14:58:40

Dear Harivinod Marimganti,

On Sat, Mar 27, 2010 at 5:34 PM, harivinod marimganti <sig...@gm...> wrote:
> I have a question regarding the IP spoofing functionality. I have tested
> curl-loader in my local system and it works fine, but I am unable to do
> the same with an internet web site URL. Is this possible with curl-loader?
> Am I missing something?

This is a popular issue. It is up to you to ensure that the addresses you are
using for IP_ADDR_MIN and IP_ADDR_MAX are routable to the external devices.
The addresses that you are using are not routable addresses. Please talk to
your system admin or network admin.

-- Truly, Robert Iakobashvili, Ph.D.

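For illustration, the relevant lines of such a configuration with source
addresses in a range that is actually routable between the load generator and
the target. The 10.0.5.0/24 range below is purely an assumed example; the
real range has to be allocated by the network admin, as noted above.

    # Illustrative only: IP_ADDR_MIN / IP_ADDR_MAX must lie in a subnet that
    # is routable from the load generator to the target web server.
    INTERFACE=eth1
    NETMASK=24
    IP_ADDR_MIN=10.0.5.10
    IP_ADDR_MAX=10.0.5.200
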
From: harivinod m. <sig...@gm...> - 2010-03-27 14:34:43

Hi,

I have a question regarding the IP spoofing functionality. I have tested
curl-loader in my local system and it works fine, but I am unable to do the
same with an internet web site URL.

Is this possible with curl-loader? Am I missing something?

The config file which I created is (the IPs are not mine, they are a random
pick):

    ########### GENERAL SECTION ################################
    BATCH_NAME=custom_hdrs2
    CLIENTS_NUM_MAX=50
    INTERFACE=eth1
    NETMASK=20
    IP_ADDR_MIN=192.168.10.10
    IP_ADDR_MAX=192.168.10.100   # Actually - this is for self-control
    CYCLES_NUM=1
    URLS_NUM=1

    ########### URL SECTION ####################################
    URL=http://www.curl.xxxxx.in/default.aspx
    URL_SHORT_NAME="Index-url"
    REQUEST_TYPE=GET
    HEADER="User-Agent: the string of my favorite browser"
    HEADER="Referer: The second custom header"
    TIMER_URL_COMPLETION=0       # In msec. When positive, enforced by cancelling the url fetch on timeout
    TIMER_AFTER_URL_SLEEP=50

Thanks and Regards

From: Emil W. <emi...@sn...> - 2010-03-25 09:36:04

Hi developers,

This is a feature request. I would like to be able to make my HTTP Basic
authentication requests using credentials read (randomly) from an external
file. This feature already exists for GET/POST form based submits
(FORM_RECORDS_FILE), but the systems I need to load test do not support that
as an authentication method.

Please feel free to reply or ask any questions necessary.

Regards,
-- Emil Waijers

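For context, a minimal sketch of the form-based mechanism that, per the
messages above, already reads credentials from a file. The URL, file name and
the FORM_STRING tag with its format are assumptions for illustration; only
FORM_USAGE_TYPE="RECORDS_FROM_FILE" and FORM_RECORDS_FILE are named in this
thread, and the requested Basic-auth equivalent does not exist.

    ########### URL SECTION ####################################
    URL=http://example.com/login
    URL_SHORT_NAME="login"
    REQUEST_TYPE=POST
    FORM_USAGE_TYPE="RECORDS_FROM_FILE"
    FORM_STRING="username=%s&password=%s"    # assumed tag and format, shown for illustration
    FORM_RECORDS_FILE="./credentials.cred"   # one "user:password"-style record per client
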
From: Robert I. <cor...@gm...> - 2010-03-22 05:29:59

Hi Val,

On Mon, Mar 22, 2010 at 6:13 AM, Val Shkolnikov <va...@nv...> wrote:
> Sorry, a bit busy. I might have a chance to integrate the new code into
> mine next week.

Sure, take your time.

-- Truly, Robert Iakobashvili, Ph.D.

From: Val S. <va...@nv...> - 2010-03-22 04:13:14

Hi Robert,

On Sun, March 21, 2010, Robert Iakobashvili <cor...@gm...> wrote:
> Have you had a chance to look at the version in svn and try the changes
> made? Thanks!

Sorry, a bit busy. I might have a chance to integrate the new code into mine
next week.

/Val

From: Robert I. <cor...@gm...> - 2010-03-21 07:02:13

Hi Val,

On Sun, Mar 14, 2010 at 8:54 AM, Robert Iakobashvili <cor...@gm...> wrote:
> Please, look at the latest version in svn.
>
> The array of url_fetch_decision was allocated per client only when
> FETCH_PROBABILITY_ONCE was defined. The array is used to cache the fetching
> decision and to decrease the calls to random() for high-load testing.
>
> In most of the cases, however, it will be of no use, and it was corrected
> not to use the url_fetch_decision array when it is not supposed to be
> allocated.

Have you had a chance to look at the version in svn and try the changes made?
Thanks!

-- Truly, Robert Iakobashvili, Ph.D.

From: Robert I. <cor...@gm...> - 2010-03-14 06:55:58

Hi Val,

On Tue, Mar 9, 2010 at 7:26 AM, Val Shkolnikov <va...@nv...> wrote:
> Attached is a fix for the above tag parsing that caused sporadic errors.
> The reason was that the strtok_r delimiter argument is supposed to be a
> string, not a character address.

Thanks, applied!

-- Truly, Robert Iakobashvili, Ph.D.

From: Robert I. <cor...@gm...> - 2010-03-14 06:55:00

Hi Val,

On Mon, Mar 8, 2010 at 7:16 AM, Val Shkolnikov <va...@nv...> wrote:
> There is a major bug in the curl-loader regarding use of the
> FETCH_PROBABILITY tag. In loader_fsm.c, svn rev 574, line 1359 is
>
>     cctx->url_curr_index = (size_t) url_next;
>
> The url index is updated but not the url pointer.
>
> While debugging this I also reworked the use of probability in the
> curl-loader to produce better pseudo-random numbers and also added a tag
> RANDOM_SEED that lets you control the seed. I am attaching the patch
> against svn rev 574.
>
> I also noticed that the tag FETCH_PROBABILITY_ONCE is not used, but will
> let you decide whether it's worth keeping.

Applied the patch, thanks!

Please, look at the latest version in svn.

The array of url_fetch_decision was allocated per client only when
FETCH_PROBABILITY_ONCE was defined. The array is used to cache the fetching
decision and to decrease the calls to random() for high-load testing.

In most of the cases, however, it will be of no use, and it was corrected not
to use the url_fetch_decision array when it is not supposed to be allocated.

Best wishes!
Robert Iakobashvili, Ph.D.

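For context, a minimal sketch of the decision caching described above: decide
once per client per URL whether to fetch, and reuse that decision on later
cycles instead of calling random() again. Only the url_fetch_decision array
and the FETCH_PROBABILITY / FETCH_PROBABILITY_ONCE tags come from the
messages; the struct layout and helper name are illustrative.

    /* Sketch only; field and function names other than url_fetch_decision
       are illustrative, not the curl-loader sources. */
    #include <stdlib.h>

    enum { DECISION_UNSET = 0, DECISION_FETCH = 1, DECISION_SKIP = 2 };

    struct client_sketch
    {
        char* url_fetch_decision;   /* one cached decision per URL, or NULL */
    };

    /* Returns non-zero if the client should fetch URL <url_index> this cycle.
       <probability> is a percentage (1-100) taken from FETCH_PROBABILITY. */
    static int should_fetch (struct client_sketch* cctx,
                             size_t url_index,
                             int probability)
    {
        /* Without the cache (FETCH_PROBABILITY_ONCE not set), roll every time. */
        if (!cctx->url_fetch_decision)
            return (random () % 100) < probability;

        /* With the cache, roll once and reuse the decision on later cycles. */
        if (cctx->url_fetch_decision[url_index] == DECISION_UNSET)
        {
            cctx->url_fetch_decision[url_index] =
                ((random () % 100) < probability) ? DECISION_FETCH : DECISION_SKIP;
        }

        return cctx->url_fetch_decision[url_index] == DECISION_FETCH;
    }
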
From: Robert I. <cor...@gm...> - 2010-03-09 05:29:56

Hi Val,

On Tue, Mar 9, 2010 at 7:26 AM, Val Shkolnikov <va...@nv...> wrote:
> Attached is a fix for the above tag parsing that caused sporadic errors.
> The reason was that the strtok_r delimiter argument is supposed to be a
> string, not a character address. The same bug is also in another part of
> the same source (comma). I also added line number printing to the error
> diagnostic.

Correct fix, thanks!

-- Truly, Robert Iakobashvili, Ph.D.

From: Val S. <va...@nv...> - 2010-03-09 05:26:10

Hi Robert,

Sorry for the flood of fixes, but this particular problem annoyed me for a
while, and I finally got around to it. Attached is a fix for the above tag
parsing that caused sporadic errors. The reason was that the strtok_r
delimiter argument is supposed to be a string, not a character address. The
same bug is also in another part of the same source (comma). I also added
line number printing to the error diagnostic.

Regards,
/Val

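For reference, a minimal sketch of the bug class described above (not the
actual curl-loader parsing code): strtok_r expects a NUL-terminated string of
delimiter characters, so passing the address of a single char reads past it
into adjacent memory and the effective delimiter set becomes unpredictable,
which matches the sporadic errors reported here.

    /* Sketch only; not the curl-loader sources. */
    #include <stdio.h>
    #include <string.h>

    static void parse_line (char* line)
    {
        char* saveptr = NULL;

        /* Buggy pattern: the second argument must be a string, so taking
           the address of a lone char is not a valid delimiter set.

           char separator = ':';
           char* token = strtok_r (line, &separator, &saveptr);
        */

        /* Correct pattern: pass a proper NUL-terminated delimiter string. */
        for (char* token = strtok_r (line, ":", &saveptr);
             token;
             token = strtok_r (NULL, ":", &saveptr))
        {
            printf ("token: %s\n", token);
        }
    }
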
From: Robert I. <cor...@gm...> - 2010-03-08 05:50:07

Hi Val,

On Mon, Mar 8, 2010 at 7:16 AM, Val Shkolnikov <va...@nv...> wrote:
> There is a major bug in the curl-loader regarding use of the
> FETCH_PROBABILITY tag. In loader_fsm.c, svn rev 574, line 1359 is
>
>     cctx->url_curr_index = (size_t) url_next;
>
> The url index is updated but not the url pointer. The result of this bug
> is that all urls are selected regardless of the probability, somewhat
> uniformly.

Thank you. I'll investigate when it was broken.

> While debugging this I also reworked the use of probability in the
> curl-loader to produce better pseudo-random numbers and also added a tag
> RANDOM_SEED that lets you control the seed.

Thank you very much for your reporting and the patch. I will look into these
issues.

-- Truly, Robert Iakobashvili, Ph.D.

From: Val S. <va...@nv...> - 2010-03-08 05:43:11

Hi Robert,

There is a major bug in the curl-loader regarding use of the
FETCH_PROBABILITY tag. In loader_fsm.c, svn rev 574, line 1359 is

    cctx->url_curr_index = (size_t) url_next;

The url index is updated but not the url pointer. The result of this bug is
that all urls are selected regardless of the probability, somewhat uniformly.

While debugging this I also reworked the use of probability in the
curl-loader to produce better pseudo-random numbers and also added a tag
RANDOM_SEED that lets you control the seed. I am attaching the patch against
svn rev 574. To use it:

    cd curl-loader
    patch -p3 <curl-loader.patch.574+vs6

I also noticed that the tag FETCH_PROBABILITY_ONCE is not used, but will let
you decide whether it's worth keeping.

Regards,
/Val

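For context, a minimal sketch of the index/pointer coupling behind the bug
described above. Only url_curr_index is a field name taken from the message;
the other fields and the helper are illustrative guesses, not the actual
loader_fsm.c structures.

    /* Sketch only; illustrative structures, not the curl-loader sources. */
    #include <stddef.h>

    struct url_sketch { const char* url; int fetch_probability; };

    struct client_sketch
    {
        size_t                   url_curr_index;  /* which URL this cycle      */
        const struct url_sketch* url_curr;        /* must stay in sync with it */
        const struct url_sketch* url_array;       /* all configured URLs       */
    };

    static void advance_to_url (struct client_sketch* cctx, size_t url_next)
    {
        /* Buggy pattern: updating the index alone leaves url_curr pointing
           at the previous URL, so the probability-based choice is ignored:

           cctx->url_curr_index = url_next;
        */

        /* Fixed pattern: update the index and the pointer together. */
        cctx->url_curr_index = url_next;
        cctx->url_curr       = &cctx->url_array[url_next];
    }
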
From: 王鹏 <sjz...@gm...> - 2010-03-03 15:24:54

Got it, thanks very much for your help.

On Wed, Mar 3, 2010 at 11:17 PM, Robert Iakobashvili <cor...@gm...> wrote:
> You wish to test the authentication of a proxy with thousands of users,
> where each user possesses his own unique credentials.
>
> Sorry, we do not have such functionality for proxy, whereas it exists for
> non-proxy.

From: Robert I. <cor...@gm...> - 2010-03-03 15:17:15

Dear WP,

On Wed, Mar 3, 2010 at 5:07 PM, 王鹏 <sjz...@gm...> wrote:
> Thanks for your quick response. For curl-loader I have searched the docs,
> but cannot find parameters/tags which accept thousands of users to realize
> multiple proxy auth. Could you point out how to set it?

Got it. You wish to test the authentication of a proxy with thousands of
users, where each user possesses his own unique credentials.

Sorry, we do not have such functionality for proxy, whereas it exists for
non-proxy.

-- Truly, Robert Iakobashvili, Ph.D.

From: 王鹏 <sjz...@gm...> - 2010-03-03 15:08:00

Hi Robert,

Thanks for your quick response. For curl-loader I have searched the docs, but
cannot find parameters/tags which accept thousands of users to realize
multiple proxy auth. Could you point out how to set it?

Thanks,
WP

On Wed, Mar 3, 2010 at 10:09 PM, Robert Iakobashvili <cor...@gm...> wrote:
> On Wed, Mar 3, 2010 at 3:56 PM, 王鹏 <sjz...@gm...> wrote:
>> I have a question about authenticating multiple users to an explicit
>> proxy. I don't know how curl-loader can take usernames and passwords from
>> the conf file so that I can do proxy auth with multiple users, instead of
>> setting only one user credential in the tag PROXY_AUTH_CREDENTIALS.
>>
>> The FAQ does not mention that scenario; it only has config for forms,
>> such as FORM_USAGE_TYPE="RECORDS_FROM_FILE".
>>
>> I can auth successfully with one user:
>>
>>     PROXY_AUTH_METHOD=NTLM
>>     PROXY_AUTH_CREDENTIALS=Username:password
>
> libcurl may be broken for NTLM. All other methods of authentication can be
> used for thousands of users and more.
