curl-loader-devel Mailing List for curl-loader - web application testing (Page 23)
Status: Alpha
Brought to you by: coroberti
Message counts by month:

Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec
---|---|---|---|---|---|---|---|---|---|---|---|---
2006 |  |  |  |  |  |  |  |  |  |  | 1 |
2007 |  | 1 | 7 | 19 | 25 | 16 | 59 | 29 | 18 | 19 | 7 | 29
2008 | 6 | 18 | 8 | 27 | 26 | 5 | 6 |  | 9 | 37 | 61 | 17
2009 | 21 | 25 | 4 | 2 | 8 | 15 | 18 | 23 | 10 | 16 | 14 | 22
2010 | 23 | 8 | 18 | 1 | 34 | 23 | 11 | 1 | 13 | 10 | 2 | 8
2011 |  | 7 | 24 | 12 | 3 | 2 | 2 |  | 5 | 20 | 7 | 11
2012 | 12 | 5 | 16 | 3 |  | 5 | 12 | 6 |  |  | 8 |
2013 | 1 | 3 | 5 | 3 | 1 |  | 1 | 2 | 9 |  | 8 | 4
2014 | 4 |  | 1 |  | 1 |  |  |  | 4 |  | 11 | 5
2015 | 1 |  | 11 | 3 | 1 | 1 | 4 | 1 | 7 | 4 | 2 |
2016 |  | 1 |  |  |  |  | 1 |  |  |  |  |
2018 |  |  |  |  | 1 |  |  |  |  |  |  |
2020 |  |  |  | 1 |  |  |  |  |  |  |  |
2022 | 1 |  |  |  |  |  |  |  |  |  |  |
From: Robert I. <cor...@gm...> - 2009-02-24 20:20:28
|
Hi Aron,

On Tue, Feb 24, 2009 at 9:56 PM, Bellorado, Aron <abe...@ve...> wrote:
> I am using curl-loader 0.47 against an Apache web server running HTTP 1.1. With the Apache web server "KeepAlive" config parameter set to OFF, curl-loader properly ramps up the number of clients based on the specific batch file parameters, as expected, with the Apache web server closing every connection after each transaction is made. The attached capture file "withKeepAliveOff.cap" shows this proper behavior using the curl-loader batch file shown below.
>
> When Apache was configured with "KeepAlive" set to ON, the web server does not close the TCP connection after a transaction is complete, leaving the connection open for the client for subsequent transactions. When curl-loader was run against an Apache web server with KeepAlive set to ON, curl-loader did not ramp up the clients correctly, not incrementing the IP address appropriately during the ramp-up time. curl-loader appears to be reusing existing client IP addresses (possibly because the connections are still open) instead of using incremented IP addresses, although the curl-loader log files show the IP addresses being incremented properly. The attached capture file "withKeepAliveOn.cap" shows this incorrect behavior using the same curl-loader batch file shown below. Any help would be greatly appreciated.

This behavior is inherited from libcurl. Try to configure the connections not to be re-used (there is a tag for that), but it will not necessarily be helpful to keep KA connections open against Apache.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
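If the tag Robert refers to is the per-URL FRESH_CONNECT option (an assumption — confirm the exact tag name against the documentation and conf-examples of your curl-loader release), a minimal sketch of a URL section with connection reuse disabled might look like this; the target URL is a placeholder, not taken from Aron's batch file:

# Sketch only: assumes a per-URL FRESH_CONNECT tag controls libcurl connection reuse.
URL=http://myserver.example.com/index.html    # placeholder target
URL_SHORT_NAME="no-reuse"
REQUEST_TYPE=GET
FRESH_CONNECT=1          # open a new connection for every request instead of reusing one
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0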
From: Robert I. <cor...@gm...> - 2009-02-13 05:25:28
|
Hi Tomas,

On Fri, Feb 13, 2009 at 3:08 AM, Tomas Weinfurt <twe...@ya...> wrote:
> I was looking for an option to split the load across multiple physical interfaces. I would like to generate more than 1 Gbit, but the INTERFACE option seems to take only one name. Any idea?
>
> It looks like there should be no fundamental problem with this, since curl-loader already supports multiple destinations. I was wondering whether it would make sense to turn the interface/IP info into a list/array, or whether it would make sense to have an option to override it in the URL section.
>
> Tomas

Well, it is just open source code, and you are more than welcome to add your patches and publish them for other users and, possibly, for mainline integration.

A load of more than 1 Gbit is not an issue if you have strong enough HW; the link below may be of some assistance:
http://curl-loader.sourceforge.net/high-load-hw/index.html

Without any patching you can run two instances of curl-loader from two console terminals, each loading via its own network interface. It will work even better on PCs with more than a single CPU/core, and I would recommend adding a small CPU-affinity patch to make each load CPU-independent. Make sure that you have enough memory.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
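As a concrete illustration of the two-instance approach, each curl-loader process can be pinned to its own CPU core from the shell with taskset instead of patching the source; the batch file names and core numbers below are placeholders, not values from this thread:

# Terminal 1: load via eth0, pinned to CPU core 0
taskset -c 0 ./curl-loader -f batch-eth0.conf

# Terminal 2: load via eth1, pinned to CPU core 1
taskset -c 1 ./curl-loader -f batch-eth1.conf

Each batch file would name its own interface (INTERFACE=eth0 or INTERFACE=eth1) and its own source-IP range, so the two loads do not collide.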
From: Tomas W. <twe...@ya...> - 2009-02-13 01:08:10
|
I was looking for an option to split the load across multiple physical interfaces. I would like to generate more than 1 Gbit, but the INTERFACE option seems to take only one name.

Any idea?

It looks like there should be no fundamental problem with this, since curl-loader already supports multiple destinations. I was wondering whether it would make sense to turn the interface/IP info into a list/array, or whether it would make sense to have an option to override it in the URL section.

Tomas
From: Ivor O. <ivo...@co...> - 2009-02-12 05:41:44
|
Security: [SEC=UNCLASSIFIED]

Thanks Frank, you were correct about the IP addresses. I didn't realize that they had to be valid, routable addresses within our network. It is connecting fine now.
From: Robert I. <cor...@gm...> - 2009-02-10 05:49:15
|
Hi Ivor,

On Tue, Feb 10, 2009 at 1:43 AM, Ivor Oorloff <ivo...@co...> wrote:
> Security:[SEC=UNCLASSIFIED]
>
> Hi,
> I was hoping to find a way to use curl-loader for load testing https websites that have a "testing" server certificate, i.e. the server certificate is self-generated with no valid or trusted root CA. This can be achieved in curl with the -k or --insecure option, which ignores CA confirmation. Is there a way to trigger this option in curl-loader?
>
> I have run the same URL in curl with the -k option and can access it with no problem. However, curl-loader will not access the site. (It works fine for non-SSL sites.)
>
> curl-loader gets a timeout condition as below:
>
> cedar-dev:/apps/oracle/curl-loader-0.46 # more login_uas_logoff_cycling.log
> 0 1 (194.90.71.215) :== Info: About to connect() to pssamembersuat.comsuper.gov.au port 443 (#0) : eff-url: , url:
> 0 1 (194.90.71.215) :== Info: Trying 152.91.36.101... : eff-url: , url:
> 0 1 (194.90.71.215) :== Info: Bind local address to 194.90.71.215 : eff-url: , url:
> 0 1 (194.90.71.215) :== Info: Local port: 45786 : eff-url: , url:
> 0 1 (194.90.71.215) !! ERROR: Connection time-out : eff-url: , url:
> 0 1 (194.90.71.215) :== Info: Closing connection #0 : eff-url: , url:

According to the error, it looks more like a networking problem. Try to run the loader adding -v (verbose) to the command line and post the logged errors.

Thanks!

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
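For reference, a verbose run as Robert suggests might look like the sketch below. The configuration file name is inferred from the BATCH_NAME in the log excerpt above and may differ; -d and -u are the additional logging options Frank mentions elsewhere in this thread:

# Re-run the batch with verbose client tracing, then inspect the per-client log
./curl-loader -f login_uas_logoff_cycling.conf -v
# Or, with the fuller set of logging options suggested in this thread:
./curl-loader -f login_uas_logoff_cycling.conf -d -v -u
less login_uas_logoff_cycling.log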
From: <Fra...@ma...> - 2009-02-10 05:46:39
|
Hi Ivor,

There is no issue for curl-loader to fetch https:// URLs with self-signed certificates. I have used curl-loader against an SSL apache2 with self-signed certs just fine. curl-loader does not check certificates; it applies the equivalent of curl -k:

loader.c: curl_easy_setopt (handle, CURLOPT_SSL_VERIFYPEER, 0);
loader.c: curl_easy_setopt (handle, CURLOPT_SSL_VERIFYHOST, 0);

However, I found these to be the most likely reasons for getting a URL timeout:

1. Your source IPs do not match. Before specifying any additional IPs, set the local interface IP as the only one and try it. I.e., if eth0 = 194.90.71.215, set:
   IP_ADDR_MIN=194.90.71.215
   IP_ADDR_MAX=194.90.71.215

2. Proxy issues. Depending on your environment, libcurl might or might not pick up the proxy. Check the output of the log file with the curl-loader options -d -v -u, and check especially that the target URL used by curl-loader is the one you intend to use. Set/unset the proxy using the libcurl environment variables or the -x option in curl-loader.

Also, it is good to start with a non-forms HTML page, something as simple as a static index.html. This way you can narrow down the issues until your config works. Running a parallel tcpdump window to check the connection might help, too, although with SSL you don't see the data.

Best Regards,
Frank
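To make the proxy check concrete: libcurl honours the standard proxy environment variables, so a quick test could look like the sketch below. The proxy host, port, and batch file name are hypothetical placeholders; -x is the curl-loader option Frank mentions for passing a proxy explicitly.

# Force a specific proxy via the libcurl environment variables (placeholder host/port):
export http_proxy=http://proxy.example.net:3128
export https_proxy=http://proxy.example.net:3128
./curl-loader -f mybatch.conf -v

# Or make sure no proxy is picked up at all:
unset http_proxy https_proxy
./curl-loader -f mybatch.conf -v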
From: Ivor O. <ivo...@co...> - 2009-02-09 23:44:26
|
Security: [SEC=UNCLASSIFIED]

Hi,

I was hoping to find a way to use curl-loader for load testing https websites that have a "testing" server certificate, i.e. the server certificate is self-generated with no valid or trusted root CA. This can be achieved in curl with the -k or --insecure option, which ignores CA confirmation. Is there a way to trigger this option in curl-loader?

I have run the same URL in curl with the -k option and can access it with no problem. However, curl-loader will not access the site. (It works fine for non-SSL sites.) curl and curl-loader appear to use the same libraries:

oracle@cedar-dev:~/curl-loader-0.46> ldd curl-loader
linux-gate.so.1 => (0xffffe000)
libdl.so.2 => /lib/libdl.so.2 (0xb7f6f000)
libpthread.so.0 => /lib/libpthread.so.0 (0xb7f5a000)
librt.so.1 => /lib/librt.so.1 (0xb7f51000)
libz.so.1 => /lib/libz.so.1 (0xb7f3f000)
libssl.so.0.9.8 => /usr/lib/libssl.so.0.9.8 (0xb7f01000)
libcrypto.so.0.9.8 => /usr/lib/libcrypto.so.0.9.8 (0xb7dd9000)
libc.so.6 => /lib/libc.so.6 (0xb7cb7000)
/lib/ld-linux.so.2 (0xb7f89000)

oracle@cedar-dev:~/curl-loader-0.46> which curl
/usr/bin/curl

oracle@cedar-dev:~/curl-loader-0.46> ldd /usr/bin/curl
linux-gate.so.1 => (0xffffe000)
libcurl.so.3 => /usr/lib/libcurl.so.3 (0xb7f7e000)
libidn.so.11 => /usr/lib/libidn.so.11 (0xb7f4e000)
libssl.so.0.9.8 => /usr/lib/libssl.so.0.9.8 (0xb7f11000)
libcrypto.so.0.9.8 => /usr/lib/libcrypto.so.0.9.8 (0xb7de9000)
libdl.so.2 => /lib/libdl.so.2 (0xb7de4000)
libz.so.1 => /lib/libz.so.1 (0xb7dd2000)
libc.so.6 => /lib/libc.so.6 (0xb7cb0000)
/lib/ld-linux.so.2 (0xb7fc8000)

curl-loader gets a timeout condition as below:

cedar-dev:/apps/oracle/curl-loader-0.46 # more login_uas_logoff_cycling.log
0 1 (194.90.71.215) :== Info: About to connect() to pssamembersuat.comsuper.gov.au port 443 (#0) : eff-url: , url:
0 1 (194.90.71.215) :== Info: Trying 152.91.36.101... : eff-url: , url:
0 1 (194.90.71.215) :== Info: Bind local address to 194.90.71.215 : eff-url: , url:
0 1 (194.90.71.215) :== Info: Local port: 45786 : eff-url: , url:
0 1 (194.90.71.215) !! ERROR: Connection time-out : eff-url: , url:
0 1 (194.90.71.215) :== Info: Closing connection #0 : eff-url: , url:

Conf file is:

########### GENERAL SECTION ################################
BATCH_NAME=login_uas_logoff_cycling
CLIENTS_NUM_MAX = 1
INTERFACE=eth1
NETMASK=24
IP_ADDR_MIN=194.90.71.215
IP_ADDR_MAX=194.90.71.216
CYCLES_NUM= 1
URLS_NUM=2

########### URL SECTION ##################################
### Login URL - only once for each client
# GET-part
URL=https://pssamembersuat.comsuper.gov.au/ICSLogin/?"https://pssamembersuat.comsuper.gov.au/comsuper_uat/members/login/_ac_login_p1/AC/_pid/login_p1?action=login"
URL_SHORT_NAME="Login-GET"
URL_DONT_CYCLE = 1
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 0   # In msec. Now it is enforced by cancelling url fetch on timeout
TIMER_AFTER_URL_SLEEP =0

# POST-part
URL=""
URL_USE_CURRENT= 1
URL_SHORT_NAME="Login-POST"
URL_DONT_CYCLE = 1
USERNAME=100000001614011961
PASSWORD=123123
REQUEST_TYPE=POST
FORM_USAGE_TYPE= SINGLE_USER
FORM_STRING= username=%s&password=%s   # Means the same credentials for all clients/users
TIMER_URL_COMPLETION = 0   # In msec. Now it is enforced by cancelling url fetch on timeout
TIMER_AFTER_URL_SLEEP =500

Any help would be appreciated.

Ivor
From: Robert I. <cor...@gm...> - 2009-01-30 09:43:58
|
Hi Sergei,

On Fri, Jan 30, 2009 at 10:42 AM, Sergei Sh <jun...@na...> wrote:
> Hi!
> I try to use curl-loader for such requests:
>
> curl -i -X POST -H "Content-Type: text/json" -d '[{"command":"connect","refid":1},{"command":"post", "refid":2, "channel":"chat", "data":"hello world 3"}]' http://media.mysite.com/qrpc
>
> How do I write the URL section? I don't need any form or data fields. With this section I get the error "post_data is NULL":
>
> URL=http://media.qik.com/qrpc
> URL_SHORT_NAME="1"
> HEADER="Content-Type: text/json"
> HEADER="Expect: "
> HEADER='[{"command":"connect","refid":1},{"command":"post", "refid":2, "channel":"chat", "data":"hello world 3"}]'
> REQUEST_TYPE=POST
> LOG_RESP_BODIES=1
> TIMER_URL_COMPLETION = 0
> TIMER_AFTER_URL_SLEEP =0
>
> I tried FORM and MULTIPART FORM and other options, with no success.
> Thanks
> --
> Sergei Sh

Please try to use the latest curl-loader version with the instructions from Alex, provided in the HOWTOS file in the doc directory:
http://curl-loader.svn.sourceforge.net/viewvc/curl-loader/trunk/curl-loader/doc/HOWTOS?view=markup

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
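The general direction of the HOWTOS Robert points to is to supply the request body via the FORM_STRING tag rather than a HEADER tag. A rough sketch of Sergei's URL section along those lines follows; it assumes a form-usage mode that passes FORM_STRING through unchanged (written here as AS_IS purely as an assumption — verify the exact mode name against the HOWTOS of your release):

URL=http://media.qik.com/qrpc
URL_SHORT_NAME="1"
HEADER="Content-Type: text/json"
HEADER="Expect: "
REQUEST_TYPE=POST
# Assumption: a pass-through form-usage mode; confirm the tag value in doc/HOWTOS.
FORM_USAGE_TYPE=AS_IS
FORM_STRING='[{"command":"connect","refid":1},{"command":"post", "refid":2, "channel":"chat", "data":"hello world 3"}]'
LOG_RESP_BODIES=1
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0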
From: Sergei Sh <jun...@na...> - 2009-01-30 09:00:07
|
Hi!

I try to use curl-loader for such requests:

curl -i -X POST -H "Content-Type: text/json" -d '[{"command":"connect","refid":1},{"command":"post", "refid":2, "channel":"chat", "data":"hello world 3"}]' http://media.mysite.com/qrpc

How do I write the URL section? I don't need any form or data fields. With this section I get the error "post_data is NULL":

URL=http://media.qik.com/qrpc
URL_SHORT_NAME="1"
HEADER="Content-Type: text/json"
HEADER="Expect: "
HEADER='[{"command":"connect","refid":1},{"command":"post", "refid":2, "channel":"chat", "data":"hello world 3"}]'
REQUEST_TYPE=POST
LOG_RESP_BODIES=1
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP =0

I tried FORM and MULTIPART FORM and other options, with no success.

Thanks
--
Sergei Sh
From: Robert I. <cor...@gm...> - 2009-01-29 05:00:27
|
Hi David,

On Wed, Jan 28, 2009 at 11:50 PM, David Hotchkiss <dho...@is...> wrote:
> Robert,
>
> I did a fresh compile of curl-loader v0.44 (aka stable)

Thanks. It seems that we should fix it. However, I am not sure about the schedule, sorry. Don't use this option in the meantime.

Sincerely,
Robert
From: David H. <dho...@is...> - 2009-01-28 22:18:02
|
Robert,

I did a fresh compile of curl-loader v0.44 (aka stable) and got the following results in three consecutive runs:

Test total duration was 82 seconds and CAPS average 2:
H/F Req:404,1xx:0,2xx:404,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:1ms,D-2xx:1ms,Ti:1346B/s,To:620B/s
H/F/S Req:0,1xx:0,2xx:0,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:0ms,D-2xx:0ms,Ti:0B/s,To:0B/s
Operations:        Success        Failed        Timed out
URL0:local         0   178        0   0         0   0
URL1:ACE           4   226        0   0         0   0

Test total duration was 68 seconds and CAPS average 2:
H/F Req:338,1xx:0,2xx:338,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:1ms,D-2xx:1ms,Ti:1359B/s,To:625B/s
H/F/S Req:0,1xx:0,2xx:0,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:0ms,D-2xx:0ms,Ti:0B/s,To:0B/s
Operations:        Success        Failed        Timed out
URL0:local         5   141        0   0         0   0
URL1:ACE           8   197        0   0         0   0

Test total duration was 72 seconds and CAPS average 2:
H/F Req:355,1xx:0,2xx:355,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:1ms,D-2xx:1ms,Ti:1347B/s,To:620B/s
H/F/S Req:0,1xx:0,2xx:0,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:0ms,D-2xx:0ms,Ti:0B/s,To:0B/s
Operations:        Success        Failed        Timed out
URL0:local         0   165        0   0         0   0
URL1:ACE           1   190        0   0         0   0

These seem closer to the v0.47 results than to the expected frequencies. Shall I test earlier versions?

Thanks,
//david
From: Richard B. <ric...@se...> - 2009-01-28 11:24:57
|
Unfortunately, due to harmonisation requirements on this project, that is not an option. We won't, therefore, be using your software, I'm afraid.

Good luck
From: Robert I. <cor...@gm...> - 2009-01-27 23:25:32
|
Hi David,

On Wed, Jan 28, 2009 at 1:05 AM, David Hotchkiss <dho...@is...> wrote:
> As you can see, the outcomes are significantly different from what I expected.
>
> Can you help shed some light on what is going on? Is the error in my expectations, the configuration, or curl-loader?

Looks like something may be broken in this version of curl-loader. Can you try some earlier version, e.g. curl-loader stable?

Thanks.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
From: David H. <dho...@is...> - 2009-01-27 23:05:33
|
Hi Robert,

Thanks for your quick response. Per your suggestion, I slightly modified './conf-examples/fetch-probability.conf' as follows:

=-=-=-=
########### GENERAL SECTION ################################

BATCH_NAME= fetch-prob
CLIENTS_NUM_MAX=5
INTERFACE =eth0
NETMASK=255.255.240.0
IP_ADDR_MIN= 10.56.1.1
IP_ADDR_MAX= 10.56.15.254

CYCLES_NUM= 100
URLS_NUM= 2

########### URL SECTION ####################################

URL=http://localhost:8080/index.html
URL_SHORT_NAME="local"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 1000
FETCH_PROBABILITY = 70

URL=http://localhost:8080/one.html
URL_SHORT_NAME="ACE"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 1000
FETCH_PROBABILITY = 30
=-=-=-=

With five clients and 100 cycles, I would expect that out of a potential 500 requests for each URL, the actual requests made during a curl-loader run would be something like this:

URL:local  ~350 total requests ( (5 clients * 100 cycles) * 0.7 )
URL:ACE    ~150 total requests ( (5 clients * 100 cycles) * 0.3 )

Here are the results from three different runs:

1)
Test total duration was 62 seconds and CAPS average 2:
H/F Req:309,1xx:0,2xx:309,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:1ms,D-2xx:1ms,Ti:1362B/s,To:627B/s
H/F/S Req:0,1xx:0,2xx:0,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:0ms,D-2xx:0ms,Ti:0B/s,To:0B/s
Operations:        Success        Failed        Timed out
URL0:local         6   132        0   0         0   0
URL1:ACE           8   177        0   0         0   0

2)
Test total duration was 69 seconds and CAPS average 1:
H/F Req:343,1xx:0,2xx:343,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:1ms,D-2xx:1ms,Ti:1360B/s,To:625B/s
H/F/S Req:0,1xx:0,2xx:0,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:0ms,D-2xx:0ms,Ti:0B/s,To:0B/s
Operations:        Success        Failed        Timed out
URL0:local         0   134        0   0         0   0
URL1:ACE           3   209        0   0         0   0

3)
Test total duration was 77 seconds and CAPS average 2:
H/F Req:382,1xx:0,2xx:382,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:1ms,D-2xx:1ms,Ti:1356B/s,To:624B/s
H/F/S Req:0,1xx:0,2xx:0,3xx:0,4xx:0,5xx:0,Err:0,T-Err:0,D:0ms,D-2xx:0ms,Ti:0B/s,To:0B/s
Operations:        Success        Failed        Timed out
URL0:local         5   168        0   0         0   0
URL1:ACE           7   214        0   0         0   0

As you can see, the outcomes are significantly different from what I expected.

Can you help shed some light on what is going on? Is the error in my expectations, the configuration, or curl-loader?

Thanks,

//david
From: Robert I. <cor...@gm...> - 2009-01-27 17:55:34
|
Dear Richard,

On Tue, Jan 27, 2009 at 7:20 PM, Richard Brice <ric...@se...> wrote:
> CURL-LOADER VERSION: 0.47, December 2, 2008
>
> HW DETAILS: CPU/S and memory are must: i686
>
> LINUX DISTRIBUTION and KERNEL (uname -r):
> $ uname -svr
> CYGWIN_NT-5.1 1.5.25(0.156/4/2) 2008-07-05 09:05
>
> GCC VERSION (gcc -v):
> Thread model: posix
> gcc version 3.4.4 (cygming special, gdc 0.12, using dmd 0.125)

Thank you for your PRF form. curl-loader does not support Cygwin, MinGW, or other emulation layers, nor VMs. This is explained in the FAQs. You need a PC running a real Linux.

Sincerely,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
From: Richard B. <ric...@se...> - 2009-01-27 17:20:50
|
CURL-LOADER VERSION: 0.47, December 2, 2008

HW DETAILS: CPU/S and memory are must: i686

LINUX DISTRIBUTION and KERNEL (uname -r):
$ uname -svr
CYGWIN_NT-5.1 1.5.25(0.156/4/2) 2008-07-05 09:05

GCC VERSION (gcc -v):
Thread model: posix
gcc version 3.4.4 (cygming special, gdc 0.12, using dmd 0.125)

COMPILATION AND MAKING OPTIONS (if defaults changed): unchanged

COMMAND-LINE: $make

CONFIGURATION-FILE (The most common source of problems):
Place the file inline here:

DOES THE PROBLEM AFFECT:
COMPILATION? Yes
LINKING? Not there yet
EXECUTION? Not there yet
OTHER (please specify)?

Have you run $make cleanall prior to $make ? yes

DESCRIPTION:
The make cannot find <bits/types.h>. My installation does not have that file. Where can I get it?

QUESTION/ SUGGESTION/ PATCH:
From: Robert I. <cor...@gm...> - 2009-01-27 16:57:58
|
Hi Richard,

On Tue, Jan 27, 2009 at 3:59 PM, Richard Brice <ric...@se...> wrote:
> I am having an issue compiling curl-loader. I think the issue is with finding openssl. What directory am I supposed to point to? I edited 'openssldir.sh' and have pointed the OPENSSLDIR variable to a number of places, but can't seem to get it to where it wants to be. I assume that variable should be pointing to the OpenSSL directory that has the OpenSSL include directory?
>
> At any rate, this is the error I get. Any help would be most appreciated.
>
> gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -g -I. -I./inc -I/usr/ssl/include -c -o obj/batch.o batch.c
> Assembler messages:
> Fatal error: can't create obj/batch.o: No such file or directory
> In file included from batch.c:23:
> fdsetsize.h:27:24: bits/types.h: No such file or directory
> make: *** [obj/batch.o] Error 2

Since we do not have your Problem-Report-Form, it is very difficult to help you, but I will try.

1. Check that you have, in the curl-loader-<version> directory, a subdirectory named obj. If you do not, create it with:
$ mkdir obj

2. Adjust the Makefile variables to point to the openssl headers and libraries. If you want to specify an openssl development directory with include files (e.g. crypto.h), export the environment variable OPENSSLDIR with the value of that directory. For example:
$ export OPENSSLDIR=the-full-path-to-the-directory

3. If you need more assistance, please kindly respect the mailing list rules and post the filled PRF, which you can find in your curl-loader-<version> directory.

Best wishes!

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
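Putting Robert's two steps together, a typical fix sequence might look like the sketch below. The /usr/local/ssl path is only an example of where an OpenSSL development tree might live, not a value from the thread; substitute the directory that actually contains the OpenSSL include files on your system.

cd curl-loader-0.47                  # your unpacked curl-loader-<version> directory
mkdir -p obj                         # make sure the object directory exists
export OPENSSLDIR=/usr/local/ssl     # example path; must contain the OpenSSL headers
make cleanall
make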
From: Richard B. <ric...@se...> - 2009-01-27 14:32:50
|
I am having an issue compiling curl-loader. I think the issue is with finding openssl. What directory am I supposed to point to? I edited 'openssldir.sh' and have pointed the OPENSSLDIR variable to a number of places, but can't seem to get it to where it wants to be. I assume that variable should be pointing to the OpenSSL directory that has the OpenSSL include directory, but that doesn't appear to work?

At any rate, this is the error I get. Any help would be most appreciated.

gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -g -I. -I./inc -I/usr/ssl/include -c -o obj/batch.o batch.c
Assembler messages:
Fatal error: can't create obj/batch.o: No such file or directory
In file included from batch.c:23:
fdsetsize.h:27:24: bits/types.h: No such file or directory
make: *** [obj/batch.o] Error 2
From: Richard B. <ric...@se...> - 2009-01-27 14:26:39
|
I am having an issue compiling curl-loader. I think the issue is with finding openssl. What directory am I supposed to point to? I edited 'openssldir.sh' and have pointed the OPENSSLDIR variable to a number of places, but can't seem to get it to where it wants to be. I assume that variable should be pointing to the OpenSSL directory that has the OpenSSL include directory?

At any rate, this is the error I get. Any help would be most appreciated.

gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -g -I. -I./inc -I/usr/ssl/include -c -o obj/batch.o batch.c
Assembler messages:
Fatal error: can't create obj/batch.o: No such file or directory
In file included from batch.c:23:
fdsetsize.h:27:24: bits/types.h: No such file or directory
make: *** [obj/batch.o] Error 2
From: Robert I. <cor...@gm...> - 2009-01-25 17:18:02
|
Hi David,

On Fri, Jan 23, 2009 at 8:24 PM, David Hotchkiss <dho...@is...> wrote:
> CURL-LOADER VERSION: 0.47, December 2, 2008
>
> QUESTION/ SUGGESTION/ PATCH:
> Is the observed behavior the intended outcome? Or should the expected request frequencies be something like this:
>
> URL0: 10 requests
> URL1: ~1 request(s)
> URL2: 10 requests
>
> for the second configuration?

Thank you for the PRF form filled in due course.

Your understanding of the intended behavior is correct. Our documentation does not describe the feature clearly enough.

What the user is supposed to do is to specify several URLs, each with FETCH_PROBABILITY, so that the sum of all the probabilities is 100.

For the outcome that you are expecting, it might be something like:

URL1: FETCH_PROBABILITY 45
URL2: FETCH_PROBABILITY 5
URL3: FETCH_PROBABILITY 45

I would measure them using more cycles, at least 100.

You can also see some usage examples in the conf-examples directory.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
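Applied to the three-URL test configuration in David's PRF (quoted in full in the message below), Robert's suggestion amounts to a URL section along these lines — a sketch only, using the 45/5/45 split from Robert's example and more cycles so the ratios have room to converge:

CYCLES_NUM=100            # at least 100 cycles, as Robert recommends

URL=http://localhost:8080/one.html
URL_SHORT_NAME="one"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200
FETCH_PROBABILITY=45

URL=http://localhost:8080/two.html
URL_SHORT_NAME="two"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200
FETCH_PROBABILITY=5

URL=http://localhost:8080/three.html
URL_SHORT_NAME="three"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200
FETCH_PROBABILITY=45      # the three probabilities sum to 100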
From: David H. <dho...@is...> - 2009-01-23 18:25:11
|
CURL-LOADER VERSION: 0.47, December 2, 2008

HW DETAILS: CPU/S and memory are must: 4 core CPU in single physical slot, 16636588 kB RAM

LINUX DISTRIBUTION and KERNEL (uname -r): 2.6.18-53.el5PAE

GCC VERSION (gcc -v): gcc version 4.1.2 20070626 (Red Hat 4.1.2-14)

COMPILATION AND MAKING OPTIONS (if defaults changed):

COMMAND-LINE: curl-loader -f test.conf

CONFIGURATION-FILE (The most common source of problems):
Place the file inline here:

=-=-=-=-= Configuration without FETCH_PROBABILITY: BEGIN =-=-=-=-=-=

########### GENERAL SECTION ################################
BATCH_NAME= test
CLIENTS_NUM_MAX=1
CLIENTS_NUM_START=1
INTERFACE=eth0
NETMASK=255.255.240.0
IP_ADDR_MIN= 10.56.1.1
IP_ADDR_MAX= 10.56.15.254
CYCLES_NUM=10
URLS_NUM=3

########### URL SECTION ####################################

URL=http://localhost:8080/one.html
URL_SHORT_NAME="one"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200

URL=http://localhost:8080/two.html
URL_SHORT_NAME="two"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200

URL=http://localhost:8080/three.html
URL_SHORT_NAME="three"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200

=-=-=-=-= Configuration without FETCH_PROBABILITY: END =-=-=-=-=-=

=-=-=-=-= Configuration with FETCH_PROBABILITY: BEGIN =-=-=-=-=-=

########### GENERAL SECTION ################################
BATCH_NAME= test
CLIENTS_NUM_MAX=1
CLIENTS_NUM_START=1
INTERFACE=eth0
NETMASK=255.255.240.0
IP_ADDR_MIN= 10.56.1.1
IP_ADDR_MAX= 10.56.15.254
CYCLES_NUM=10
URLS_NUM=3

########### URL SECTION ####################################

URL=http://localhost:8080/one.html
URL_SHORT_NAME="one"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200

URL=http://localhost:8080/two.html
URL_SHORT_NAME="two"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200
FETCH_PROBABILITY=10

URL=http://localhost:8080/three.html
URL_SHORT_NAME="three"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=200

=-=-=-=-= Configuration with FETCH_PROBABILITY: END =-=-=-=-=-=

DOES THE PROBLEM AFFECT:
COMPILATION? No
LINKING? No
EXECUTION? Yes
OTHER (please specify)?

Have you run $make cleanall prior to $make ? Yes

DESCRIPTION:
Using the FETCH_PROBABILITY tag has counter-intuitive side effects. When the configuration file runs a single client for ten cycles over three URLs and contains no FETCH_PROBABILITY tags, each URL is requested 10 times.

Final screen for the first configuration file:
Operations:        Success        Failed        Timed out
URL0:one           1   10         0   0         0   0
URL1:two           1   10         0   0         0   0
URL2:three         1   10         0   0         0   0

When a single FETCH_PROBABILITY with a value of "10" is introduced for any single URL subsection, the number of requests for all URLs is reduced, and the URL subsection containing the FETCH_PROBABILITY tag is requested more times relative to the subsections that do not contain a tag. In addition, the value of FETCH_PROBABILITY does not seem to predict the actual number of requests for the URL in that subgroup. For example, I would expect that CYCLES_NUM=10 and FETCH_PROBABILITY=10 would result in approximately 1 request; instead I often see 4-10.

Final screen for the second configuration file with FETCH_PROBABILITY=10 defined for "URL1:two":
Operations:        Success        Failed        Timed out
URL0:one           3   3          0   0         0   0
URL1:two           4   4          0   0         0   0
URL2:three         1   1          0   0         0   0

The same thing happens if the other subsections are tagged with FETCH_PROBABILITY=100.

QUESTION/ SUGGESTION/ PATCH:
Is the observed behavior the intended outcome? Or should the expected request frequencies be something like this:

URL0: 10 requests
URL1: ~1 request(s)
URL2: 10 requests

for the second configuration?
From: Robert I. <cor...@gm...> - 2009-01-20 05:00:30
|
Hi Thomas,

On Mon, Jan 19, 2009 at 11:40 PM, Thomas Tran <odi...@gm...> wrote:
> Is there a way to specify the number of CAPS to be used during the test run? I need to determine the performance impact of a third-party software package when it is installed on our server versus when it is not installed, so I need to make sure the number of CAPS remains the same on each test run.

There is currently no way to specify CAPS. You can see the CAPS in the statistics and in the screen output, but the loader does not enforce or maintain a CAPS target.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
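Although there is no direct CAPS setting, a roughly constant call rate can sometimes be approximated with a fixed number of clients and a per-URL sleep, since each client then issues about one request per sleep interval. The sketch below layers illustrative, assumed values onto the configuration shown in the message that follows; it is an approximation, not guaranteed behaviour, and holds only if the server answers well within the sleep interval:

# Approximation only: ~100 clients, each issuing one GET per ~1 second ≈ 100 CAPS.
CLIENTS_NUM_MAX=100
CLIENTS_NUM_START=100        # start all clients at once so the rate is steady from the beginning

URL=http://172.20.1.1/index.html
URL_SHORT_NAME="Homepage"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=1000   # one request per client per ~1 second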
From: Thomas T. <odi...@gm...> - 2009-01-19 22:49:07
|
Hi Guys,

Thank you so much for your hard work on curl-loader. I've got a quick question: is there a way to specify the number of CAPS to be used during the test run? I need to determine the performance impact of a third-party software package when it is installed on our server versus when it is not installed. Therefore, I need to make sure the number of CAPS remains the same on each test run.

Here is my configuration file:

########### GENERAL SECTION ################################
BATCH_NAME=100 Client Test
CLIENTS_NUM_MAX=100
CLIENTS_NUM_START=50
CLIENTS_RAMPUP_INC=10
INTERFACE =eth1
NETMASK=16
IP_ADDR_MIN=172.20.254.1
IP_ADDR_MAX=172.20.254.10
IP_SHARED_NUM=10
CYCLES_NUM=-1
URLS_NUM=1

########### URL SECTION ####################################
URL=http://172.20.1.1/index.html
URL_SHORT_NAME="Homepage"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION =0
TIMER_AFTER_URL_SLEEP =0
From: Robert I. <cor...@gm...> - 2009-01-18 20:33:25
|
Hi Frank,

On Wed, Jan 14, 2009 at 12:45 PM, <Fra...@ma...> wrote:
> Dear Robert,
>
> I changed the proxy string size to 1024, did some more tests and updated the docs.
> Please find the requested diff attached, plus a tar of the files I changed, just in case.

Patch applied with some minor changes. You have been added to our THANKS file.

Best wishes!

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
From: Robert I. <cor...@gm...> - 2009-01-15 17:17:37
|
Hi Frank,

On Wed, Jan 14, 2009 at 12:45 PM, <Fra...@ma...> wrote:
> Dear Robert,
>
> I changed the proxy string size to 1024, did some more tests and updated the docs.
> Please find the requested diff attached, plus a tar of the files I changed, just in case.

Thank you very much. Your patch looks great! We will integrate it next week.

> Also, many thanks for your work on providing such a useful tool!
> It came in very handy today to stress-test and tune our web server.

You are welcome!

> Cheers,
> Frank

Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................