curl-loader-devel Mailing List for curl-loader - web application testing (Page 22)
Status: Alpha
Brought to you by: coroberti
From: Todd C. <tc...@in...> - 2009-05-05 18:24:04

Hi,

The scenario is to send a GET request for a page and then send 2 POST requests per client IP, with 9 concurrent clients running. What I find is that not all the POST requests get sent before curl-loader exits. While running the test I capture the packets, and from the trace I find that for some of the clients the TCP connection is closed before the client is able to complete all its transactions. Please note that it is always the client that initiates the TCP connection close (sends the FIN packet).

I have the following questions:

1. Why does the client close the TCP connection before it completes all transactions? Is there a way to tell the client not to close the TCP connection until all of its transactions are complete?

2. Is there a way to tell the client not to close the TCP connection at all, except after a "timeout value"? In my case the server will initiate the close by sending the FIN packet; the purpose of the "timeout value" is that the client can initiate the connection close only after the timeout has passed.

Below is my curl-loader configuration:

############## GENERAL SECTION ######################
BATCH_NAME=AuthDHCP
CLIENTS_NUM_MAX=9
CLIENTS_NUM_START=9
CLIENTS_RAMPUP_INC=9
INTERFACE=eth1
NETMASK=24
IP_ADDR_MIN=10.34.30.111
IP_ADDR_MAX=10.34.30.119
USER_AGENT="Post Test"
URLS_NUM=3
CYCLES_NUM=1

################ URL SECTION ###########################
URL="http://10.35.1.171:443/cgi-bin/dispatcher.cgi"
URL_SHORT_NAME="GET.form"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=120000
TIMER_AFTER_URL_SLEEP=0
RESPONSE_STATUS_ERRORS="-404"

#######################################################
# The 'press continue'
#######################################################
URL=""
URL_USE_CURRENT=1
URL_SHORT_NAME="POST.continue"
REQUEST_TYPE=POST
FORM_USAGE_TYPE="AS_IS"
FORM_STRING=__action__=pre_auth_user
TIMER_URL_COMPLETION=120000
TIMER_AFTER_URL_SLEEP=0

#######################################################
# The 'login'
#######################################################
URL=""
URL_USE_CURRENT=1
URL_SHORT_NAME="POST.login"
USERNAME=test
PASSWORD=test
REQUEST_TYPE=POST
FORM_USAGE_TYPE=SINGLE_USER
FORM_STRING=username=%s&password=%s&__action__=auth_user
TIMER_URL_COMPLETION=120000
TIMER_AFTER_URL_SLEEP=0

Thank you for your help!
Todd
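On questions 1 and 2 above: libcurl keeps a connection alive across the consecutive URLs of a client by default, and curl-loader exposes a per-URL FRESH_CONNECT tag mapping to libcurl's CURLOPT_FRESH_CONNECT. Both the tag name and its default value are assumptions to be verified against the documentation shipped with your curl-loader version; a sketch for the 'press continue' URL:

```
# Sketch only - verify that FRESH_CONNECT is supported by your version.
URL=""
URL_USE_CURRENT=1
URL_SHORT_NAME="POST.continue"
REQUEST_TYPE=POST
FRESH_CONNECT=0   # 0 = try to reuse the connection opened by the previous URL
TIMER_URL_COMPLETION=120000
TIMER_AFTER_URL_SLEEP=0
```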
From: Robert I. <cor...@gm...> - 2009-05-03 13:53:31

Hi Todd,

On Thu, Apr 30, 2009 at 5:32 PM, Todd Chu <tc...@in...> wrote:
> After untarring curl-loader-0.48, I just did "make". Then I got the
> following error:
>
> /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In
> function `ssh_block2waitfor':
> ssh.c:(.text+0x23a): undefined reference to `libssh2_session_block_directions'
> /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In
> function `ssh_statemach_act':
> ssh.c:(.text+0x1de1): undefined reference to `libssh2_sftp_seek64'
>
> I am running Fedora 10 and I have libssh2-1.1 installed. I would
> appreciate it if someone could help me.

Normally, the Problem-Reporting-Form is expected.

Something is mismatched between the Fedora 10 libssh2 and the libcurl that we are building. Therefore, please try to modify the Makefile: in the LIBCURL target, instead of:

--without-libidn \
--disable-ldap \

place:

--without-libidn \
--without-libssh2 \
--disable-ldap \

Note that, since this is a target of a Makefile, every line is preceded by a tab. Then run:

$ make cleanall
$ make

--
Truly,
Robert Iakobashvili
From: Todd C. <tc...@in...> - 2009-04-30 14:32:59

Hi,

After untarring curl-loader-0.48, I just did "make". Then I got the following error:

make[3]: Entering directory `/home/tchu/Download/curl-loader-0.48/build/curl/src'
/bin/sh ../libtool --tag=CC --mode=link gcc -DCURL_MAX_WRITE_SIZE=4096 -g0 -O2 -Wno-system-headers -o curl main.o hugehelp.o urlglob.o writeout.o writeenv.o getpass.o homedir.o curlutil.o strtoofft.o strdup.o rawstr.o ../lib/libcurl.la -lz
libtool: link: gcc -DCURL_MAX_WRITE_SIZE=4096 -g0 -O2 -Wno-system-headers -o curl main.o hugehelp.o urlglob.o writeout.o writeenv.o getpass.o homedir.o curlutil.o strtoofft.o strdup.o rawstr.o /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a -lssh2 -lssl -lcrypto -lrt -lz
/home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In function `ssh_block2waitfor':
ssh.c:(.text+0x23a): undefined reference to `libssh2_session_block_directions'
/home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In function `ssh_statemach_act':
ssh.c:(.text+0x1de1): undefined reference to `libssh2_sftp_seek64'
ssh.c:(.text+0x2121): undefined reference to `libssh2_sftp_seek64'
ssh.c:(.text+0x2715): undefined reference to `libssh2_sftp_seek64'
/home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In function `ssh_easy_statemach':
ssh.c:(.text+0x422b): undefined reference to `libssh2_session_block_directions'
collect2: ld returned 1 exit status
make[3]: *** [curl] Error 1
make[3]: Leaving directory `/home/tchu/Download/curl-loader-0.48/build/curl/src'
make[2]: *** [install] Error 2
make[2]: Leaving directory `/home/tchu/Download/curl-loader-0.48/build/curl/src'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/home/tchu/Download/curl-loader-0.48/build/curl'
make: *** [lib/libcurl.a] Error 2

I am running Fedora 10 and I have libssh2-1.1 installed. I would appreciate it if someone could help me.

Regards,
Todd
From: Robert I. <cor...@gm...> - 2009-04-12 18:42:01

Gentlemen,

Several folks, namely Matt, Huy, Alex, and maybe more, have been asking for a way in curl-loader to post an XML file as the body of a POST request. In the meanwhile, a work-around proposed by Alex was recommended.

You may wish to try the recent fix for doing that, now in svn. Please check it out by:

svn co https://curl-loader.svn.sourceforge.net/svnroot/curl-loader curl-loader

cd to curl-loader/trunk/curl-loader and build it as usual by make.

An example configuration file is conf-examples/post-xml.conf, where the XML file itself resides in the same configuration directory, namely conf-examples/some.xml. The magic is now done by:

REQUEST_TYPE=POST
MULTIPART_FORM_DATA="file=@some.xml"

Your inputs will be very much appreciated.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
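Pulling the pieces of the message above together, a complete configuration in the spirit of conf-examples/post-xml.conf might look as follows. The GENERAL-section values and the target URL are placeholders, not taken from the actual example file; only REQUEST_TYPE and MULTIPART_FORM_DATA are confirmed by the message:

```
########### GENERAL SECTION
BATCH_NAME=post-xml
CLIENTS_NUM_MAX=1
INTERFACE=eth0
NETMASK=24
IP_ADDR_MIN=192.168.1.2
IP_ADDR_MAX=192.168.1.2
URLS_NUM=1
CYCLES_NUM=1

########### URL SECTION
URL=http://192.168.1.1/upload           # placeholder target
URL_SHORT_NAME="POST.xml"
REQUEST_TYPE=POST
MULTIPART_FORM_DATA="file=@some.xml"    # some.xml resides in the config directory
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0
```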
From: Robert I. <cor...@gm...> - 2009-03-29 15:38:14

Hi François,

On Wed, Mar 25, 2009 at 8:46 PM, François <fra...@gm...> wrote:
> Hi,
>
> Of course Robert, here comes the patch.
>
> Regards,
> Francois Pesce

You developed a very interesting and useful feature for many users. The development is done (and this is OK) in the curl-library area: http://curl.haxx.se/devel.html

My suggestion is to submit the patch to the great curl development list cur...@co...

Prior to submitting, please formulate the motivation and explain in detail the scenario that you are solving. Please also check whether there are other ways to configure the cookies, particularly here: http://curl.haxx.se/libcurl/c/curl_easy_setopt.html

If such an API exists, this can be done more easily; if it does not, you will probably be asked to make some options configurable, in order to preserve the current behavior for users who have no need of the behavior you propose.

My best wishes and success.

Sincerely,
Robert
From: François <fra...@gm...> - 2009-03-25 17:46:24

Hi,

Of course Robert, here comes the patch.

Regards,

*Francois Pesce*
Charles Kettering - "My interest is in the future because I am going to spend the rest of my life there."
From: Robert I. <cor...@gm...> - 2009-03-23 20:43:48

Hi François,

On Mon, Mar 23, 2009 at 5:07 PM, François <fra...@gm...> wrote:
> I am working on a SaaS product based on an HTTP proxy developed from
> scratch. Now, the QA team of my company needs two additional features
> in curl-loader:
> 1- The cookies must be managed per proxy-user, in order to simulate
> more "realistic" traffic.
> 2- A CSV output similar to the "siege" tool, in order to make
> curl-loader output parseable with the same script tools.

Nice hearing from you. How are you doing?

The first subject seems to be a rather common matter, whereas the logging format could be a matter of taste.

Could you please send me a patch dedicated to the cookies collection? Thank you.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: François <fra...@gm...> - 2009-03-23 15:08:05

Hi,

I am working on a SaaS product based on an HTTP proxy developed from scratch. In order to test it correctly, last year I submitted a bug fix related to the proxy authentication mechanism. Now, the QA team of my company needs two additional features in curl-loader:

1- The cookies must be managed per proxy-user, in order to simulate more "realistic" traffic. For the following configuration:

BATCH_NAME=get_post_login
CLIENTS_NUM_MAX=1
INTERFACE=lo
NETMASK=24
IP_ADDR_MIN=127.0.0.1
IP_ADDR_MAX=127.0.0.1
CYCLES_NUM=1
URLS_NUM=3

### Set a cookie for user test1
URL=http://www.example.com/cgi-bin/set-cookie?cookie=cookie_for_user1
PROXY_AUTH_METHOD="BASIC"
PROXY_AUTH_CREDENTIALS=test1:test1
URL_SHORT_NAME="SETCOOKIE1"
URL_DONT_CYCLE=1
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0

### Set a cookie for user test2
URL=http://www.example.com/cgi-bin/set-cookie?cookie=cookie_for_user2
PROXY_AUTH_METHOD="BASIC"
PROXY_AUTH_CREDENTIALS=test2:test2
URL_SHORT_NAME="SETCOOKIE2"
URL_DONT_CYCLE=1
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0

### Must replay cookies for user test1
URL=http://www.example.com/
PROXY_AUTH_METHOD="BASIC"
PROXY_AUTH_CREDENTIALS=test1:test1
URL_SHORT_NAME="GETCOOKIE1"
URL_DONT_CYCLE=1
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0

the current curl-loader behavior is to replace the cookies of user test1 with the cookies of user test2. The expected behavior is to keep the cookies of user test1 distinct from those of user test2.

2- A CSV output similar to the "siege" tool, so that curl-loader output can be parsed with the same script tools. The expected behavior is a new parameter added to the curl-loader command line (-a) that triggers a CSV output to the .log file of the following format:

get_post_login.SETCOOKIE1.proxyauth=test1,HTTP/1.1,200,0.081726,28,http://www.example.com/cgi-bin/set-cookie?cookie=cookie_for_user1,,2009-03-23 10:08:24
get_post_login.SETCOOKIE2.proxyauth=test2,HTTP/1.1,200,0.009584,28,http://www.example.com/cgi-bin/set-cookie?cookie=cookie_for_user2,,2009-03-23 10:08:24
get_post_login.GETCOOKIE1.proxyauth=test1,HTTP/1.1,200,0.020465,44,http://www.example.com/,,2009-03-23 10:08:24

The CSV format is of the following form:

test_name,HTTP version,HTTP status,total_time,recv_bytes,url,,formatted_time

Feel free to include these modifications in curl-loader if you find them useful.

Regards,

*Francois Pesce*
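The proposed CSV lines can be pulled apart with a few strchr() calls; below is a minimal sketch in C of a splitter for the field order described above (the function name and interface are illustrative, not part of any actual patch):

```c
#include <string.h>

/* Split one line of the proposed curl-loader CSV log in place.
 * The fixed field order, per the message above, is:
 *   test_name, HTTP version, HTTP status, total_time,
 *   recv_bytes, url, (empty), formatted_time
 * Returns the number of fields found. */
int split_csv_line(char *line, char *fields[], int max_fields)
{
    int n = 0;
    char *p = line;

    while (n < max_fields) {
        fields[n++] = p;              /* current field starts here */
        char *comma = strchr(p, ','); /* find the next separator   */
        if (comma == NULL)
            break;                    /* last field: no more commas */
        *comma = '\0';                /* terminate the current field */
        p = comma + 1;
    }
    return n;
}
```

Note that this naive split is only valid because none of the proposed fields may contain an embedded comma; a quoted-field CSV would need a real parser.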
From: Robert I. <cor...@gm...> - 2009-02-26 18:50:18

Hi Rick,

On Thu, Feb 26, 2009 at 6:22 PM, Richard Parker <rp...@ce...> wrote:
> Hello,
>
> Has anyone tested against a Windows box? (Specifically SharePoint, but still IIS)
> I am getting a lot of failed attempts as I try to run curl-loader.
> Here is a portion of my config:
>
> WEB_AUTH_METHOD=NTLM
> WEB_AUTH_CREDENTIALS=administrator:certeon
> TIMER_URL_COMPLETION=0        # In msec. When positive, enforced by cancelling the url fetch on timeout
> TIMER_AFTER_URL_SLEEP=500-2000

Changing the subject. NTLM authentication in libcurl:
- requires a special build; please refer to our documentation or to the libcurl documentation;
- has a limit of a single session, and various problems.

Therefore, we are not supporting NTLM now.

Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
From: Richard P. <rp...@ce...> - 2009-02-26 16:24:54

Just for clarification: by "failed attempts" I mean 401s.

--
Thank you,
Rick
From: Richard P. <rp...@ce...> - 2009-02-26 16:23:03

Hello,

Has anyone tested against a Windows box? (Specifically SharePoint, but still IIS)

I am getting a lot of failed attempts as I try to run curl-loader. Here is a portion of my config:

######### GENERAL SECTION
BATCH_NAME=vmoss-07
CLIENTS_NUM_MAX=2
CLIENTS_RAMPUP_INC=1
CLIENTS_NUM_START=1
INTERFACE=eth2
NETMASK=24
IP_ADDR_MIN=10.10.40.40
IP_ADDR_MAX=10.10.40.239
URLS_NUM=8
CYCLES_NUM=10
TIMER_TCP_CONN_SETUP=10
USER_AGENT=CURL-Loader

########### URL SECTION #######
### Login URL - cycling
# GET-part
URL=http://vmoss-07/Pages/Default.aspx
URL_SHORT_NAME="default page"
#URL_DONT_CYCLE=1
REQUEST_TYPE=GET
WEB_AUTH_METHOD=NTLM
WEB_AUTH_CREDENTIALS=administrator:password
TIMER_URL_COMPLETION=0        # In msec. When positive, enforced by cancelling the url fetch on timeout
TIMER_AFTER_URL_SLEEP=500-2000

### Cycling URL
URL=http://vmoss-07/Docs
URL_SHORT_NAME="Documents Dir"
REQUEST_TYPE=GET
WEB_AUTH_METHOD=NTLM
WEB_AUTH_CREDENTIALS=administrator:certeon
TIMER_URL_COMPLETION=0        # In msec. When positive, enforced by cancelling the url fetch on timeout
TIMER_AFTER_URL_SLEEP=500-2000

Thanks,
--
Thank you,
Rick
From: Robert I. <cor...@gm...> - 2009-02-25 20:12:56

Hi Richard,

On Wed, Feb 25, 2009 at 9:23 PM, Richard Parker <rp...@ce...> wrote:
> Just tested with gcc-4.1 and it works.....
>
> Thanks!!
>
> Thank you,
> Rick

Great! Well, we need to take a timeslot and rectify this issue with the gcc/glibc people.

Take care!

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Richard P. <rp...@ce...> - 2009-02-25 19:23:34

Just tested with gcc-4.1 and it works.....

Thanks!!

--
Thank you,
Rick
From: Richard P. <rp...@ce...> - 2009-02-25 19:14:30

I will try a different compiler first. Thank you very much for your help!!

Rick
From: Robert I. <cor...@gm...> - 2009-02-25 18:06:52

Hi Rick,

On Wed, Feb 25, 2009 at 8:20 PM, Rick Parker <rp...@ce...> wrote:
> read(3, "########### GENERAL SECTION\r\nBAT"..., 4096) = 602

Yep, fgets is completely broken. We'll need to re-write it and make our own function instead. Since we are very busy nowadays, please try another compiler or Linux distribution in the meanwhile. Sorry.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
Assistive technology that understands you
......................................................................
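Whichever component is at fault, one conventional fix on the parser side is to strip both \r and \n after every fgets() call, so that a configuration file saved with DOS line endings parses the same as a Unix one. A minimal sketch (the helper name is illustrative, not curl-loader code):

```c
#include <string.h>

/* Strip trailing CR/LF from a line just read by fgets(), so that a
 * config file with DOS (\r\n) line endings parses the same as one
 * with Unix (\n) endings. Returns the new line length. */
size_t chomp_line(char *line)
{
    size_t len = strlen(line);

    /* Peel off any trailing newline and carriage-return characters. */
    while (len > 0 && (line[len - 1] == '\n' || line[len - 1] == '\r'))
        line[--len] = '\0';
    return len;
}
```

Calling such a helper after every fgets() in the config parser would make the ":set ff=dos" experiment discussed below a non-issue.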
From: Rick P. <rp...@ce...> - 2009-02-25 18:02:10
|
Hello Robert, I did what you said and this is what I have in strace with the optimize read(3, "########### GENERAL SECTION\r\nBAT"..., 4096) = 602 read(3, "", 4096) = 0 close(3) = 0 munmap(0x7f4fec1ad000, 4096) = 0 write(2, "parse_config_file - error: faile"..., 63parse_config_file - error: failed to load even a single batch. ) = 63 write(2, "main - error: parse_config_file "..., 43main - error: parse_config_file () failed. ) = 43 exit_group(-1) = ? Process 6161 detached Here is what I received with the debug; brk(0x12b2000) = 0x12b2000 open("./10K.conf", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=602, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7fb3ef2df000 read(3, "########### GENERAL SECTION\r\nBAT"..., 4096) = 602 --- SIGSEGV (Segmentation fault) @ 0 (0) --- +++ killed by SIGSEGV +++ Process 6249 detached On Wed, 2009-02-25 at 12:45 -0500, Robert Iakobashvili wrote: > Hi Rick, > > On Wed, Feb 25, 2009 at 7:47 PM, Rick Parker <rp...@ce...> > wrote: > "Printed out strings?" The ones from the strace? or something > different? > I will get you whatever you would like for debugging. > I will attempt to recompile with a gcc that is <4.1. > I also did make a default "make" (which i thought was debug=1 > and > optimize =0) but it segfaulted.. (would you like to see that > strace?) > > Thank you very much Robert!! > Rick > > > read(3, "######### GENERAL SECTION\nBATCH_"..., 4096) = 610 > > The above string from your strace is resulting from fgets() reading. > fgets is normally reading line till the nearest end of line symbol > (\n, or \r\n) > > Here it is reading a lot (610) characters, which means several (if not > all) lines. > The parser is supposed to get from fgets input as line after line. > Here the correct behavior is completely broken. > The question who is the bad man: the compiler or the glibc? 
> > Try a one more thing - to place instead of all \n - \r\n > in vi > :set ff=dos > > and save the file :wq > > -- > Truly, > Robert Iakobashvili, Ph.D. > ...................................................................... > Assistive technology that understands you > ...................................................................... > No virus found in this incoming message. Checked by AVG - www.avg.com > Version: 8.0.237 / Virus Database: 270.11.3/1969 - Release Date: > 02/24/09 13:35:00 |
From: Robert I. <cor...@gm...> - 2009-02-25 17:45:07
|
Hi Rick, On Wed, Feb 25, 2009 at 7:47 PM, Rick Parker <rp...@ce...> wrote: > "Printed out strings?" The ones from the strace? or something different? > I will get you whatever you would like for debugging. > I will attempt to recompile with a gcc that is <4.1. > I also did make a default "make" (which i thought was debug=1 and > optimize =0) but it segfaulted.. (would you like to see that strace?) > > Thank you very much Robert!! > Rick > read(3, "######### GENERAL SECTION\nBATCH_"..., 4096) = 610 The above string from your strace is resulting from fgets() reading. fgets is normally reading line till the nearest end of line symbol (\n, or \r\n) Here it is reading a lot (610) characters, which means several (if not all) lines. The parser is supposed to get from fgets input as line after line. Here the correct behavior is completely broken. The question who is the bad man: the compiler or the glibc? Try one more thing - to place instead of all \n - \r\n in vi :set ff=dos and save the file :wq -- Truly, Robert Iakobashvili, Ph.D. ...................................................................... Assistive technology that understands you ...................................................................... |
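[Editor's note] The ":set ff=dos" workaround above converts the batch file to CRLF line endings. The same thing can be done outside of vi; the sketch below assumes GNU sed and uses the 10K.conf file name from this thread (the sample contents are illustrative):

```shell
# Stand-in batch file with Unix \n endings (substitute your real 10K.conf):
printf 'BATCH_NAME=10K\nCLIENTS_NUM_MAX=10000\n' > 10K.conf

# See which line endings the file currently has:
file 10K.conf   # plain "ASCII text" means \n; "with CRLF line terminators" means \r\n

# Append \r to every line -- the command-line equivalent of vi's ":set ff=dos"
# (GNU sed syntax; run it once, since re-running appends a second \r):
sed -i 's/$/\r/' 10K.conf

file 10K.conf   # should now report CRLF line terminators
```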
From: Rick P. <rp...@ce...> - 2009-02-25 17:29:53
|
Hello Robert, On Wed, 2009-02-25 at 12:18 -0500, Robert Iakobashvili wrote: > Hi Rick > > On Wed, Feb 25, 2009 at 7:24 PM, Rick Parker <rp...@ce...> > wrote: > Sorry about that... > Shoulda read the docs a little more.... > > > CURL-LOADER VERSION: 0.47, December 2, 2008 > > HW DETAILS: CPU/S and memory are must: > MEMORY > MemTotal: 3922660 kB > MemFree: 1658920 kB > > CPU > processor : 0 > vendor_id : GenuineIntel > cpu family : 6 > model : 23 > model name : Intel(R) Xeon(R) CPU E5430 @ > 2.66GHz > stepping : 6 > cpu MHz : 2659.984 > cache size : 6144 KB > physical id : 0 > siblings : 1 > core id : 0 > cpu cores : 1 > a > > LINUX DISTRIBUTION and KERNEL (uname -r): > 2.6.27-11-server > > GCC VERSION (gcc -v): > Place the file inline here: > Using built-in specs. > Target: x86_64-linux-gnu > Configured with: ../src/configure -v --with-pkgversion='Ubuntu > > gcc version 4.3.2 (Ubuntu 4.3.2-1ubuntu12) > > COMPILATION AND MAKING OPTIONS (if defaults changed): > make optimize=1 debug=0 > > COMMAND-LINE: > > CONFIGURATION-FILE (The most common source of problems): > ########### GENERAL SECTION ################################ > BATCH_NAME=10K > CLIENTS_NUM_MAX=10000 > CLIENTS_NUM_START=100 > CLIENTS_RAMPUP_INC=50 > INTERFACE=eth2 > NETMASK=24 > IP_ADDR_MIN=10.10.40.50 > IP_ADDR_MAX=10.10.40.150 > CYCLES_NUM=-1 > URLS_NUM=1 > > ########### URL SECTION #################################### > > URL=http://localhost/index.html > #URL=http://localhost/ACE-INSTALL.html > URL_SHORT_NAME="local-index" > REQUEST_TYPE=GET > TIMER_URL_COMPLETION = 5000 # In msec. When positive, Now > it is > enforced by cancelling url fetch on timeout > TIMER_AFTER_URL_SLEEP =20 > > > > DOES THE PROBLEM AFFECT: > COMPILATION? No > LINKING? No > EXECUTION? yes > OTHER (please specify)? > Have you run $make cleanall prior to $make ? Yes many times > > > DESCRIPTION: > I receive the following when trying to run the batch > parse_config_file - error: failed to load even a single batch. 
> main - error: parse_config_file () failed. > > > QUESTION/ SUGGESTION/ PATCH: > > > You are supposed to get printed out strings of your batch file, > at least the first one. > Could you see something? > > There is a rather weird issue with fgets () and gcc-4.3 or recent > glibc at 64-bit systems. Initially, we considered that as our own bug, > but later we have seen the issue at least in two other projects. > > As a work-around you may try an older compiler, like gcc-3.4, gcc-4.1 > or to compile curl-loader with debugging and without optimization. > $make cleanall > $make debug=1 optimize=0 "Printed out strings?" The ones from the strace? or something different? I will get you whatever you would like for debugging. I will attempt to recompile with a gcc that is <4.1. I also did make a default "make" (which i thought was debug=1 and optimize =0) but it segfaulted.. (would you like to see that strace?) Thank you very much Robert!! Rick > > -- > Truly, > Robert Iakobashvili, Ph.D. > ...................................................................... > Assistive technology that understands you > ...................................................................... > No virus found in this incoming message. Checked by AVG - www.avg.com > Version: 8.0.237 / Virus Database: 270.11.3/1969 - Release Date: > 02/24/09 13:35:00 |
From: Robert I. <cor...@gm...> - 2009-02-25 17:18:58
|
Hi Rick On Wed, Feb 25, 2009 at 7:24 PM, Rick Parker <rp...@ce...> wrote: > Sorry about that... > Shoulda read the docs a little more.... > > > CURL-LOADER VERSION: 0.47, December 2, 2008 > > HW DETAILS: CPU/S and memory are must: > MEMORY > MemTotal: 3922660 kB > MemFree: 1658920 kB > > CPU > processor : 0 > vendor_id : GenuineIntel > cpu family : 6 > model : 23 > model name : Intel(R) Xeon(R) CPU E5430 @ 2.66GHz > stepping : 6 > cpu MHz : 2659.984 > cache size : 6144 KB > physical id : 0 > siblings : 1 > core id : 0 > cpu cores : 1 > a > > LINUX DISTRIBUTION and KERNEL (uname -r): > 2.6.27-11-server > > GCC VERSION (gcc -v): > Place the file inline here: > Using built-in specs. > Target: x86_64-linux-gnu > Configured with: ../src/configure -v --with-pkgversion='Ubuntu > > gcc version 4.3.2 (Ubuntu 4.3.2-1ubuntu12) > > COMPILATION AND MAKING OPTIONS (if defaults changed): > make optimize=1 debug=0 > > COMMAND-LINE: > > CONFIGURATION-FILE (The most common source of problems): > ########### GENERAL SECTION ################################ > BATCH_NAME=10K > CLIENTS_NUM_MAX=10000 > CLIENTS_NUM_START=100 > CLIENTS_RAMPUP_INC=50 > INTERFACE=eth2 > NETMASK=24 > IP_ADDR_MIN=10.10.40.50 > IP_ADDR_MAX=10.10.40.150 > CYCLES_NUM=-1 > URLS_NUM=1 > > ########### URL SECTION #################################### > > URL=http://localhost/index.html > #URL=http://localhost/ACE-INSTALL.html > URL_SHORT_NAME="local-index" > REQUEST_TYPE=GET > TIMER_URL_COMPLETION = 5000 # In msec. When positive, Now it is > enforced by cancelling url fetch on timeout > TIMER_AFTER_URL_SLEEP =20 > > > > DOES THE PROBLEM AFFECT: > COMPILATION? No > LINKING? No > EXECUTION? yes > OTHER (please specify)? > Have you run $make cleanall prior to $make ? Yes many times > > > DESCRIPTION: > I receive the following when trying to run the batch > parse_config_file - error: failed to load even a single batch. > main - error: parse_config_file () failed. 
> > QUESTION/ SUGGESTION/ PATCH: > You are supposed to get printed out strings of your batch file, at least the first one. Could you see something? There is a rather weird issue with fgets () and gcc-4.3 or recent glibc at 64-bit systems. Initially, we considered that as our own bug, but later we have seen the issue at least in two other projects. As a work-around you may try an older compiler, like gcc-3.4, gcc-4.1 or to compile curl-loader with debugging and without optimization. $make cleanall $make debug=1 optimize=0 -- Truly, Robert Iakobashvili, Ph.D. ...................................................................... Assistive technology that understands you ...................................................................... |
From: Rick P. <rp...@ce...> - 2009-02-25 17:06:15
|
Sorry about that... Shoulda read the docs a little more.... CURL-LOADER VERSION: 0.47, December 2, 2008 HW DETAILS: CPU/S and memory are must: MEMORY MemTotal: 3922660 kB MemFree: 1658920 kB Buffers: 76260 kB Cached: 1613740 kB SwapCached: 0 kB Active: 159144 kB Inactive: 1587444 kB SwapTotal: 5445992 kB SwapFree: 5445992 kB Dirty: 0 kB Writeback: 0 kB AnonPages: 56588 kB Mapped: 17408 kB Slab: 163708 kB SReclaimable: 131804 kB SUnreclaim: 31904 kB PageTables: 4600 kB NFS_Unstable: 0 kB Bounce: 0 kB WritebackTmp: 0 kB CommitLimit: 7407320 kB Committed_AS: 320524 kB VmallocTotal: 34359738367 kB VmallocUsed: 19284 kB VmallocChunk: 34359719075 kB HugePages_Total: 0 HugePages_Free: 0 HugePages_Rsvd: 0 HugePages_Surp: 0 Hugepagesize: 2048 kB DirectMap4k: 8128 kB DirectMap2M: 6283264 kB CPU processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 23 model name : Intel(R) Xeon(R) CPU E5430 @ 2.66GHz stepping : 6 cpu MHz : 2659.984 cache size : 6144 KB physical id : 0 siblings : 1 core id : 0 cpu cores : 1 apicid : 0 initial apicid : 0 fpu : yes fpu_exception : yes cpuid level : 6 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ss ht syscall nx lm constant_tsc up rep_good nopl pni ssse3 cx16 sse4_1 lahf_lm bogomips : 5319.96 clflush size : 64 cache_alignment : 64 address sizes : 40 bits physical, 48 bits virtual power management: LINUX DISTRIBUTION and KERNEL (uname -r): 2.6.27-11-server GCC VERSION (gcc -v): Place the file inline here: Using built-in specs. 
Target: x86_64-linux-gnu Configured with: ../src/configure -v --with-pkgversion='Ubuntu 4.3.2-1ubuntu12' --with-bugurl=file:///usr/share/doc/gcc-4.3/README.Bugs --enable-languages=c,c++,fortran,objc,obj-c++ --prefix=/usr --enable-shared --with-system-zlib --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --enable-nls --with-gxx-include-dir=/usr/include/c++/4.3 --program-suffix=-4.3 --enable-clocale=gnu --enable-libstdcxx-debug --enable-objc-gc --enable-mpfr --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu Thread model: posix gcc version 4.3.2 (Ubuntu 4.3.2-1ubuntu12) COMPILATION AND MAKING OPTIONS (if defaults changed): make optimize=1 debug=0 COMMAND-LINE: CONFIGURATION-FILE (The most common source of problems): ########### GENERAL SECTION ################################ BATCH_NAME=10K CLIENTS_NUM_MAX=10000 CLIENTS_NUM_START=100 CLIENTS_RAMPUP_INC=50 INTERFACE=eth2 NETMASK=24 IP_ADDR_MIN=10.10.40.50 IP_ADDR_MAX=10.10.40.150 CYCLES_NUM=-1 URLS_NUM=1 ########### URL SECTION #################################### URL=http://localhost/index.html #URL=http://localhost/ACE-INSTALL.html URL_SHORT_NAME="local-index" REQUEST_TYPE=GET TIMER_URL_COMPLETION = 5000 # In msec. When positive, Now it is enforced by cancelling url fetch on timeout TIMER_AFTER_URL_SLEEP =20 DOES THE PROBLEM AFFECT: COMPILATION? No LINKING? No EXECUTION? yes OTHER (please specify)? Have you run $make cleanall prior to $make ? Yes many times DESCRIPTION: I receive the following when trying to run the batch parse_config_file - error: failed to load even a single batch. main - error: parse_config_file () failed. QUESTION/ SUGGESTION/ PATCH: On Wed, 2009-02-25 at 11:50 -0500, Robert Iakobashvili wrote: > Hi Richard > > On Wed, Feb 25, 2009 at 5:56 PM, Richard Parker <rp...@ce...> > wrote: > Hello, > > I did see that 0.47 was to fix the following problem with > Ubuntu 2.6.x kernels on 64 bit platform. 
> > Parse_config_file – error: failed to load even a single batch. > > Main – error: parse_config_file() failed. > > > > What other kind of information would you like? > > Thanks, > > Rick > > > > Please, post the duly filled PROBLEM-REPORTING-FORM (PRF) > from the docs. > > > > > -- > Truly, > Robert Iakobashvili, Ph.D. > ...................................................................... > Assistive technology that understands you > ...................................................................... > No virus found in this incoming message. Checked by AVG - www.avg.com > Version: 8.0.237 / Virus Database: 270.11.3/1969 - Release Date: > 02/24/09 13:35:00 |
From: Robert I. <cor...@gm...> - 2009-02-25 16:50:39
|
Hi Richard On Wed, Feb 25, 2009 at 5:56 PM, Richard Parker <rp...@ce...> wrote: > Hello, > > I did see that 0.47 was to fix the following problem with Ubuntu 2.6.x > kernels on 64 bit platform. > > Parse_config_file – error: failed to load even a single batch. > > Main – error: parse_config_file() failed. > > > > What other kind of information would you like? > > Thanks, > > Rick > Please, post the duly filled PROBLEM-REPORTING-FORM (PRF) from the docs. -- Truly, Robert Iakobashvili, Ph.D. ...................................................................... Assistive technology that understands you ...................................................................... |
From: Rick P. <rp...@ce...> - 2009-02-25 16:25:11
|
Here is the strace on the execution; set_tid_address(0x7f51f9b9d770) = 4718 set_robust_list(0x7f51f9b9d780, 0x18) = 0 futex(0x7fff01bab33c, 0x81 /* FUTEX_??? */, 1) = 0 rt_sigaction(SIGRTMIN, {0x7f51f9571660, [], SA_RESTORER|SA_SIGINFO, 0x7f51f957b0f0}, NULL, 8) = 0 rt_sigaction(SIGRT_1, {0x7f51f95716f0, [], SA_RESTORER|SA_RESTART| SA_SIGINFO, 0x7f51f957b0f0}, NULL, 8) = 0 rt_sigprocmask(SIG_UNBLOCK, [RTMIN RT_1], NULL, 8) = 0 getrlimit(RLIMIT_STACK, {rlim_cur=8192*1024, rlim_max=RLIM_INFINITY}) = 0 rt_sigaction(SIGPIPE, {SIG_IGN}, {SIG_DFL}, 8) = 0 geteuid() = 0 stat("curl-config.conf", {st_mode=S_IFREG|0644, st_size=610, ...}) = 0 brk(0) = 0x147c000 brk(0x149d000) = 0x149d000 open("curl-config.conf", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=610, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x7f51f9ba6000 read(3, "######### GENERAL SECTION\nBATCH_"..., 4096) = 610 read(3, "", 4096) = 0 close(3) = 0 munmap(0x7f51f9ba6000, 4096) = 0 write(2, "parse_config_file - error: faile"..., 63) = 63 write(2, "main - error: parse_config_file "..., 43) = 43 exit_group(-1 On Wed, 2009-02-25 at 10:56 -0500, Richard Parker wrote: > Hello, > > I did see that 0.47 was to fix the following problem with Ubuntu 2.6.x > kernels on 64 bit platform. > > Parse_config_file – error: failed to load even a single batch. > > Main – error: parse_config_file() failed. > > > > What other kind of information would you like? > > Thanks, > > Rick > > > > > > > No virus found in this incoming message. Checked by AVG - www.avg.com > Version: 8.0.237 / Virus Database: 270.11.3/1969 - Release Date: > 02/24/09 13:35:00 |
From: Richard P. <rp...@ce...> - 2009-02-25 16:09:18
|
Hello, I did see that 0.47 was to fix the following problem with Ubuntu 2.6.x kernels on 64 bit platform. Parse_config_file - error: failed to load even a single batch. Main - error: parse_config_file() failed. What other kind of information would you like? Thanks, Rick |
From: Robert I. <cor...@gm...> - 2009-02-24 20:36:57
|
Hi Aron, On Tue, Feb 24, 2009 at 10:28 PM, Bellorado, Aron <abe...@ve...> wrote: > To force the TCP connections not to be reused, I tried using the –r > option with version 0.47 during the curl-loader execution command and this > did not cause curl-loader to close the TCP connections after finishing the > transaction as I expected. Then, I tried including the ‘FRESH_CONNECT=1’ > option for each URL, and curl-loader did properly close the connection for > each transaction. With the connections not being re-used, the client IP > addresses were ramped up correctly. > > > > With this said, is there a way to use curl-loader to ramp up clients using > properly incrementing IP addresses without closing the TCP connection after > a transaction is complete? Thanks in advance for the help and prompt > response. > > > > Aron > > Sorry, it requires some heavy development in libcurl; therefore, I do not see it being fixed very soon. Take care! -- Truly, Robert Iakobashvili, Ph.D. ...................................................................... Assistive technology that understands you ...................................................................... |
From: Bellorado, A. <abe...@ve...> - 2009-02-24 20:28:21
|
To force the TCP connections not to be reused, I tried using the -r option with version 0.47 during the curl-loader execution command and this did not cause curl-loader to close the TCP connections after finishing the transaction as I expected. Then, I tried including the 'FRESH_CONNECT=1' option for each URL, and curl-loader did properly close the connection for each transaction. With the connections not being re-used, the client IP addresses were ramped up correctly. With this said, is there a way to use curl-loader to ramp up clients using properly incrementing IP addresses without closing the TCP connection after a transaction is complete? Thanks in advance for the help and prompt response. Aron ________________________________ From: Robert Iakobashvili [mailto:cor...@gm...] Sent: Tuesday, February 24, 2009 3:20 PM To: curl-loader-devel Subject: Re: Not ramping up IP addresses correctly when TCP connections remain open Hi Aron, On Tue, Feb 24, 2009 at 9:56 PM, Bellorado, Aron <abe...@ve...> wrote: I am using curl-loader 0.47 against an Apache web server running HTTP 1.1. With the Apache web server "KeepAlive" config parameter set to OFF, curl-loader properly ramps up the number of clients based on the specific batch file parameters as expected, with the Apache web server closing every connection after each transaction is made. The attached capture file "withKeepAliveOff.cap" shows this proper behavior using the curl-loader batch file shown below. When Apache was configured with "KeepAlive" set to ON, the web server does not close the TCP connection after a transaction is complete, leaving the connection open for the client for subsequent transactions. When curl-loader was run against an Apache web server with Keep Alive set to ON, curl-loader does not ramp up the clients correctly, not incrementing IP addresses appropriately during the ramp-up time. 
Curl-loader appears to be reusing existing client IP addresses (possibly because the connections are still open) instead of using incremented IP addresses, although the curl-loader log files show the IP addresses are being incremented properly. The attached capture file "withKeepAliveOn.cap" shows this incorrect behavior using the same curl-loader batch file shown below. Any help would be greatly appreciated. This behavior is inherited from libcurl. Try to configure the connections not to be re-used (there is a tag for that), but it will not necessarily be helpful for keeping KA connections with Apache. -- Truly, Robert Iakobashvili, Ph.D. ...................................................................... Assistive technology that understands you ...................................................................... |
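[Editor's note] For readers searching the archive: the per-URL tag that resolved Aron's problem goes in the URL section of the batch configuration file. A minimal sketch, with an illustrative URL and names rather than Aron's actual batch file:

```
########### URL SECTION ####################################
URL=http://localhost/index.html
URL_SHORT_NAME="local-index"
REQUEST_TYPE=GET
FRESH_CONNECT=1    # do not re-use the TCP connection; close it after each fetch
TIMER_AFTER_URL_SLEEP=20
```

With FRESH_CONNECT=1 the loader closes each connection itself, so keep-alive on the Apache side no longer prevents the client IP addresses from ramping up.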