curl-loader-devel Mailing List for curl-loader - web application testing (Page 21)
Status: Alpha. Brought to you by: coroberti
Archived messages per month:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2006 |     |     |     |     |     |     |     |     |     |     | 1   |     |
| 2007 |     | 1   | 7   | 19  | 25  | 16  | 59  | 29  | 18  | 19  | 7   | 29  |
| 2008 | 6   | 18  | 8   | 27  | 26  | 5   | 6   |     | 9   | 37  | 61  | 17  |
| 2009 | 21  | 25  | 4   | 2   | 8   | 15  | 18  | 23  | 10  | 16  | 14  | 22  |
| 2010 | 23  | 8   | 18  | 1   | 34  | 23  | 11  | 1   | 13  | 10  | 2   | 8   |
| 2011 |     | 7   | 24  | 12  | 3   | 2   | 2   |     | 5   | 20  | 7   | 11  |
| 2012 | 12  | 5   | 16  | 3   |     | 5   | 12  | 6   |     |     | 8   |     |
| 2013 | 1   | 3   | 5   | 3   | 1   |     | 1   | 2   | 9   |     | 8   | 4   |
| 2014 | 4   |     | 1   |     | 1   |     |     |     | 4   |     | 11  | 5   |
| 2015 | 1   |     | 11  | 3   | 1   | 1   | 4   | 1   | 7   | 4   | 2   |     |
| 2016 |     | 1   |     |     |     |     | 1   |     |     |     |     |     |
| 2018 |     |     |     |     | 1   |     |     |     |     |     |     |     |
| 2020 |     |     |     | 1   |     |     |     |     |     |     |     |     |
| 2022 | 1   |     |     |     |     |     |     |     |     |     |     |     |
From: Robert I. <cor...@gm...> - 2009-07-27 05:38:22

Hi Jan,

On Fri, Jul 24, 2009 at 4:25 PM, Jan Wielemaker <J.W...@cs...> wrote:
> Hi,
>
> Just trying my first steps into curl-loader :-) I looked through the FAQ
> and config files, but I couldn't find an answer on how to query a *lot*
> of random paths using a pool of - let's say - 20 clients.
>
> Just to bash the server, I collected a list of 6,000 paths that have
> been queried on the server recently and then I use the config below, but
> curl-loader only seems to use the first 20 paths of the file :-(
>
> Did I overlook something trivial?
> [...]

This is a rather recent feature developed by Gary, and it would be great if
Gary could comment.

Still, you are configuring 20 clients, which test 20 paths. Could it be that
6,000 clients will do the job? Give it a try. Just keep in mind that, by
default, you need about 35K of memory per client, and use a ramp-up of about
100 so as not to start them all at start-up.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
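Gary's template feature, referenced above, takes each request path from a
token file. As a rough sketch of Robert's suggestion applied to Jan's config
- one client per collected path, ramped up gradually instead of all at
start-up - the batch might look like the following. The 6000/100 values are
illustrative assumptions, not tested settings; at 35K per client, 6,000
clients would need roughly 210 MB of memory.

# Sketch only: client counts and ramp-up step are assumptions.
BATCH_NAME= bulk
# Hypothetical: one loading client per collected path
CLIENTS_NUM_MAX=6000
CLIENTS_NUM_START=100
# Assumed ramp-up step of about 100 clients, per the hint above
CLIENTS_RAMPUP_INC=100
INTERFACE=eth0
NETMASK=16
IP_ADDR_MIN= 192.168.1.1
IP_ADDR_MAX= 192.168.1.200
IP_SHARED_NUM=200
CYCLES_NUM= -1
URLS_NUM= 1

URL_TEMPLATE=http://localhost:3040%s
URL_TOKEN_FILE=paths
URL_SHORT_NAME="local-server"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 5000
TIMER_AFTER_URL_SLEEP = 0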
From: Val S. <Val...@no...> - 2009-07-24 18:25:46

I'm not an expert on curl-loader and I have not used the URL templates yet.
The way I do it is to literally generate all the URLs in the URL section and
set FETCH_PROBABILITY on each URL to a low value, e.g. 2, so it looks like
this:

########### GENERAL SECTION ################################
BATCH_NAME = random-picked-urls
CLIENTS_NUM_MAX = 35
INTERFACE = eth1
NETMASK = 255.255.0.0
IP_ADDR_MIN = 10.2.1.2
IP_ADDR_MAX = 10.2.1.201
CYCLES_NUM = -1
URLS_NUM = 5000

########### URL SECTION ####################################
URL = http://www.XYZ.NET/
URL_SHORT_NAME = "badurl1"
REQUEST_TYPE = GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
FETCH_PROBABILITY = 2
FETCH_PROBABILITY_ONCE = 0

URL = http://www.XYZ1.NET/
URL_SHORT_NAME = "badurl2"
REQUEST_TYPE = GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
FETCH_PROBABILITY = 2
FETCH_PROBABILITY_ONCE = 0
....

>-----Original Message-----
>From: Jan Wielemaker [mailto:J.W...@cs...]
>Sent: Friday, July 24, 2009 6:25 AM
>To: cur...@li...
>Subject: Query random paths?
>
>Hi,
>
>Just trying my first steps into curl-loader :-) I looked through the FAQ
>and config files, but I couldn't find an answer on how to query a *lot* of
>random paths using a pool of - let's say - 20 clients.
>
>Just to bash the server, I collected a list of 6,000 paths that have
>been queried on the server recently and then I use the config below, but
>curl-loader only seems to use the first 20 paths of the file :-(
>
>Did I overlook something trivial?
>[...]
From: Jan W. <J.W...@cs...> - 2009-07-24 15:08:50

Hi,

Just trying my first steps into curl-loader :-) I looked through the FAQ and
config files, but I couldn't find an answer on how to query a *lot* of
random paths using a pool of - let's say - 20 clients.

I see we have FORM_RECORDS_RANDOM, which seems to do the trick for form
data. I am trying to debug server crashes, where I know it is related to
concurrency, but otherwise I have little clue on whether or not it is
related to the actual query paths.

Just to bash the server, I collected a list of 6,000 paths that have been
queried on the server recently and then I use the config below, but
curl-loader only seems to use the first 20 paths of the file :-(

Did I overlook something trivial?

	Cheers --- Jan

================================================================
BATCH_NAME= bulk
CLIENTS_NUM_MAX=20
CLIENTS_NUM_START=20
CLIENTS_RAMPUP_INC=0
INTERFACE=eth0
NETMASK=16
IP_ADDR_MIN= 192.168.1.1
IP_ADDR_MAX= 192.168.1.200
IP_SHARED_NUM=200
CYCLES_NUM= -1
URLS_NUM= 1

URL_TEMPLATE=http://localhost:3040%s
URL_TOKEN_FILE=paths
URL_SHORT_NAME="local-server"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 5000
TIMER_AFTER_URL_SLEEP = 0
================================================================
From: Robert I. <cor...@gm...> - 2009-07-22 05:10:40

Dear Val,

On Wed, Jul 22, 2009 at 3:47 AM, Val Shkolnikov <Val...@no...> wrote:
> Hi,
>
> Here are some changes I made to the code to improve logging, mainly so
> that the .log file could be postprocessed to get more detailed statistics
> of the response time beyond the average and count. If interested, I can
> send the Python code that computes percentiles, average and std. dev from
> that. I redirected the voluminous opstats output into a file. Also added
> a couple of new options. The changes are summarized in the ChangeLog. I
> also updated the documentation. All the changes are against Rev 0.48.
>
> /Val

Thank you very much; we really appreciate it. The maintainers (both me and
Michael) will look into it.

Just for the future, it is much easier to review (and accept) changes when
several patches are provided, one patch per feature.

Please feel free to submit your script to this list, along with how to run
it and what it does.

Sincerely,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Robert I. <cor...@gm...> - 2009-07-07 07:43:51

Hi Leonard,

On Fri, Jun 26, 2009 at 7:36 PM, Tolentino, Leonard <leo...@di...> wrote:
> I am getting the following error when I issue the command make to install
> curl-loader:
>
> gcc -g -o curl-loader obj/batch.o obj/cl_alloc.o obj/client.o
> obj/conf.o obj/environment.o obj/heap.o obj/ip_secondary.o obj/loader.o
> obj/loader_fsm.o obj/loader_hyper.o obj/loader_smooth.o obj/mpool.o
> obj/parse_conf.o obj/screen.o obj/ssl_thr_lock.o obj/statistics.o
> obj/timer_queue.o obj/url.o -L./lib -L/usr//lib -ldl -lpthread -lrt -lcurl
> -levent -lz -lssl -lcrypto
>
> ./lib/libcurl.a(ssh.o): In function `ssh_statemach_act':
> ssh.c:(.text+0x71d): undefined reference to `libssh2_session_free'
> ssh.c:(.text+0x775): undefined reference to `libssh2_channel_free'

You might wish to build the latest release, 0.49, where libcurl is built
without support for SSH, and, therefore, the libssh2 library is not needed
any more.

Take care.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Robert I. <cor...@gm...> - 2009-07-05 14:37:19

Hi folks,

version 0.49, unstable, July 5, 2009, svn 561

* Corrected weirdness in parse_conf.c. Thanks to nobody/anonymous.

* Build of libcurl without support for SSH. Some platforms do not have the
  libssh2 library installed, and it breaks our build.

* Post an XML file as the body of a POST request. An example configuration
  file is conf-examples/post-xml.conf, where the XML file itself resides in
  the same configuration directory, namely conf-examples/some.xml.
  The magic is now done by:
  REQUEST_TYPE=POST
  MULTIPART_FORM_DATA="file=@some.xml"

* Advanced to libevent-1.4.11-stable, which fixes several nasty epoll bugs.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
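For illustration, a URL section using the new XML-posting tags might look
like the sketch below; the target host and path are placeholders, and
conf-examples/post-xml.conf remains the authoritative example shipped with
0.49.

# Sketch only: host, path and timer values are illustrative placeholders.
URL=http://localhost/xml-receiver
URL_SHORT_NAME="post-xml"
REQUEST_TYPE=POST
MULTIPART_FORM_DATA="file=@some.xml"
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0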
From: Syed T. A. <taq...@gm...> - 2009-06-26 08:31:33

Dear Sir,

I want to ask: can we retrieve a response and add dynamic data to the
subsequent request with this tool? That is, we send request 1 and get
response 1; depending on it, we add some dynamic cookie and send request 2.
Is that possible?

Here I want to retrieve the cookie, and depending on the cookie value I need
to add another cookie (it depends on the previous request). I have a website
like this that I need to test.

Waiting for your reply....

--
Best Regards

---TAQI-----
From: Syed T. A. <taq...@gm...> - 2009-06-25 18:58:51

My error is:

curl-7.19.4/tests/data/test30
curl-7.19.4/tests/data/test524
curl-7.19.4/tests/data/test626
[... tar listing of the curl-7.19.4 sources continues for hundreds of files,
interleaved with repeated warnings of the form
"tar: curl-7.19.4/tests/data: Cannot utime: Operation not permitted" ...]
tar: curl-7.19.4/m4: Cannot utime: Operation not permitted
curl-7.19.4/buildconf
tar: curl-7.19.4: Cannot utime: Operation not permitted
tar: Error exit delayed from previous errors
patching file include/curl/curl.h
Hunk #1 succeeded at 271 (offset 51 lines).
patching file lib/sendf.c
Hunk #1 succeeded at 259 (offset 1 line).
Hunk #2 succeeded at 713 with fuzz 1 (offset 195 lines).
mkdir -p /home/matti/Desktop/curl loader/curl-loader-0.48/build/curl;
cd /home/matti/Desktop/curl loader/curl-loader-0.48/build/curl;
../../packages/curl/configure --prefix=/home/matti/Desktop/curl loader/curl-loader-0.48/build/curl \
  --without-libidn \
  --disable-ldap \
  --enable-ipv6 \
  --enable-thread \
  --with-random=/dev/urandom \
  --with-ssl=/usr/include/openssl \
  --enable-shared=no \
  CFLAGS=" -g -DCURL_MAX_WRITE_SIZE=4096"
/bin/sh: ../../packages/curl/configure: not found
make: *** [lib/libcurl.a] Error 127
matti@matti5:~/Desktop/curl loader/curl-loader-0.48$

thank you....

On Thu, Jun 25, 2009 at 11:45 PM, Syed Taqi Ali <taq...@gm...> wrote:
> yes I am having these options...
>
> matti@matti5:~/Desktop/curl loader/curl-loader-0.48$ tar
> tar: You must specify one of the `-Acdtrux' options
> Try `tar --help' or `tar --usage' for more information.
> matti@matti5:~/Desktop/curl loader/curl-loader-0.48$ bzip2
> bzip2: I won't write compressed data to a terminal.
> bzip2: For help, type: `bzip2 --help'.
> matti@matti5:~/Desktop/curl loader/curl-loader-0.48$ bunzip2
> bunzip2: I won't read compressed data from a terminal.
> bunzip2: For help, type: `bunzip2 --help'.
> matti@matti5:~/Desktop/curl loader/curl-loader-0.48$
>
> On Thu, Jun 25, 2009 at 11:42 PM, Robert Iakobashvili <cor...@gm...> wrote:
>> Please, test that you have the following commands:
>> tar
>> bzip2
>> bunzip2
>> [...]

--
Best Regards

---TAQI-----
From: Syed T. A. <taq...@gm...> - 2009-06-25 18:16:06

yes I am having these options...

matti@matti5:~/Desktop/curl loader/curl-loader-0.48$ tar
tar: You must specify one of the `-Acdtrux' options
Try `tar --help' or `tar --usage' for more information.
matti@matti5:~/Desktop/curl loader/curl-loader-0.48$ bzip2
bzip2: I won't write compressed data to a terminal.
bzip2: For help, type: `bzip2 --help'.
matti@matti5:~/Desktop/curl loader/curl-loader-0.48$ bunzip2
bunzip2: I won't read compressed data from a terminal.
bunzip2: For help, type: `bunzip2 --help'.
matti@matti5:~/Desktop/curl loader/curl-loader-0.48$

On Thu, Jun 25, 2009 at 11:42 PM, Robert Iakobashvili <cor...@gm...> wrote:
> Hi,
>
> On Thu, Jun 25, 2009 at 8:59 PM, Syed Taqi Ali <taq...@gm...> wrote:
>> Sir,
>> I installed the patch command; now it's a different error...
>> [...]
>
> Please, test that you have the following commands:
> tar
> bzip2
> bunzip2
>
> --
> Truly,
> Robert Iakobashvili, Ph.D.

--
Best Regards

---TAQI-----
From: Robert I. <cor...@gm...> - 2009-06-25 18:13:07

Hi,

On Thu, Jun 25, 2009 at 8:59 PM, Syed Taqi Ali <taq...@gm...> wrote:
> Sir,
> I installed the patch command; now it's a different error...
>
> /bin/sh: ../../packages/curl/configure: not found
> make: *** [lib/libcurl.a] Error 127
> [...]

Please, test that you have the following commands:
tar
bzip2
bunzip2

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Robert I. <cor...@gm...> - 2009-06-25 18:08:06

Hi,

On Thu, Jun 25, 2009 at 8:59 PM, Syed Taqi Ali <taq...@gm...> wrote:
> Sir,
> I installed the patch command; now it's a different error...
>
> /bin/sh: ../../packages/curl/configure: not found
> make: *** [lib/libcurl.a] Error 127
> [...]

Try removing it, re-extracting the archive, and building from the very
beginning. Yet another option is that the error is somewhere higher in the
output and not shown here.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Syed T. A. <taq...@gm...> - 2009-06-25 17:59:18

Sir,

I installed the patch command; now it's a different error...

curl-7.19.4/buildconf
patching file include/curl/curl.h
Hunk #1 succeeded at 271 (offset 51 lines).
patching file lib/sendf.c
Hunk #1 succeeded at 259 (offset 1 line).
Hunk #2 succeeded at 713 with fuzz 1 (offset 195 lines).
mkdir -p /home/matti/Desktop/curl loader/curl-loader-0.48/build/curl;
cd /home/matti/Desktop/curl loader/curl-loader-0.48/build/curl;
../../packages/curl/configure --prefix=/home/matti/Desktop/curl loader/curl-loader-0.48/build/curl \
  --without-libidn \
  --disable-ldap \
  --enable-ipv6 \
  --enable-thread \
  --with-random=/dev/urandom \
  --with-ssl=/usr/include/openssl \
  --enable-shared=no \
  CFLAGS=" -g -DCURL_MAX_WRITE_SIZE=4096"
/bin/sh: ../../packages/curl/configure: not found
make: *** [lib/libcurl.a] Error 127

thanks in advance....

On Thu, Jun 25, 2009 at 11:12 PM, Robert Iakobashvili <cor...@gm...> wrote:
> Hi Syed Taqi Ali
>
> On Thu, Jun 25, 2009 at 8:22 PM, Syed Taqi Ali <taq...@gm...> wrote:
>> I have attached the problem reporting document; please go through it,
>> and reply as soon as possible...
>
> Please, install the command:
> patch
> [...]

--
Best Regards

---TAQI-----
From: Robert I. <cor...@gm...> - 2009-06-25 17:42:09

Hi Syed Taqi Ali,

On Thu, Jun 25, 2009 at 8:22 PM, Syed Taqi Ali <taq...@gm...> wrote:
> I have attached the problem reporting document; please go through it, and
> reply as soon as possible...

Please, install the command:
patch

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Robert I. <cor...@gm...> - 2009-06-16 20:41:01

Hi Val,

On Tue, Jun 16, 2009 at 11:25 PM, Val Shkolnikov <Val...@no...> wrote:
> Sorry if this has been posted before; this is my first use of the tool
> (pretty good!). I noticed that the number of clients spawned is off by
> one, i.e. if I ask for 150 clients, I get 149. Looking at the code I see
> that clients are counted from 1 but the limits are all checked as
> "< bctx->client_num_max". A check for "<= bctx->client_num_max" would be
> more appropriate.
>
> /Val Shkolnikov

Correct. If you can provide a patch against svn or the latest version, it
would be very much appreciated.

Thanks,

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Val S. <Val...@no...> - 2009-06-16 20:25:38

Sorry if this has been posted before; this is my first use of the tool
(pretty good!). I noticed that the number of clients spawned is off by one,
i.e. if I ask for 150 clients, I get 149. Looking at the code I see that
clients are counted from 1 but the limits are all checked as
"< bctx->client_num_max". A check for "<= bctx->client_num_max" would be
more appropriate.

/Val Shkolnikov
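The pattern Val describes reduces to a classic off-by-one. The short C sketch
below is illustrative only and is not the actual curl-loader source, though
the limit name mirrors bctx->client_num_max from the report:

#include <stdio.h>

int main(void)
{
    const int client_num_max = 150;   /* clients requested in the config */
    int spawned = 0;

    /* Clients are numbered from 1, so the strict comparison runs the
       body for clients 1..149 only - one client short of the request. */
    for (int client = 1; client < client_num_max; client++)
        spawned++;
    printf("strict '<'    : spawned %d of %d\n", spawned, client_num_max);

    /* With "<=", as Val suggests, clients 1..150 are all spawned. */
    spawned = 0;
    for (int client = 1; client <= client_num_max; client++)
        spawned++;
    printf("inclusive '<=': spawned %d of %d\n", spawned, client_num_max);
    return 0;
}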
From: Robert I. <cor...@gm...> - 2009-06-09 05:11:16

Hi Gregory,

On Mon, Jun 8, 2009 at 10:11 PM, Greg Patmore <gr...@sl...> wrote:
> When I've run the test with as many as 3,000 clients in the configuration
> I sent, I was getting very good response times in the test; however, when
> I tried to click around the site in a browser I was getting much worse
> results.

Browsers naturally represent the user side of the experience better, since
they fetch all the images on a page (unless you placed your images
separately in your URLs), and in parallel. For user experience, you are
better off relying on your experience with a browser run in parallel.

> But, am I correct in my assumption of what it's actually doing? Ramping
> up to 1000 users will start sending 1000 requests to each url in the list
> when it hits the max clients?

Since these are the TCP and HTTP protocols, it also depends on your network
and server resources. You can see the number of requests in the on-screen
statistics: the summary and the recent time interval (3 seconds by default).
Besides that, you see the CAPS. If a server does not have enough resources
to respond fast, it will be slower. What is for sure is that it mimics 1000
users trying to deal with your server.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Greg P. <gr...@sl...> - 2009-06-08 19:11:29

>> Could you clarify a bit more: what do you have, and what do you wish?
>> Thanks!

If I understand you correctly, you want to know the target load I'm looking
to test? Really, we see about 6,000-10,000 concurrent users browsing around
the site at peak time. I would like to start my testing at about 1000 users
just clicking through a set of 8 urls, one page every few seconds, just to
get a sense of the query-caching/opcode-caching performance.

When I've run the test with as many as 3,000 clients in the configuration I
sent, I was getting very good response times in the test; however, when I
tried to click around the site in a browser I was getting much worse
results.

>> If you wish to have some peaks in your load, you may place not a random,
>> but some rigid time in TIMER_AFTER_URL_SLEEP=

I did have this at first; I was just thinking that a small random timer
between requests would be more realistic (and wouldn't hurt my cluster of
web servers as much ;) )

But, am I correct in my assumption of what it's actually doing? Ramping up
to 1000 users will start sending 1000 requests to each url in the list when
it hits the max clients?

I'm sorry if I'm just not asking the right questions. Thanks for your quick
response and help so far.

Regards,

Gregory Patmore
Systems Architect
Slingo Inc.
411 Hackensack Ave., Hackensack, NJ 07601
(P) 201.489.6727 - (F) 201.489.6728
http://www.slingo.com

________________________________
From: Robert Iakobashvili [mailto:cor...@gm...]
Sent: Monday, June 08, 2009 2:56 PM
To: curl-loader-devel
Subject: Re: Best way to mimic the load we get normally.
[...]
From: Robert I. <cor...@gm...> - 2009-06-08 18:57:22

Hi Gregory,

On Mon, Jun 8, 2009 at 9:45 PM, Greg Patmore <gr...@sl...> wrote:
> The urls are all configured like so:
>
> URL=http://mydomain.com/firstpage.html
> URL_SHORT_NAME="firstpage"
> REQUEST_TYPE=GET
> TIMER_URL_COMPLETION=0
> TIMER_AFTER_URL_SLEEP=0-5000
> ... etc ...
>
> And the command I'm running the test with is:
> ./curl-loader -f ./conf-examples/1k-clients.conf -v -u -t 2
>
> So when I run the test sometimes it seems that it's sending the number of
> client requests to each url at the same time, which for 8 URL entries
> would equate to around 8000 requests per second once it hits the max.
> What I'm trying to do is show a load of 1000 users on the site at the
> same time, clicking to a new page every few seconds. Am I doing it wrong?

Could you clarify a bit more: what do you have, and what do you wish?
Thanks!

If you wish to have some peaks in your load, you may place not a random, but
some rigid time in TIMER_AFTER_URL_SLEEP=

If you wish to increase the element of randomness in clicking pages, look at
FETCH_PROBABILITY.

> Also, is it more realistic if I use the -r switch?

No.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
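Combining the two knobs Robert mentions, a single URL entry with a rigid
think time and a reduced fetch probability could be sketched as follows; the
URL is Greg's placeholder domain, and the timer and probability values are
illustrative assumptions, not tested settings:

# Sketch only: values below are illustrative placeholders.
URL=http://mydomain.com/firstpage.html
URL_SHORT_NAME="firstpage"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
# Rigid 3-second pause instead of the random 0-5000 ms range
TIMER_AFTER_URL_SLEEP=3000
# Assumed: fetch this page on about half of the cycles
FETCH_PROBABILITY=50
FETCH_PROBABILITY_ONCE=0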
From: Greg P. <gr...@sl...> - 2009-06-08 18:46:14

Hi all,

Wondering if anyone can give me some tips on the best way to mimic a load
that we see around peak time. I'm trying to mimic the load we would see if
we had 1000 users perusing a few pages on our site. I have a conf file with
the following general section:

BATCH_NAME= 1k-clients
CLIENTS_NUM_MAX=1000
CLIENTS_NUM_START=5
CLIENTS_RAMPUP_INC=3
INTERFACE=eth2
NETMASK=32
IP_ADDR_MIN=192.168.1.189
IP_ADDR_MAX=192.168.1.189
IP_SHARED_NUM=1
CYCLES_NUM=-1
URLS_NUM=8

The urls are all configured like so:

URL=http://mydomain.com/firstpage.html
URL_SHORT_NAME="firstpage"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=0-5000
... etc ...

And the command I'm running the test with is:

./curl-loader -f ./conf-examples/1k-clients.conf -v -u -t 2

So when I run the test sometimes it seems that it's sending the number of
client requests to each url at the same time, which for 8 URL entries would
equate to around 8000 requests per second once it hits the max. What I'm
trying to do is show a load of 1000 users on the site at the same time,
clicking to a new page every few seconds. Am I doing it wrong?

Also, is it more realistic if I use the -r switch?

I appreciate any help or feedback on this.

Regards,

Gregory Patmore
Systems Architect
Slingo Inc.
411 Hackensack Ave., Hackensack, NJ 07601
(P) 201.489.6727 - (F) 201.489.6728
http://www.slingo.com
From: Support <su...@ta...> - 2009-05-22 10:42:39

Hi,

Thanks for your suggestion. I was using 0.44 previously. I have downloaded
the latest version, 0.48, and found the feature; I will try it. By the way,
are there any stability issues in 0.48 (as curl-loader-stable =
curl-loader-0.44)?

Best Regards,
Andy NG

-----Original message-----
From: Robert Iakobashvili <cor...@gm...>
Date: Fri, 22 May 2009 17:59:23 +0800
To: curl-loader-devel <cur...@li...>
Subject: Re: How to enable 1000 http client downloading 1000 file at the same time in curl-loader

> Dear Andy,
>
> On Fri, May 22, 2009 at 10:49 AM, Support wrote:
>> Dear Sir,
>>
>> I would like to use curl-loader to simulate a scenario like "1000 http
>> test clients download 1000 different files via http at the same time".
>>
>> I can define 1000 different URLs in the config file, like
>> "URL=http://server_ip/fileN", where N = 1 ... 1000.
>>
>> However, if I use 1000 different URLs in the config file, all clients
>> download file1 at the same time; after file1 is downloaded completely,
>> all clients download file2, and so on.
>>
>> How can I make all clients download different files at the same time?
>> Please help. Thank you.
>>
>> Best Regards,
>> Andy Ng
>
> Please, mind to fill and post your PRF (Problem Reporting Form).
>
> What you need is to use a feature developed by Gary, namely, the template
> URL. A good example is in the file ./conf-examples/url-template-fr-file.conf,
> and more details can be learned in the README. You need static tokens
> taken from a file.
>
> Take care and have a good day!
>
> --
> Truly,
> Robert Iakobashvili, Ph.D.
> ......................................................................
> www.ghotit.com
> Assistive technology that understands you
> ......................................................................
From: Robert I. <cor...@gm...> - 2009-05-22 10:27:44

Hi Andy,

On Fri, May 22, 2009 at 10:49 AM, Support <su...@ta...> wrote:
> However, if I use 1000 different URLs in the config file, all clients
> download file1 at the same time; after file1 is downloaded completely,
> all clients download file2, and so on.
>
> How can I make all clients download different files at the same time?
> Please help. Thank you.

Note that on the first cycle they do it rather synchronously, but further on
each client progresses independently.

To enhance that, you can use TIMER_AFTER_URL_SLEEP with a random value, e.g.
TIMER_AFTER_URL_SLEEP=0-5000, and keep TIMER_URL_COMPLETION at a rather
large value.

At a certain stage, let's say after several hundred cycles, the clients are
supposed to work with your URLs in a rather random fashion.

Take care.

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: Robert I. <cor...@gm...> - 2009-05-22 09:59:29

Dear Andy,

On Fri, May 22, 2009 at 10:49 AM, Support <su...@ta...> wrote:
> Dear Sir,
>
> I would like to use curl-loader to simulate a scenario like "1000 http
> test clients download 1000 different files via http at the same time".
>
> I can define 1000 different URLs in the config file, like
> "URL=http://server_ip/fileN", where N = 1 ... 1000.
>
> However, if I use 1000 different URLs in the config file, all clients
> download file1 at the same time; after file1 is downloaded completely,
> all clients download file2, and so on.
>
> How can I make all clients download different files at the same time?
> Please help. Thank you.
>
> Best Regards,
> Andy Ng

Please, mind to fill and post your PRF (Problem Reporting Form).

What you need is to use a feature developed by Gary, namely, the template
URL. A good example is in the file ./conf-examples/url-template-fr-file.conf,
and more details can be learned in the README. You need static tokens taken
from a file.

Take care and have a good day!

--
Truly,
Robert Iakobashvili, Ph.D.
......................................................................
www.ghotit.com
Assistive technology that understands you
......................................................................
From: 邵利铮 <sh...@ne...> - 2009-05-22 09:05:22

Hi,

It is a good idea. However, as we know, curl-loader is just a capability
test tool. Maybe you could try to modify the code of curl-loader:

"for (i=0; i< CLIENTS_NUM_MAX; i++)" --> "for (i=0; i< URLS_NUM; i++)"

maybe.

Best Regards,
Shaolzh

From: Support
To: cur...@li...
Sent: Friday, May 22, 2009 3:49 PM
Subject: How to enable 1000 http client downloading 1000 file at the same time in curl-loader

> Dear Sir,
>
> I would like to use curl-loader to simulate a scenario like "1000 http
> test clients download 1000 different files via http at the same time".
>
> I can define 1000 different URLs in the config file, like
> "URL=http://server_ip/fileN", where N = 1 ... 1000.
>
> However, if I use 1000 different URLs in the config file, all clients
> download file1 at the same time; after file1 is downloaded completely,
> all clients download file2, and so on.
>
> How can I make all clients download different files at the same time?
> Please help. Thank you.
>
> Best Regards,
> Andy Ng
From: Support <su...@ta...> - 2009-05-22 08:03:52

Dear Sir,

I would like to use curl-loader to simulate a scenario like "1000 http test
clients download 1000 different files via http at the same time".

I can define 1000 different URLs in the config file, like
"URL=http://server_ip/fileN", where N = 1 ... 1000.

However, if I use 1000 different URLs in the config file, all clients
download file1 at the same time; after file1 is downloaded completely, all
clients download file2, and so on.

How can I make all clients download different files at the same time?
Please help. Thank you.

Best Regards,
Andy Ng
From: Todd C. <tc...@in...> - 2009-05-05 18:34:58

Your suggestion works.

Thanks,
Todd

________________________________
From: Robert Iakobashvili [mailto:cor...@gm...]
Sent: Sunday, May 03, 2009 6:53 AM
To: curl-loader-devel
Subject: Re: Compiling error: ssh.c:(.text+0x2121): undefined reference to `libssh2_sftp_seek64'

Hi Todd,

On Thu, Apr 30, 2009 at 5:32 PM, Todd Chu <tc...@in...> wrote:
> After untarring curl-loader-0.48, I just ran "make". Then I got the
> following error:
>
> make[3]: Entering directory `/home/tchu/Download/curl-loader-0.48/build/curl/src'
> /bin/sh ../libtool --tag=CC --mode=link gcc -DCURL_MAX_WRITE_SIZE=4096
> -g0 -O2 -Wno-system-headers -o curl main.o hugehelp.o urlglob.o
> writeout.o writeenv.o getpass.o homedir.o curlutil.o strtoofft.o strdup.o
> rawstr.o ../lib/libcurl.la -lz
> libtool: link: gcc -DCURL_MAX_WRITE_SIZE=4096 -g0 -O2 -Wno-system-headers
> -o curl main.o hugehelp.o urlglob.o writeout.o writeenv.o getpass.o
> homedir.o curlutil.o strtoofft.o strdup.o rawstr.o
> /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a -lssh2
> -lssl -lcrypto -lrt -lz
> /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In function `ssh_block2waitfor':
> ssh.c:(.text+0x23a): undefined reference to `libssh2_session_block_directions'
> /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In function `ssh_statemach_act':
> ssh.c:(.text+0x1de1): undefined reference to `libssh2_sftp_seek64'
> ssh.c:(.text+0x2121): undefined reference to `libssh2_sftp_seek64'
> ssh.c:(.text+0x2715): undefined reference to `libssh2_sftp_seek64'
> /home/tchu/Download/curl-loader-0.48/build/curl/lib/libcurl.a(ssh.o): In function `ssh_easy_statemach':
> ssh.c:(.text+0x422b): undefined reference to `libssh2_session_block_directions'
>
> I am running Fedora 10 and I have libssh2-1.1 installed. I would
> appreciate it if someone could help me.
>
> Regards,
> Todd

Normally, a Problem-Reporting-Form is expected.

Something is mismatched between the Fedora-10 libssh2 and the libcurl that
we are building. Therefore, please try to modify the Makefile, target
LIBCURL, and instead of:

	--without-libidn \
	--disable-ldap \

place:

	--without-libidn \
	--without-libssh2 \
	--disable-ldap \

Note that, since this is a target of a Makefile, every line is preceded by a
tabulation. Further, run:

$ make cleanall
$ make

--
Truly,
Robert Iakobashvili