curl-loader-devel Mailing List for curl-loader - web application testing (Page 11)
Status: Alpha
Brought to you by: coroberti
Messages by month:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2006 |     |     |     |     |     |     |     |     |     |     | 1   |     |
| 2007 |     | 1   | 7   | 19  | 25  | 16  | 59  | 29  | 18  | 19  | 7   | 29  |
| 2008 | 6   | 18  | 8   | 27  | 26  | 5   | 6   |     | 9   | 37  | 61  | 17  |
| 2009 | 21  | 25  | 4   | 2   | 8   | 15  | 18  | 23  | 10  | 16  | 14  | 22  |
| 2010 | 23  | 8   | 18  | 1   | 34  | 23  | 11  | 1   | 13  | 10  | 2   | 8   |
| 2011 |     | 7   | 24  | 12  | 3   | 2   | 2   |     | 5   | 20  | 7   | 11  |
| 2012 | 12  | 5   | 16  | 3   |     | 5   | 12  | 6   |     |     | 8   |     |
| 2013 | 1   | 3   | 5   | 3   | 1   |     | 1   | 2   | 9   |     | 8   | 4   |
| 2014 | 4   |     | 1   |     | 1   |     |     |     | 4   |     | 11  | 5   |
| 2015 | 1   |     | 11  | 3   | 1   | 1   | 4   | 1   | 7   | 4   | 2   |     |
| 2016 |     | 1   |     |     |     |     | 1   |     |     |     |     |     |
| 2018 |     |     |     |     | 1   |     |     |     |     |     |     |     |
| 2020 |     |     |     | 1   |     |     |     |     |     |     |     |     |
| 2022 | 1   |     |     |     |     |     |     |     |     |     |     |     |
From: Edy H. <edw...@gm...> - 2011-03-01 11:18:24
|
Hi All, I'm a newbie to curl-loader; hopefully I'm targeting the right list. After performing a few tests, I found it very useful when trying to load a specific page; however, I was not able to simulate a full browser web page download. I'm currently working on a project that includes an HTTP proxy, supposed to transparently spoof HTTP traffic targeted to real Internet sites. Therefore, I was looking for a tool to load the proxy by accessing several Internet sites (yahoo.com, cnn.com, intranet ones, etc.) and fetching all of their main page content. When using curl-loader, it managed to download the initial page but not its linked objects (objects like images were not retrieved afterwards). I was wondering if there is a way to make that happen, or if any of you know an alternative tool. Thanks, Edy.
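curl-loader does not parse the fetched HTML, so it will not discover and download a page's embedded objects by itself. One way to approximate a browser-style page load through such a proxy is to list the page and its main objects as separate URL sections of a single batch. A minimal sketch along those lines follows; the host name, object names and client numbers are placeholders, not values from this thread, and only tags that appear elsewhere on this list are used.

----------------------------------
########### GENERAL SECTION ################################
BATCH_NAME= proxy-page
CLIENTS_NUM_MAX= 500
CLIENTS_NUM_START= 50
CLIENTS_RAMPUP_INC= 50
INTERFACE= eth0
NETMASK= 16
IP_ADDR_MIN= 192.168.1.1
IP_ADDR_MAX= 192.168.53.255
CYCLES_NUM= -1
URLS_NUM= 3

########### URL SECTION ####################################
# The page itself, then each embedded object listed explicitly
URL= http://www.example.com/index.html
URL_SHORT_NAME="page"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 5000
TIMER_AFTER_URL_SLEEP = 0

URL= http://www.example.com/main.css
URL_SHORT_NAME="css"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 5000
TIMER_AFTER_URL_SLEEP = 0

URL= http://www.example.com/logo.png
URL_SHORT_NAME="logo"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 5000
TIMER_AFTER_URL_SLEEP = 1000
----------------------------------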
From: Robert I. <cor...@gm...> - 2011-02-28 05:24:17
|
Hi Yossi, On Mon, Feb 28, 2011 at 1:02 AM, Yossi Shaul <y1...@gm...> wrote: > >>> 1. When the tag FORM_STRING is used to read data from file, can you read >>> more than 2 variables (credentials) from the file? >>> >> >> Yes >> > Yossi: > The third varaible and on read null from the file. Here is my config line > followed by the request log: > FORM_STRING =email=%s&firstname=%s&lastname=%s&address=%s > > POST /api/users/create?authkey=0v9iq9tm8l4eug1k1cktnsk0u5& HTTP/1.1 > Authorization: Basic dGVzdGVyOmFwcHNkcmVhbQ== > User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0) > Accept: */* > Content-Length: 69 > Content-Type: ap... > > email=sh...@gm...&firstname=test&lastname=(null)&address=(null) > > 180 0 0 1 !! OK 200 > HTTP/1.1 200 OK > > external file looks like this: > # Separator used here is ',' > sh...@gm...,test,last,address > > Sounds like a good option to fix it or to add it and provide a patch to this list. Thanks. Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Yossi S. <y1...@gm...> - 2011-02-27 23:02:29
|
Thanks Robert, Point 3 and 4 work nicely with FORM_STRING. Regarding point 1, see the comment below. Thanks, Yossi On Sun, Feb 27, 2011 at 6:31 PM, Robert Iakobashvili <cor...@gm...>wrote: > Hi Yossi, > > On Sun, Feb 27, 2011 at 6:08 PM, Yossi Shaul <y1...@gm...> wrote: > >> >> 1. When the tag FORM_STRING is used to read data from file, can you read >> more than 2 variables (credentials) from the file? >> > > Yes > Yossi: The third varaible and on read null from the file. Here is my config line followed by the request log: FORM_STRING =email=%s&firstname=%s&lastname=%s&address=%s POST /api/users/create?authkey=0v9iq9tm8l4eug1k1cktnsk0u5& HTTP/1.1 Authorization: Basic dGVzdGVyOmFwcHNkcmVhbQ== User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0) Accept: */* Content-Length: 69 Content-Type: ap... email=sh...@gm...&firstname=test&lastname=(null)&address=(null) 180 0 0 1 !! OK 200 HTTP/1.1 200 OK external file looks like this: # Separator used here is ',' sh...@gm...,test,last,address > >> 2. Can MULTIPART_FORM_DATA tag get data from external file? >> > > No, but it is easy to add, simple C-prog. > > > >> 3. Can MULTIPART_FORM_DATA/FORM_STRING tag be used with PUT request? >> > > REQUEST_TYPE - HTTP request method to be chosen from GET, POST or PUT. > > Have you tried? > > >> 4. Is there a way to get paramter value for a GET request from external >> file? We need to construct a url request that looks like: >> >> http://www.example.com/test/authenticate?username=VALUE1&password=VALUE2&countryid=VALUE3 >> where VALUE1, VALUE2, and VALUE3 need to be imported from external file. >> > > > http://curl-loader.svn.sourceforge.net/viewvc/curl-loader/trunk/curl-loader/conf-examples/get-forms.conf > > and try FORM_USAGE_TYPE= RECORDS_FROM_FILE > > As a general note - it's a simple C-written open-source > wrapper around libcurl. > > http://curl-loader.sourceforge.net/doc/faq.html#tag-description > > > <http://curl-loader.svn.sourceforge.net/viewvc/curl-loader/trunk/curl-loader/conf-examples/get-forms.conf> > > >> >> >> Thanks, >> Yossi >> > > > >> >> >> >> >> ------------------------------------------------------------------------------ >> Free Software Download: Index, Search & Analyze Logs and other IT data in >> Real-Time with Splunk. Collect, index and harness all the fast moving IT >> data >> generated by your applications, servers and devices whether physical, >> virtual >> or in the cloud. Deliver compliance at lower cost and gain new business >> insights. http://p.sf.net/sfu/splunk-dev2dev >> _______________________________________________ >> curl-loader-devel mailing list >> cur...@li... >> https://lists.sourceforge.net/lists/listinfo/curl-loader-devel >> >> > > > -- > Regards, > Robert Iakobashvili, Ph.D. > > Home: http://www.ghotit.com > Blog: http://dyslexia-blog.ghotit.com > Twitter: http://twitter.com/ghotit > Facebook: http://facebook.com/ghotit > ...................................................................... > Ghotit Dyslexia > Assistive technology that understands you > ...................................................................... > > > > ------------------------------------------------------------------------------ > Free Software Download: Index, Search & Analyze Logs and other IT data in > Real-Time with Splunk. Collect, index and harness all the fast moving IT > data > generated by your applications, servers and devices whether physical, > virtual > or in the cloud. Deliver compliance at lower cost and gain new business > insights. 
http://p.sf.net/sfu/splunk-dev2dev > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > |
From: Robert I. <cor...@gm...> - 2011-02-27 17:44:41
|
Thanks, applied. On Tue, Sep 28, 2010 at 11:31 PM, Pranav Desai <pra...@gm...>wrote: > > > > The idea looks very interesting and it is worth adding to the mainline. > > Please, kindly, provide the detailed documentation in: > > 1. README, > > 2. man pages; > > 3. configuration example in conf directory > > Everything should enable a new to curl-loader QA-person to understand the > > new tags, how to use them and > > to easy the usage by providing an example. > > Thank you in advance. > > I have attached the patch with config examples and updated docs. Let > me know if I have missed something. > > Thanks > -- Pranav > > > > > -- > > Truly, > > Robert Iakobashvili, Ph.D. > > > > Home: http://www.ghotit.com > > Blog: http://dyslexia-blog.ghotit.com > > Twitter: http://twitter.com/ghotit > > Facebook: http://facebook.com/ghotit > > ...................................................................... > > Ghotit Dyslexia > > Assistive technology that understands you > > ...................................................................... > > > > > ------------------------------------------------------------------------------ > > Nokia and AT&T present the 2010 Calling All Innovators-North America > contest > > Create new apps & games for the Nokia N8 for consumers in U.S. and > Canada > > $10 million total in prizes - $4M cash, 500 devices, nearly $6M in > marketing > > Develop with Nokia Qt SDK, Web Runtime, or Java and Publish to Ovi Store > > http://p.sf.net/sfu/nokia-dev2dev > > _______________________________________________ > > curl-loader-devel mailing list > > cur...@li... > > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > > > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Robert I. <cor...@gm...> - 2011-02-27 16:31:16
|
Hi Yossi, On Sun, Feb 27, 2011 at 6:08 PM, Yossi Shaul <y1...@gm...> wrote: > > 1. When the tag FORM_STRING is used to read data from file, can you read > more than 2 variables (credentials) from the file? > Yes > 2. Can MULTIPART_FORM_DATA tag get data from external file? > No, but it is easy to add, simple C-prog. > 3. Can MULTIPART_FORM_DATA/FORM_STRING tag be used with PUT request? > REQUEST_TYPE - HTTP request method to be chosen from GET, POST or PUT. Have you tried? > 4. Is there a way to get paramter value for a GET request from external > file? We need to construct a url request that looks like: > > http://www.example.com/test/authenticate?username=VALUE1&password=VALUE2&countryid=VALUE3 > where VALUE1, VALUE2, and VALUE3 need to be imported from external file. > http://curl-loader.svn.sourceforge.net/viewvc/curl-loader/trunk/curl-loader/conf-examples/get-forms.conf and try FORM_USAGE_TYPE= RECORDS_FROM_FILE As a general note - it's a simple C-written open-source wrapper around libcurl. http://curl-loader.sourceforge.net/doc/faq.html#tag-description <http://curl-loader.svn.sourceforge.net/viewvc/curl-loader/trunk/curl-loader/conf-examples/get-forms.conf> > > > Thanks, > Yossi > > > > > > ------------------------------------------------------------------------------ > Free Software Download: Index, Search & Analyze Logs and other IT data in > Real-Time with Splunk. Collect, index and harness all the fast moving IT > data > generated by your applications, servers and devices whether physical, > virtual > or in the cloud. Deliver compliance at lower cost and gain new business > insights. http://p.sf.net/sfu/splunk-dev2dev > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
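For the last question, the get-forms.conf example referenced above combines FORM_USAGE_TYPE with an external records file. A sketch of what such a URL section and its records file might look like follows; the file name and the FORM_RECORDS_FILE tag spelling should be verified against conf-examples/get-forms.conf rather than taken from here, and note that, per the follow-up in this thread, only the first two %s tokens were observed to be filled in.

----------------------------------
# GET with form values taken from an external records file (a sketch;
# check conf-examples/get-forms.conf for the authoritative tag names --
# FORM_RECORDS_FILE below is assumed)
URL= http://www.example.com/test/authenticate
URL_SHORT_NAME="auth"
REQUEST_TYPE=GET
FORM_USAGE_TYPE= RECORDS_FROM_FILE
FORM_STRING= username=%s&password=%s
FORM_RECORDS_FILE= ./credentials.cred
TIMER_URL_COMPLETION = 5000
TIMER_AFTER_URL_SLEEP = 0

# credentials.cred -- one record per client, separator is ','
user1,secret1
user2,secret2
----------------------------------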
From: Yossi S. <y1...@gm...> - 2011-02-27 16:08:27
|
Hi, I've been testing the tool and have a couple of questions that came up due to our testing needs. In general, we need to read parameter values from an external file, often more than 2 parameters.

1. When the FORM_STRING tag is used to read data from a file, can you read more than 2 variables (credentials) from the file?
2. Can the MULTIPART_FORM_DATA tag get data from an external file?
3. Can the MULTIPART_FORM_DATA/FORM_STRING tag be used with a PUT request?
4. Is there a way to get a parameter value for a GET request from an external file? We need to construct a URL request that looks like:
http://www.example.com/test/authenticate?username=VALUE1&password=VALUE2&countryid=VALUE3
where VALUE1, VALUE2, and VALUE3 need to be imported from an external file.

Thanks, Yossi
From: Kamal A. <kam...@ya...> - 2011-02-11 20:25:28
|
I am getting an error on make:

In file included from loader.c:57:
ssl_thr_lock.h:27:28: error: openssl/crypto.h: No such file or directory
make: *** [obj/loader.o] Error 1

I would appreciate any help. Thanks, -Kamal
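The missing header comes from the OpenSSL development package rather than from curl-loader itself, so installing it and rebuilding is the likely fix. The package names below are the usual ones for Debian/Ubuntu and Red Hat style systems; they are not confirmed anywhere in this thread.

----------------------------------
# Debian/Ubuntu
sudo apt-get install libssl-dev
# RHEL/CentOS/Fedora
sudo yum install openssl-devel

# then rebuild from a clean tree
make cleanall
make
----------------------------------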
From: Robert I. <cor...@gm...> - 2011-02-11 05:48:27
|
Hi folks, Increasing the ARP-cache thresholds improves performance dramatically when curl-loader is used with thousands of IP addresses. Good point. http://txt.bitprocessor.be/2011/02/10/curl-loader-and-arp-cache-issues/ -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ......................................................................
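On Linux, the ARP-cache thresholds referred to here are typically the kernel's neighbor-table garbage-collection sysctls, which can overflow when a loader binds thousands of secondary IP addresses. A sketch of raising them; the values are illustrative only and are not taken from the linked post.

----------------------------------
# raise the ARP/neighbor cache thresholds (example values only)
sysctl -w net.ipv4.neigh.default.gc_thresh1=4096
sysctl -w net.ipv4.neigh.default.gc_thresh2=8192
sysctl -w net.ipv4.neigh.default.gc_thresh3=16384
# add the same settings to /etc/sysctl.conf to make them persistent
----------------------------------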
From: Robert I. <cor...@gm...> - 2010-12-27 20:56:37
|
Hi, On Mon, Dec 27, 2010 at 10:34 PM, Kamal Ahmed <kam...@ya...>wrote: > I have tried i=on ubuntu and Mac, but failing on both: > > Mac: > Read the FAQs, this is for linux. > > gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 > -D_FILE_OFFSET_BITS=64 -O0 -g -I. -I./inc -I/usr//include -c -o obj/batch.o > batch.c > In file included from batch.c:23: > fdsetsize.h:27:24: error: bits/types.h: No such file or directory > make: *** [obj/batch.o] Error 1 > > > ubuntu: > > make optimize=1 debug=0 > gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 > -D_FILE_OFFSET_BITS=64 -fomit-frame-pointer -O3 -ffast-math > -finline-functions -funroll-all-loops -finline-limit=1000 -mmmx -msse > -foptimize-sibling-calls -I. -I./inc -I/usr//include -c -o obj/parse_conf.o > parse_conf.c > parse_conf.c: In function ‘read_callback’: > parse_conf.c:3894: error: conflicting types for ‘pread’ > /usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here > make: *** [obj/parse_conf.o] Error 1 > Some minor badness. Make it to compile and send here your patches. Thanks a lot. -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Kamal A. <kam...@ya...> - 2010-12-27 20:34:28
|
I have tried it on Ubuntu and Mac, but it is failing on both:

Mac:

gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -O0 -g -I. -I./inc -I/usr//include -c -o obj/batch.o batch.c
In file included from batch.c:23:
fdsetsize.h:27:24: error: bits/types.h: No such file or directory
make: *** [obj/batch.o] Error 1

Ubuntu:

make optimize=1 debug=0
gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -fomit-frame-pointer -O3 -ffast-math -finline-functions -funroll-all-loops -finline-limit=1000 -mmmx -msse -foptimize-sibling-calls -I. -I./inc -I/usr//include -c -o obj/parse_conf.o parse_conf.c
parse_conf.c: In function ‘read_callback’:
parse_conf.c:3894: error: conflicting types for ‘pread’
/usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here
make: *** [obj/parse_conf.o] Error 1
From: Jan K. <jk...@gm...> - 2010-12-22 20:03:37
|
I retract my last post. This time, I changed line 3905 back to "pread", while keeping "mypread" on line 3894, and I added back the patch you suggested to remove. It doesn't work as it should but doesn't block the build. Anyway, curl-loader compiled! :) Very likely I'll post some more, because I'm trying to integrate curl-loader with continuous integration tools... Mahy

On 12/22/2010 08:52 PM, Jan Kotuc wrote:
> Thanks, it looks like the renaming from "pread" to "mypread" didn't work:
>
> In function `read_callback':
> /home/janci/apps/curl-loader/parse_conf.c:3905: undefined reference to `mypread'
> collect2: ld returned 1 exit status
> make: *** [curl-loader] Error 1
>
> Should I rename some header file as well? I scanned through all the files in the build directory and there's no other relevant occurrence of "pread"... :-/
>
> Mahy
From: Jan K. <jk...@gm...> - 2010-12-22 19:53:20
|
Thanks, it looks like the renaming from "pread" to "mypread" didn't work:

In function `read_callback':
/home/janci/apps/curl-loader/parse_conf.c:3905: undefined reference to `mypread'
collect2: ld returned 1 exit status
make: *** [curl-loader] Error 1

Should I rename some header file as well? I scanned through all the files in the build directory and there's no other relevant occurrence of "pread"... :-/ Mahy
From: Robert I. <cor...@gm...> - 2010-12-22 17:19:37
|
Hi, On Wed, Dec 22, 2010 at 7:11 PM, Jan Kotuc <jk...@gm...> wrote: > Thanks a lot, renaming fixed this one issue and compilation went far > further, but still did not finish successfully. When the compilation and > make is seemingly over, it displays this: > > <snipped> > > libevent-1.4.13-stable/compat/sys/_libevent_time.h > libevent-1.4.13-stable/compat/sys/queue.h > cd /home/janci/apps/curl-loader/build/libevent/libevent-1.4.13-stable; > patch -p1 < ../../../patches/libevent-nevent.patch; ./configure --prefix > /home/janci/apps/curl-loader/build/libevent \ > CFLAGS=" -g -O0" > patching file devpoll.c > Hunk #1 succeeded at 85 with fuzz 1 (offset -3 lines). > patching file epoll.c > Hunk #1 FAILED at 97. > 1 out of 1 hunk FAILED -- saving rejects to file epoll.c.rej > checking for a BSD-compatible install... /usr/bin/install -c > checking whether build environment is sane... yes > ... > > ...and then the whole process reiterates and starts checking and > building from scratch *over and over again* until i stop it with Ctrl+C > > Apparently there's some patch gone wrong. Any ideas of fixing it? Has > anybody succeeded in building curl-loader on Ubuntu? Thx for your help. > > Sounds like Ubuntu developers have broke many things. delete in Makefile patch -p1 < ../../../patches/libevent-nevent.patch; and re-compile. > Mahy > > On 12/22/2010 03:19 PM, Robert Iakobashvili wrote: > > Hi Jan, > > > > > > 1. Rename pread to e.g. prep_read. > > 2, Recompile, make cleanall; make. > > 3. Submit patch, > > > > Tnx > > -- > > > > ------------------------------------------------------------------------------ > Forrester recently released a report on the Return on Investment (ROI) of > Google Apps. They found a 300% ROI, 38%-56% cost savings, and break-even > within 7 months. Over 3 million businesses have gone Google with Google > Apps: > an online email calendar, and document program that's accessible from your > browser. Read the Forrester report: http://p.sf.net/sfu/googleapps-sfnew > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
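The Makefile edit Robert describes, removing the libevent-nevent.patch invocation from the build recipe, might be scripted as below. This is only a sketch: the sed substitution assumes the invocation appears exactly as quoted in the message, so inspect the Makefile (and keep the backup) before relying on it.

----------------------------------
# back up the Makefile, strip the libevent patch step, then rebuild
cp Makefile Makefile.orig
sed -i 's|patch -p1 < \.\./\.\./\.\./patches/libevent-nevent\.patch;||' Makefile
make cleanall
make
----------------------------------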
From: Jan K. <jk...@gm...> - 2010-12-22 17:12:26
|
Thanks a lot, renaming fixed this one issue and compilation went far further, but still did not finish successfully. When the compilation and make is seemingly over, it displays this: <snipped> libevent-1.4.13-stable/compat/sys/_libevent_time.h libevent-1.4.13-stable/compat/sys/queue.h cd /home/janci/apps/curl-loader/build/libevent/libevent-1.4.13-stable; patch -p1 < ../../../patches/libevent-nevent.patch; ./configure --prefix /home/janci/apps/curl-loader/build/libevent \ CFLAGS=" -g -O0" patching file devpoll.c Hunk #1 succeeded at 85 with fuzz 1 (offset -3 lines). patching file epoll.c Hunk #1 FAILED at 97. 1 out of 1 hunk FAILED -- saving rejects to file epoll.c.rej checking for a BSD-compatible install... /usr/bin/install -c checking whether build environment is sane... yes ... ...and then the whole process reiterates and starts checking and building from scratch *over and over again* until i stop it with Ctrl+C Apparently there's some patch gone wrong. Any ideas of fixing it? Has anybody succeeded in building curl-loader on Ubuntu? Thx for your help. Mahy On 12/22/2010 03:19 PM, Robert Iakobashvili wrote: > Hi Jan, > > > 1. Rename pread to e.g. prep_read. > 2, Recompile, make cleanall; make. > 3. Submit patch, > > Tnx > -- |
From: Robert I. <cor...@gm...> - 2010-12-22 14:20:07
|
Hi Jan, On Wed, Dec 22, 2010 at 3:58 PM, Jan Kotuc <jk...@gm...> wrote: > Hello people, > > when I run make in curl-loader's directory, I end up with the following > message: > > parse_conf.c: In function ‘read_callback’: > parse_conf.c:3894: error: conflicting types for ‘pread’ > /usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here > make: *** [obj/parse_conf.o] Error 1 > > I have some idea what it means, but how do I work it around? Thx for any > assistance. > > Mahy > > > ------------------------------------------------------------------------------ > Forrester recently released a report on the Return on Investment (ROI) of > Google Apps. They found a 300% ROI, 38%-56% cost savings, and break-even > within 7 months. Over 3 million businesses have gone Google with Google > Apps: > an online email calendar, and document program that's accessible from your > browser. Read the Forrester report: http://p.sf.net/sfu/googleapps-sfnew > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > 1. Rename pread to e.g. prep_read. 2, Recompile, make cleanall; make. 3. Submit patch, Tnx -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
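The conflict is between a local helper in parse_conf.c that happens to be named pread and the POSIX pread(2) prototype declared in unistd.h, as the compiler message shows. The C sketch below is illustrative only and is not the real parse_conf.c code; its point is that the definition and every call site must be renamed together, otherwise the build fails at link time with the "undefined reference" error reported elsewhere in this thread.

----------------------------------
/* Illustrative sketch only -- not the actual parse_conf.c code. */
#include <stdio.h>

/* Formerly named pread(), which collides with POSIX pread(2) from <unistd.h>
   and triggers "conflicting types for 'pread'". */
static size_t conf_pread (void *ptr, size_t size, size_t nmemb, void *stream)
{
  return fread (ptr, size, nmemb, (FILE *) stream);
}

static size_t read_callback (void *ptr, size_t size, size_t nmemb, void *stream)
{
  return conf_pread (ptr, size, nmemb, stream);   /* call site renamed as well */
}

int main (void)
{
  char buf[64];
  FILE *f = fopen ("/etc/hostname", "r");
  if (!f)
    return 1;
  printf ("read %zu bytes\n", read_callback (buf, 1, sizeof buf, f));
  fclose (f);
  return 0;
}
----------------------------------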
From: Jan K. <jk...@gm...> - 2010-12-22 13:59:07
|
Hello people, when I run make in curl-loader's directory, I end up with the following message:

parse_conf.c: In function ‘read_callback’:
parse_conf.c:3894: error: conflicting types for ‘pread’
/usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here
make: *** [obj/parse_conf.o] Error 1

I have some idea of what it means, but how do I work around it? Thx for any assistance. Mahy
From: Robert I. <cor...@gm...> - 2010-11-09 06:29:07
|
Hi Dave, On Tue, Nov 9, 2010 at 8:23 AM, Dave Seddon <da...@se...> wrote: > Greetings, > > Sorry for the dumb question, but does somebody have an example of a > large number of clients, like the 10K.conf example, AND also with a very > large number of URLs? I've tried setting it up with a heap of the URL > sections, but it didn't work. Could you, please provide more details as required by PROBLEM-REPORTING form? Besides that, have you tried a single URL to begin with? Have you looked at log files when running with -v? Etc, etc -- Truly, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Dave S. <da...@se...> - 2010-11-09 06:23:58
|
Greetings,

Sorry for the dumb question, but does somebody have an example of a large number of clients, like the 10K.conf example, AND also with a very large number of URLs? I've tried setting it up with a heap of the URL sections, but it didn't work. Here's what I tried:

---------------------------
#URL 1
URL = http://my.web.site.com/MP3Preview3/Mammoth/283/3/0724383183654_01_011_mp3_64k_30sec.mp3
REQUEST_TYPE = GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
#
#URL 2
URL = http://my.web.site.com/MP3Preview2/Mammoth/106/575/UMG_audclp_00602517490970_01_006_61.mp3
REQUEST_TYPE = GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
#
#URL 3
URL = http://my.web.site.com/MP3Preview/Mammoth/43/79/085365998568_029_64.mp3
REQUEST_TYPE = GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
#
...
#URL 622142
URL = http://my.web.site.com/MP3Preview3/Mammoth/288/789/0724381004555_01_003_mp3_64k_30sec.mp3
REQUEST_TYPE = GET
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
---------------------------------------------

Essentially the giant URL list was added to the end of the 10K.conf example:

----------------------------------
cat 10K.conf
########### GENERAL SECTION ################################
BATCH_NAME= 10K
CLIENTS_NUM_MAX=10000
CLIENTS_NUM_START=100
CLIENTS_RAMPUP_INC=50
INTERFACE =eth0
NETMASK=16
IP_ADDR_MIN= 192.168.1.1
IP_ADDR_MAX= 192.168.53.255 #Actually - this is for self-control
CYCLES_NUM= -1
URLS_NUM= 1
########### URL SECTION ####################################
URL=http://localhost/index.html
#URL=http://localhost/ACE-INSTALL.html
URL_SHORT_NAME="local-index"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION = 5000 # In msec. When positive, Now it is enforced by cancelling url fetch on timeout
TIMER_AFTER_URL_SLEEP =20
----------------------------------

Essentially we are trying to stress out a proxy cache by requesting a large number of unique objects and unique clients. I notice that other people are trying to do something similar, except they are just changing the object path at the end, rather than the entire URL.

Kind regards,
Dave Seddon
+61 447 SEDDON
da...@se...
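Writing hundreds of thousands of URL sections by hand is error-prone; one way to build such a batch file is to generate the sections from a flat list of URLs and set URLS_NUM to match. A sketch follows; the file names urls.txt and big-batch.conf are placeholders, and the GENERAL section is assumed to already exist in the batch file.

----------------------------------
#!/bin/sh
# Append one URL section per line of urls.txt to the batch file.
awk '{
    printf "URL = %s\n", $0
    printf "REQUEST_TYPE = GET\n"
    printf "TIMER_URL_COMPLETION = 0\n"
    printf "TIMER_AFTER_URL_SLEEP = 0\n"
    printf "#\n"
}' urls.txt >> big-batch.conf

echo "Set URLS_NUM to: $(wc -l < urls.txt)"
----------------------------------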
From: Robert I. <cor...@gm...> - 2010-10-07 05:48:09
|
Hi, On Thu, Oct 7, 2010 at 2:27 AM, sajal <s.b...@qu...> wrote: > . > CLIENTS_NUM_START = 100 > CLIENTS_RAMPUP_INC = 150 > CLIENTS_RAMPUP_INC = 50 CYCLES_NUM= -1 > OK > FRESH_CONNECT=0 > Remove this tag > > I ran the test for approx 100 sec and I was able to achieve ~3000 packets > per second, and with average packet size of 96 bytes, > 96- ??? > the data rate was ~2.21 mega bits per sec. > Check your network with somebody serious. If the link is not 10 MB, etc > I also monitored the server side, it was like ~40% CPU usage. The client > machine has consumed almost entire CPU (dual core). > Strange. Add -t 2 to your command line > But the memory utilization on client machine was comparatively less > ~15-20%. > > Can you suggest the configuration to get a much higher load than this in > terms of no of packets per second and data rate? > > > Thanks and Regards > > > -- > > Sajal Bhatia > Research Masters Student > QUT, Brisbane > AUSTRALIA > > > > ------------------------------------------------------------------------------ > Beautiful is writing same markup. Internet Explorer 9 supports > standards for HTML5, CSS3, SVG 1.1, ECMAScript5, and DOM L2 & L3. > Spend less time writing and rewriting code and more time creating great > experiences on the web. Be a part of the beta today. > http://p.sf.net/sfu/beautyoftheweb > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > -- Truly, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
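Putting together Robert's suggestions from this exchange (a smaller ramp-up, CYCLES_NUM= -1, no FRESH_CONNECT, and two loading threads), the batch from the original post might be run roughly as follows. This is a sketch: the server address and netmask come from the thread, the exact client numbers are simply values inside the suggested ranges, and the batch file name is a placeholder.

----------------------------------
########### GENERAL SECTION ################################
BATCH_NAME = 100
CLIENTS_NUM_MAX = 5000       # 3000-5000 to begin with
CLIENTS_NUM_START = 100
CLIENTS_RAMPUP_INC = 50
INTERFACE = eth0
NETMASK = 8
IP_ADDR_MIN= 10.1.0.1
IP_ADDR_MAX= 10.2.255.255 #Actually - this is for self-control
CYCLES_NUM= -1
URLS_NUM= 1
########### URL SECTION ####################################
URL=http://192.168.100.8/index.html
URL_SHORT_NAME="local-index"
REQUEST_TYPE=GET
# FRESH_CONNECT removed so that connections are reused
TIMER_URL_COMPLETION = 0
TIMER_AFTER_URL_SLEEP = 0
----------------------------------

Run on the dual-core client with two loading threads, for example:

----------------------------------
./curl-loader -f ./100.conf -t 2
----------------------------------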
From: sajal <s.b...@qu...> - 2010-10-07 00:27:57
|
On 06/10/10 18:25, Robert Iakobashvili wrote: > Hi, > > On Wed, Oct 6, 2010 at 8:27 AM, sajal <s.b...@qu... > <mailto:s.b...@qu...>> wrote: > > Hi, > > I am using curl-loader on a physical machine with following > specifications: Intel Core 2 duo 3.0 GHz processor, 4 GB Memory and > running Linux 2.6.32 > > The configuration file being used is the following: > ########### GENERAL SECTION ################################ > BATCH_NAME = 100 > CLIENTS_NUM_MAX = 10000 > > > Try some 3000-5000 to begin with > > CLIENTS_NUM_START = 1000 > > > Start from some 100-200 > > CLIENTS_RAMPUP_INC = 1000 > > > keep 50-200 > > INTERFACE = eth0 > NETMASK = 8 > IP_ADDR_MIN= 10.1.0.1 > IP_ADDR_MAX= 10.2.255.255 #Actually - this is for self-control > CYCLES_NUM= 1 > > > You mean -1 or some high number? > > Use > CYCLES_NUM= -1 > > URLS_NUM=1 > ########### URL SECTION #################################### > > URL=http://192.168.100.8/index.html > URL_SHORT_NAME="local-index" > REQUEST_TYPE=GET > FRESH_CONNECT=1 > > > Remove the tag FRESH_CONNECT=1 > > TIMER_URL_COMPLETION =0 # In msec. When positive, Now it is > enforced by cancelling url fetch on timeout > TIMER_AFTER_URL_SLEEP =0 > > With this configuration file I am able to get approximately 130 new > connections/second and the data-rate is somewhere 30-35 kbps. I wanted > to know with this configuration and available resources what is the > maximum possible no. of new connections/second and data-rate I can > get? > > > Monitor your server-side, where it may be the bottleneck, > after you place the correct tags with values. > > Read: > > http://curl-loader.sourceforge.net/high-load-hw > <http://curl-loader.sourceforge.net/high-load-hw/> > > and read more the README, other files in docs directory and the links > sent to you. > > > Thanks and Regards > > -- > > Sajal Bhatia > Research Masters Student > QUT, Brisbane > AUSTRALIA > > > ------------------------------------------------------------------------------ > Beautiful is writing same markup. Internet Explorer 9 supports > standards for HTML5, CSS3, SVG 1.1, ECMAScript5, and DOM L2 & L3. > Spend less time writing and rewriting code and more time creating > great > experiences on the web. Be a part of the beta today. > http://p.sf.net/sfu/beautyoftheweb > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > <mailto:cur...@li...> > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > > > > -- > Truly, > Robert Iakobashvili, Ph.D. > > Home: http://www.ghotit.com > Blog: http://dyslexia-blog.ghotit.com <http://dyslexia-blog.ghotit.com/> > Twitter: http://twitter.com/ghotit > Facebook: http://facebook.com/ghotit > ...................................................................... > Ghotit Dyslexia > Assistive technology that understands you > ...................................................................... > Hi, Thanks for your reply. I tried with the suggested configuration i.e. CLIENTS_NUM_START = 100 CLIENTS_RAMPUP_INC = 150 CYCLES_NUM= -1 FRESH_CONNECT=0 I ran the test for approx 100 sec and I was able to achieve ~3000 packets per second, and with average packet size of 96 bytes, the data rate was ~2.21 mega bits per sec. I also monitored the server side, it was like ~40% CPU usage. The client machine has consumed almost entire CPU (dual core). But the memory utilization on client machine was comparatively less ~15-20%. Can you suggest the configuration to get a much higher load than this in terms of no of packets per second and data rate? 
Thanks and Regards -- Sajal Bhatia Research Masters Student QUT, Brisbane AUSTRALIA |
From: Robert I. <cor...@gm...> - 2010-10-06 08:25:24
|
Hi, On Wed, Oct 6, 2010 at 8:27 AM, sajal <s.b...@qu...> wrote: > Hi, > > I am using curl-loader on a physical machine with following > specifications: Intel Core 2 duo 3.0 GHz processor, 4 GB Memory and > running Linux 2.6.32 > > The configuration file being used is the following: > ########### GENERAL SECTION ################################ > BATCH_NAME = 100 > CLIENTS_NUM_MAX = 10000 > Try some 3000-5000 to begin with > CLIENTS_NUM_START = 1000 > Start from some 100-200 > CLIENTS_RAMPUP_INC = 1000 > keep 50-200 > INTERFACE = eth0 > NETMASK = 8 > IP_ADDR_MIN= 10.1.0.1 > IP_ADDR_MAX= 10.2.255.255 #Actually - this is for self-control > CYCLES_NUM= 1 > You mean -1 or some high number? Use CYCLES_NUM= -1 > URLS_NUM=1 > ########### URL SECTION #################################### > > URL=http://192.168.100.8/index.html > URL_SHORT_NAME="local-index" > REQUEST_TYPE=GET > FRESH_CONNECT=1 > Remove the tag FRESH_CONNECT=1 > TIMER_URL_COMPLETION =0 # In msec. When positive, Now it is > enforced by cancelling url fetch on timeout > TIMER_AFTER_URL_SLEEP =0 > > With this configuration file I am able to get approximately 130 new > connections/second and the data-rate is somewhere 30-35 kbps. I wanted > to know with this configuration and available resources what is the > maximum possible no. of new connections/second and data-rate I can get? > Monitor your server-side, where it may be the bottleneck, after you place the correct tags with values. Read: http://curl-loader.sourceforge.net/high-load-hw and read more the README, other files in docs directory and the links sent to you. > > Thanks and Regards > > -- > > Sajal Bhatia > Research Masters Student > QUT, Brisbane > AUSTRALIA > > > > ------------------------------------------------------------------------------ > Beautiful is writing same markup. Internet Explorer 9 supports > standards for HTML5, CSS3, SVG 1.1, ECMAScript5, and DOM L2 & L3. > Spend less time writing and rewriting code and more time creating great > experiences on the web. Be a part of the beta today. > http://p.sf.net/sfu/beautyoftheweb > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > -- Truly, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: sajal <s.b...@qu...> - 2010-10-06 06:27:26
|
Hi, I am using curl-loader on a physical machine with the following specifications: Intel Core 2 Duo 3.0 GHz processor, 4 GB memory, running Linux 2.6.32. The configuration file being used is the following:

########### GENERAL SECTION ################################
BATCH_NAME = 100
CLIENTS_NUM_MAX = 10000
CLIENTS_NUM_START = 1000
CLIENTS_RAMPUP_INC = 1000
INTERFACE = eth0
NETMASK = 8
IP_ADDR_MIN= 10.1.0.1
IP_ADDR_MAX= 10.2.255.255 #Actually - this is for self-control
CYCLES_NUM= 1
URLS_NUM=1
########### URL SECTION ####################################
URL=http://192.168.100.8/index.html
URL_SHORT_NAME="local-index"
REQUEST_TYPE=GET
FRESH_CONNECT=1
TIMER_URL_COMPLETION =0 # In msec. When positive, Now it is enforced by cancelling url fetch on timeout
TIMER_AFTER_URL_SLEEP =0

With this configuration file I am able to get approximately 130 new connections/second, and the data rate is somewhere around 30-35 kbps. I wanted to know, with this configuration and the available resources, what is the maximum possible number of new connections/second and data rate I can get?

Thanks and Regards

--
Sajal Bhatia
Research Masters Student
QUT, Brisbane
AUSTRALIA
From: Robert I. <cor...@gm...> - 2010-10-06 06:22:29
|
Hi, On Wed, Oct 6, 2010 at 8:19 AM, sajal <s.b...@qu...> wrote: > On 06/10/10 16:04, Robert Iakobashvili wrote: > > > > Thanks for your reply. Is it also using libcurl to generate the clients? or > is it using some other base program or library? > > > Regards > > -- > > Sajal Bhatia > Research Masters Student > QUT, Brisbane > AUSTRALIA > > > http://curl-loader.sourceforge.net/doc/faq.html <http://curl-loader.sourceforge.net/doc/faq.html> http://curl-loader.sourceforge.net/doc/fast.html <http://curl-loader.sourceforge.net/doc/fast.html> http://curl-loader.sourceforge.net/high-load-hw/index.html -- Truly, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: sajal <s.b...@qu...> - 2010-10-06 06:19:43
|
On 06/10/10 16:04, Robert Iakobashvili wrote: > Hi, > > On Wed, Oct 6, 2010 at 7:58 AM, sajal <s.b...@qu... > <mailto:s.b...@qu...>> wrote: > > On 06/10/10 15:20, Robert Iakobashvili wrote: > > does not store the > Hi, > > Thanks for a quick reply. If it not stores these files then what > exactly > happens to these files once they are fetched from the web server? > > > Drops the bytes. This is about just not to write to the files the > bytes you get by TCP/IP > and read into memory. > > One > more thing, what exactly curl-loader uses to fetch the files? > > > Client-side of HTTP/FTP protocol using libcurl > > I haven't > looked into the code which appears to be a bit long so if you can > answer > this it would help me save some time :-) > > > Regards > > -- > > Sajal Bhatia > Research Masters Student > QUT, Brisbane > AUSTRALIA > > > ------------------------------------------------------------------------------ > Beautiful is writing same markup. Internet Explorer 9 supports > standards for HTML5, CSS3, SVG 1.1, ECMAScript5, and DOM L2 & L3. > Spend less time writing and rewriting code and more time creating > great > experiences on the web. Be a part of the beta today. > http://p.sf.net/sfu/beautyoftheweb > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > <mailto:cur...@li...> > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > > > > -- > Truly, > Robert Iakobashvili, Ph.D. > > Home: http://www.ghotit.com > Blog: http://dyslexia-blog.ghotit.com <http://dyslexia-blog.ghotit.com/> > Twitter: http://twitter.com/ghotit > Facebook: http://facebook.com/ghotit > ...................................................................... > Ghotit Dyslexia > Assistive technology that understands you > ...................................................................... > Hi, Thanks for your reply. Is it also using libcurl to generate the clients? or is it using some other base program or library? Regards -- Sajal Bhatia Research Masters Student QUT, Brisbane AUSTRALIA |
From: Robert I. <cor...@gm...> - 2010-10-06 06:04:29
|
Hi, On Wed, Oct 6, 2010 at 7:58 AM, sajal <s.b...@qu...> wrote: > On 06/10/10 15:20, Robert Iakobashvili wrote: > > does not store the > Hi, > > Thanks for a quick reply. If it not stores these files then what exactly > happens to these files once they are fetched from the web server? Drops the bytes. This is about just not to write to the files the bytes you get by TCP/IP and read into memory. > One > more thing, what exactly curl-loader uses to fetch the files? Client-side of HTTP/FTP protocol using libcurl > I haven't > looked into the code which appears to be a bit long so if you can answer > this it would help me save some time :-) > Regards > > -- > > Sajal Bhatia > Research Masters Student > QUT, Brisbane > AUSTRALIA > > > > ------------------------------------------------------------------------------ > Beautiful is writing same markup. Internet Explorer 9 supports > standards for HTML5, CSS3, SVG 1.1, ECMAScript5, and DOM L2 & L3. > Spend less time writing and rewriting code and more time creating great > experiences on the web. Be a part of the beta today. > http://p.sf.net/sfu/beautyoftheweb > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > -- Truly, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
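Since the last two answers describe curl-loader as a libcurl-based HTTP/FTP client that reads the fetched bytes and then drops them, a minimal stand-alone libcurl program doing the same thing may make the point concrete. This is not curl-loader code, only a sketch of the underlying libcurl mechanism; the file name and URL are arbitrary, and it builds with something like gcc drop.c -lcurl.

----------------------------------
/* Sketch: a plain libcurl client that reads the response body and drops it,
 * analogous in spirit to curl-loader not saving fetched files to disk. */
#include <stdio.h>
#include <curl/curl.h>

/* Count the bytes received and discard them instead of writing them out. */
static size_t drop_body(char *ptr, size_t size, size_t nmemb, void *userdata)
{
    (void) ptr;
    size_t *total = (size_t *) userdata;
    *total += size * nmemb;
    return size * nmemb;        /* tell libcurl the data was consumed */
}

int main(void)
{
    size_t total = 0;
    curl_global_init(CURL_GLOBAL_ALL);
    CURL *handle = curl_easy_init();
    if (handle)
    {
        curl_easy_setopt(handle, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, drop_body);
        curl_easy_setopt(handle, CURLOPT_WRITEDATA, &total);
        CURLcode rc = curl_easy_perform(handle);
        printf("result: %s, dropped %zu bytes\n", curl_easy_strerror(rc), total);
        curl_easy_cleanup(handle);
    }
    curl_global_cleanup();
    return 0;
}
----------------------------------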