curl-loader-devel Mailing List for curl-loader - web application testing (Page 9)
Status: Alpha
Brought to you by: coroberti
Archived messages per month:

Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec
-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----
2006 |     |     |     |     |     |     |     |     |     |     |  1  |
2007 |     |  1  |  7  | 19  | 25  | 16  | 59  | 29  | 18  | 19  |  7  | 29
2008 |  6  | 18  |  8  | 27  | 26  |  5  |  6  |     |  9  | 37  | 61  | 17
2009 | 21  | 25  |  4  |  2  |  8  | 15  | 18  | 23  | 10  | 16  | 14  | 22
2010 | 23  |  8  | 18  |  1  | 34  | 23  | 11  |  1  | 13  | 10  |  2  |  8
2011 |     |  7  | 24  | 12  |  3  |  2  |  2  |     |  5  | 20  |  7  | 11
2012 | 12  |  5  | 16  |  3  |     |  5  | 12  |  6  |     |     |  8  |
2013 |  1  |  3  |  5  |  3  |  1  |     |  1  |  2  |  9  |     |  8  |  4
2014 |  4  |     |  1  |     |  1  |     |     |     |  4  |     | 11  |  5
2015 |  1  |     | 11  |  3  |  1  |  1  |  4  |  1  |  7  |  4  |  2  |
2016 |     |  1  |     |     |     |     |  1  |     |     |     |     |
2018 |     |     |     |     |  1  |     |     |     |     |     |     |
2020 |     |     |     |  1  |     |     |     |     |     |     |     |
2022 |  1  |     |     |     |     |     |     |     |     |     |     |
From: Oren S. <or...@gm...> - 2011-10-11 10:07:13
Hi all,

I wanted to know whether it is possible to instruct curl-loader to automatically fetch the embedded URLs (images, JS, CSS) of a given test URL. This would help simulate actual browser load rather than just the fetch time of a single page. I couldn't find such an option in the configuration files or the docs; did I miss it, or does it not exist?

Thanks!
Oren
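One possible workaround, sketched below with a made-up host and asset paths, and assuming curl-loader does not parse HTML for embedded objects (the thread leaves that question unanswered), is to list the page and its resources as consecutive URLs of the same batch, so each client fetches them in sequence:

########### GENERAL SECTION ################################
# ... other general settings as in the examples later in this thread ...
URLS_NUM= 3                        # three URLs per client cycle

########### URL SECTION ####################################
# Sketch only: the host and asset paths are hypothetical examples.
URL=http://www.example.com/index.html
URL_SHORT_NAME="page"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP = 0          # fetch the assets immediately after the page

URL=http://www.example.com/css/site.css
URL_SHORT_NAME="page-css"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP = 0

URL=http://www.example.com/img/logo.png
URL_SHORT_NAME="page-img"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP = 1000       # think time before the next cycle

Each client then fetches the three URLs in order on every cycle, which roughly approximates a browser loading a page with two embedded objects.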
From: Robert I. <cor...@gm...> - 2011-10-11 09:48:43
Please try CLIENTS_NUM_MAX=100 and CLIENTS_NUM_START=100, and comment out CLIENTS_RAMPUP_INC=10. Tell me what happens with the traffic and send me the log.

On Tue, Oct 11, 2011 at 11:36 AM, CK Zhang <ck...@fo...> wrote:
> Attached :) thanks!
>
> Best Regards,
> CK
> #8042
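A minimal sketch of the suggested change, applied to the GENERAL SECTION of the configuration quoted later in this thread (only the three client-count lines differ from the original; everything else is unchanged):

########### GENERAL SECTION ################################
BATCH_NAME= Apach
CLIENTS_NUM_MAX=100          # was 400: run a fixed number of clients
CLIENTS_NUM_START=100        # start with all of them active
#CLIENTS_RAMPUP_INC=10       # commented out: no gradual ramp-up
INTERFACE =eth2
NETMASK=16
IP_ADDR_MIN= 172.16.76.150
IP_ADDR_MAX= 172.16.246.250
CYCLES_NUM= -1
URLS_NUM= 1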
From: CK Z. <ck...@fo...> - 2011-10-11 09:36:50
Attached :) thanks!

Best Regards,
CK
#8042

-----Original Message-----
From: Robert Iakobashvili [mailto:cor...@gm...]
Sent: Tuesday, October 11, 2011 5:33 PM
To: curl-loader-devel
Subject: Re: HTTPS throughput too low

Could you send me it zipped not rarred? Thanks!
From: Robert I. <cor...@gm...> - 2011-10-11 09:32:44
Could you send it zipped rather than rarred? Thanks!

On Tue, Oct 11, 2011 at 11:30 AM, CK Zhang <ck...@fo...> wrote:
> Dear Robert
>
> Attached test log by command "./curl-loader -f https.conf -v"
>
> Best Regards,
> CK
> #8042
From: CK Z. <ck...@fo...> - 2011-10-11 09:30:04
Dear Robert,

Attached is the test log produced by the command "./curl-loader -f https.conf -v".

Best Regards,
CK
#8042

-----Original Message-----
From: Robert Iakobashvili [mailto:cor...@gm...]
Sent: Tuesday, October 11, 2011 5:19 PM
To: curl-loader-devel
Subject: Re: HTTPS throughput too low

Can you run HTTPS sessions with -v detailed logging and say whether you see anything interesting in the logs?
Thanks,
Robert
From: Robert I. <cor...@gm...> - 2011-10-11 09:19:01
Dear CK Zhang,

Can you run the HTTPS sessions with -v (detailed logging) and tell me whether you see anything interesting in the logs?

Thanks,
Robert
From: CK Z. <ck...@fo...> - 2011-10-11 09:14:49
Dear Robert,

Thanks a lot for your reply. Here are some answers to your questions:

What happens if you try HTTP? Does it impact throughput?
HTTP is normal and performs well: the throughput reaches about 950 Mbps.

Which tools are you using to measure throughput?
I use two methods: (1) the log summary from curl-loader, e.g. "Ti:5261372B/s, To:18375B/s"; (2) the real-time traffic rate on the switch connecting the test PC and the HTTP server.

What happens if you try just to FTP some large file between the machines?
Not tested.

What about CPU usage at both client and server?
Both are busy, at about 80%.

Which HTTP server is in use?
Apache/2.2.21.

Have you looked into the logs of curl-loader and your web server?
I checked the logs and found no abnormal issues.

Today I used LoadRunner 11 to simulate HTTPS traffic in the same environment (same Apache server); it reached about 110 Mbps, but I still get only about 800 kbps with curl-loader.

Best Regards,
CK
#8042
From: Robert I. <cor...@gm...> - 2011-09-27 05:19:18
Dear CK Zhang,

On Mon, Sep 26, 2011 at 5:01 AM, CK Zhang <ck...@fo...> wrote:
> I want to get a high HTTPS stream with curl-loader, but the HTTPS throughput
> is about 800 kbps between the two PCs, while HTTP throughput can reach
> 950 Mbps on the same two PCs. Does anyone have experience with this and has
> achieved high HTTPS throughput with curl-loader? Thanks for any support and help!

Interesting.

What happens if you try HTTP? Does it impact throughput?

Which tools are you using to measure throughput?

What happens if you try just to FTP some large file between the machines?

What about CPU usage at both client and server?

Which HTTP server is in use?

Have you looked into the logs of curl-loader and your web server?

--
Regards,
Robert Iakobashvili, Ph.D.
From: Robert I. <cor...@gm...> - 2011-09-27 05:14:16
Hi Wei Guo,

Thank you for the patches. I'll review them later. Please post and send everything to cur...@li...

Thanks,
Robert

2011/9/23 wei guo <wei...@gm...>:
> Hi Robert,
>
> The attachment contains the patches for 0.53:
>
> 1. Fix the statement that checks the cycling state of a use_current_url against its primary URL.
> 2. Add HEAD and DELETE method support.
>
> I tested the features above and they work well.
>
> Shall we add openssl-dev to the packages required by curl-loader? Adding it would make things more convenient for users.
>
> Additionally, when I used curl-loader 0.53, it did not work after compiling; the error log was:
> bind failed with errno 22: Invalid argument
> I traced the error to the lib/curl/ directory. Replacing the bundled curl with the older curl-7.19.7.tar.bz2 made it work. It may be a bug in the newer curl version; I'm not sure of the reason. When I find it, I will mail you.
>
> I'm very glad to have received your reply. Thank you.
>
> Best Regards.
>
> On September 22, 2011 at 2:13 PM, Robert Iakobashvili <cor...@gm...> wrote:
>> Dear Guo Wei,
>>
>> Thank you for the comments and the issues described.
>>
>> Please submit your proposed corrections/extensions to this list,
>> cur...@li..., as patches against 0.53, a separate patch per issue (diff -Naru).
>>
>> You can join the list here:
>> https://lists.sourceforge.net/lists/listinfo/curl-loader-devel
>>
>> OpenSSL normally exists on a Linux development machine; what is necessary is to add openssl-dev (on Debian).
>>
>> Regards,
>> Robert Iakobashvili, Ph.D.
>>
>> 2011/9/22 wei guo <wei...@gm...>:
>>> Hi curl-loader developers,
>>>
>>> First, I wish to express my appreciation for your work on curl-loader; it helped me solve a big problem in my project. I have a bug report and some questions for you.
>>>
>>> Bug:
>>> 1. In curl-loader-0.52, in file parse_conf.c at line 2404, the statement
>>> if (url_m->url_dont_cycle != url_m->url_dont_cycle)
>>> is meant to check that the cycling/non-cycling status of the CURRENT_URLs is the same as that of the primary URL. Obviously the program can never enter this branch. The problem is still present in curl-loader-0.53. Simply changing the first or the second url_m to url fixes it, i.e.
>>> if (url_m->url_dont_cycle != url->url_dont_cycle)
>>>
>>> Additionally, I have some questions about curl-loader:
>>> 1. Why don't we bundle the openssl package with the curl-loader package? If we added it, we wouldn't need to change openssldir.sh or set the OPENSSLDIR environment variable; after adjusting the Makefile, a single "make" would build curl-loader. :)
>>> 2. Why does the main function need to run as the root user? I know it is used to add secondary IPs to the local machine, but adding secondary IPs, if needed, is work done before the load test runs. Running curl-loader as a normal user is safer; if the environment needs to be changed, it can be done before the load test begins. I think the root-permission check could be removed. The first time I used curl-loader I ran it as root without realizing it would add secondary IPs on my machine, and as a result my workmates could not reach the Internet.
>>> 3. curl-loader currently supports only the PUT, GET, and POST methods; I added HEAD and DELETE support. Thankfully the program structure is very clear, so it was an easy task. Your work has benefited me a lot; if possible, I would like to contribute my code to this project and help others. How can I join the project?
>>>
>>> Maybe I haven't described things very clearly; if anything is unclear, please mail me: wei...@gm...
>>>
>>> Finally, thank you all for your work once more.
>>>
>>> Regards,
>>> Guo_wei
From: CK Z. <ck...@fo...> - 2011-09-26 02:28:45
|
Hello I want to get high https stream with curl loader, but the Https throughput is about 800kbps between two PC, Http throughput can reach 950Mbps in that 2 pcs, I do know if some have experience like this, and get a high https throughput with curl-loader, Thanks for any supports and help!! Here is my configuration: PC1:CPU Inter E2200, Mem 2G, 1000M NIC OpenSSL: openssl-devel-1.0.0-4.el6_0.2.x86_64 openssl-1.0.0-4.el6_0.2.x86_64 openssl-perl-1.0.0-4.el6_0.2.x86_64 curl-loader-0.53 ########### GENERAL SECTION ################################ BATCH_NAME= Apach CLIENTS_NUM_MAX=400 CLIENTS_NUM_START=100 CLIENTS_RAMPUP_INC=10 INTERFACE =eth2 NETMASK=16 IP_ADDR_MIN= 172.16.76.150 IP_ADDR_MAX= 172.16.246.250 #Actually - this is for self-control CYCLES_NUM= -1 URLS_NUM= 1 ########### URL SECTION #################################### URL=https://172.16.76.142/was/100KB.txt # the size of 100KB.txt is 100KB #URL=http://localhost/ACE-INSTALL.html URL_SHORT_NAME="local-index" REQUEST_TYPE=GET TIMER_URL_COMPLETION = 0 # In msec. When positive, Now it is enforced by cancelling url fetch on timeout TIMER_AFTER_URL_SLEEP =1000 ~ PC2. Works as Webserver,CPU Intel E7600,Mem 4G, 1000M NIC Server version: Apache/2.2.21 (Unix) Server built: Sep 21 2011 16:19:43 openssl-devel-1.0.0-4.el6_0.2.x86_64 openssl-1.0.0-4.el6_0.2.x86_64 openssl-perl-1.0.0-4.el6_0.2.x86_64 *** Please note that this message and any attachments may contain confidential and proprietary material and information and are intended only for the use of the intended recipient(s). If you are not the intended recipient, you are hereby notified that any review, use, disclosure, dissemination, distribution or copying of this message and any attachments is strictly prohibited. If you have received this email in error, please immediately notify the sender and destroy this e-mail and any attachments and all copies, whether electronic or printed. Please also note that any views, opinions, conclusions or commitments expressed in this message are those of the individual sender and do not necessarily reflect the views of Fortinet, Inc., its affiliates, and emails are not binding on Fortinet and only a writing manually signed by Fortinet's General Counsel can be a binding commitment of Fortinet to Fortinet's customers or partners. Thank you. *** |
From: Val S. <va...@nv...> - 2011-09-23 22:39:52
|
Anand, sorry, the curl-loader only works on Linux as it says on its web-site. /Val ________________________________ From: "kum...@gm..." <kum...@gm...> To: cur...@li... Sent: Fri, September 23, 2011 1:36:45 PM Subject: Re: Curl Loader works with Windows operating system ? Hello, >I have a Windows 7 PC and was wondering if Curl Loader works with Windows >operating system ? > >Please let me know if I need to download any additional softwares to run Curl >Loader on a windows machine. > >Thanks, >Anand > |
From: <kum...@gm...> - 2011-09-23 20:36:51
|
> Hello, > I have a Windows 7 PC and was wondering if Curl Loader works with Windows > operating system ? > > Please let me know if I need to download any additional softwares to run > Curl Loader on a windows machine. > > Thanks, > Anand > |
From: Robert I. <cor...@gm...> - 2011-07-06 20:18:56
|
Hi Aron, On Wed, Jul 6, 2011 at 10:58 PM, Bellorado, Aron <abe...@ve...> wrote: > Using the simple curl-loader configuration below, the FTP upload of a 6 MB > file is extremely slow with curl-loader transmitting only a few packets per > second during the file upload. As the attached packet capture shows, after > receiving the TCP ACK’s for the FTP data uploaded to the server > 10.1.111.170, curl-loader waits 1 second before transmitting the next 2 > packets of FTP data. After receiving the TCP ACK’s for this uploaded data, > curl-loader again waits 1 second before transmitting the next 2 packets of > data and so on. I am not sure why curl-loader is waiting 1 second in > between sending the FTP data packets. This makes for an extremely slow > upload. I am running version 0.52. Thoughts? > > ########### GENERAL SECTION ################## > BATCH_NAME=clCfg-ftpUpload.cfg > CLIENTS_NUM_MAX=1 > CLIENTS_NUM_START=0 > CLIENTS_RAMPUP_INC=50 > INTERFACE=eth4 > NETMASK=255.255.0.0 > IP_ADDR_MIN=10.1.111.100 > IP_ADDR_MAX=10.1.111.100 > CYCLES_NUM=1 > URLS_NUM=1 > > ########### URL SECTION ################## > > URL=ftp://public%3Apublic:public@10.1.111.170:2021/bucket/file-ftpUpload.1 > URL_SHORT_NAME=shortName-file-ftpUpload.1 > UPLOAD_FILE="6MBytes_asciiRandom.txt" > TIMER_URL_COMPLETION=0 > TIMER_AFTER_URL_SLEEP=0 > > > Aron > http://p.sf.net/sfu/splunk-d2d-c2 curl-loader is using libcurl for FTP stack. I mem there was some patching in curl related to uploading that was done rather recently. Try to get the latest version of curl sources, place it to packages subdir of curl-loader, correct the version in our Makefile, and make cleanall make debug=0 optimize=1 Take care, -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Robert I. <cor...@gm...> - 2011-06-08 20:09:58
|
Hi Aron, On Wed, Jun 8, 2011 at 9:34 PM, Bellorado, Aron <abe...@ve...>wrote: > > When attempting to upload a file via HTTP PUT that is several GB’s in size > (specifically 2839783616 bytes) as shown in the URL entry from the > configuration file below, curl-loader generates its HTTP PUT message with a > Content-Length header value of -1455183680. I assume this is an issue with > size of the variable holding the file size. I also tried including a > “HEADER=” line in the config file to overwrite the ‘Content-Length’ header > to the appropriate value, but curl-loader formats the HTTP PUT message with > 2 Content-Length headers; one with the invalid value and one with the > specified value. I am running curl-loader version .47. Any help would be > appreciated. > > URL= > http://10.1.111.173:8080/v1/AUTH_PUBLIC/bucket/file-101111173-movie1Video-PUT.1 > URL_SHORT_NAME=file-101111173-movie1Video-PUT.1 > REQUEST_TYPE=PUT > UPLOAD_FILE=/tmp/mymovie.mpg > TIMER_URL_COMPLETION=120000 > TIMER_AFTER_URL_SLEEP=4000 > > > Aron > > Please, upgrade to the latest version, since this might be an issue with LARGE_FILE support. If still the problem persists I will dig into libcurl to file if any issues with support of large files for PUT. -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Bellorado, A. <abe...@ve...> - 2011-06-08 18:47:18
|
When attempting to upload a file via HTTP PUT that is several GB's in size (specifically 2839783616 bytes) as shown in the URL entry from the configuration file below, curl-loader generates its HTTP PUT message with a Content-Length header value of -1455183680. I assume this is an issue with size of the variable holding the file size. I also tried including a "HEADER=" line in the config file to overwrite the 'Content-Length' header to the appropriate value, but curl-loader formats the HTTP PUT message with 2 Content-Length headers; one with the invalid value and one with the specified value. I am running curl-loader version .47. Any help would be appreciated. URL=http://10.1.111.173:8080/v1/AUTH_PUBLIC/bucket/file-101111173-movie1Video-PUT.1 URL_SHORT_NAME=file-101111173-movie1Video-PUT.1 REQUEST_TYPE=PUT UPLOAD_FILE=/tmp/mymovie.mpg TIMER_URL_COMPLETION=120000 TIMER_AFTER_URL_SLEEP=4000 Aron |
From: Robert I. <cor...@gm...> - 2011-05-28 17:33:56
|
Hi Taras, Sorry, only at linux. Please, don't waiste your time at Free-BSD. Regards, Robert On Sat, May 28, 2011 at 8:36 AM, Taras Kurdyna <tku...@gm...> wrote: > CURL-LOADER VERSION: 0.53, released March 11 2011 > > HW DETAILS: CPU/S and memory are must: > > LINUX DISTRIBUTION and KERNEL (uname -r): 7.2-RELEASE-p2 > > GCC VERSION (gcc -v): > Using built-in specs. > Target: i386-undermydesk-freebsd > Configured with: FreeBSD/i386 system compiler > Thread model: posix > gcc version 4.2.1 20070719 [FreeBSD] > > > COMPILATION AND MAKING OPTIONS (if defaults changed): no changes from defaults > > COMMAND-LINE: sudo gmake > > CONFIGURATION-FILE (The most common source of problems): > Place the file inline here: > Still at installation step > > DOES THE PROBLEM AFFECT: > COMPILATION? Yes > LINKING? No > EXECUTION? No > OTHER (please specify)? > Have you run $make cleanall prior to $make ? YES > > > DESCRIPTION: > > > make -C /data/home/tkurdyna/curl-loader-0.53/build/curl; make -C > /data/home/tkurdyna/curl-loader-0.53/build/curl install; > Making all in lib > make all-am > if /bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H > -I../include/curl -I../include -I../../../packages/curl/include > -I../lib -I../../../packages/curl/lib > -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include > -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl > -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 > -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT > libcurl_la-file.lo -MD -MP -MF ".deps/libcurl_la-file.Tpo" -c -o > libcurl_la-file.lo `test -f 'file.c' || echo > '../../../packages/curl/lib/'`file.c; then mv -f > ".deps/libcurl_la-file.Tpo" ".deps/libcurl_la-file.Plo"; else rm -f > ".deps/libcurl_la-file.Tpo"; exit 1; fi > libtool: compile: gcc -DHAVE_CONFIG_H -I../include/curl -I../include > -I../../../packages/curl/include -I../lib -I../../../packages/curl/lib > -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include > -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl > -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 > -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT > libcurl_la-file.lo -MD -MP -MF .deps/libcurl_la-file.Tpo -c > ../../../packages/curl/lib/file.c -o libcurl_la-file.o > In file included from ../../../packages/curl/lib/cookie.h:34, > from ../../../packages/curl/lib/urldata.h:69, > from ../../../packages/curl/lib/file.c:74: > ../include/curl/curl.h:35:61: error: curlrules.h: No such file or directory > *** Error code 1 > > Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl/lib. > *** Error code 1 > > Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl/lib. > *** Error code 1 > > Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl. 
> Making install in lib > if /bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H > -I../include/curl -I../include -I../../../packages/curl/include > -I../lib -I../../../packages/curl/lib > -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include > -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl > -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 > -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT > libcurl_la-file.lo -MD -MP -MF ".deps/libcurl_la-file.Tpo" -c -o > libcurl_la-file.lo `test -f 'file.c' || echo > '../../../packages/curl/lib/'`file.c; then mv -f > ".deps/libcurl_la-file.Tpo" ".deps/libcurl_la-file.Plo"; else rm -f > ".deps/libcurl_la-file.Tpo"; exit 1; fi > libtool: compile: gcc -DHAVE_CONFIG_H -I../include/curl -I../include > -I../../../packages/curl/include -I../lib -I../../../packages/curl/lib > -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include > -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl > -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 > -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT > libcurl_la-file.lo -MD -MP -MF .deps/libcurl_la-file.Tpo -c > ../../../packages/curl/lib/file.c -o libcurl_la-file.o > In file included from ../../../packages/curl/lib/cookie.h:34, > from ../../../packages/curl/lib/urldata.h:69, > from ../../../packages/curl/lib/file.c:74: > ../include/curl/curl.h:35:61: error: curlrules.h: No such file or directory > *** Error code 1 > > Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl/lib. > *** Error code 1 > > Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl. > gmake: *** [lib/libcurl.a] Error 1 > > ------------------------------------------------------------------------------ > vRanger cuts backup time in half-while increasing security. > With the market-leading solution for virtual backup and recovery, > you get blazing-fast, flexible, and affordable data protection. > Download your free trial now. > http://p.sf.net/sfu/quest-d2dcopy1 > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Taras K. <tku...@gm...> - 2011-05-28 05:36:50
|
CURL-LOADER VERSION: 0.53, released March 11 2011 HW DETAILS: CPU/S and memory are must: LINUX DISTRIBUTION and KERNEL (uname -r): 7.2-RELEASE-p2 GCC VERSION (gcc -v): Using built-in specs. Target: i386-undermydesk-freebsd Configured with: FreeBSD/i386 system compiler Thread model: posix gcc version 4.2.1 20070719 [FreeBSD] COMPILATION AND MAKING OPTIONS (if defaults changed): no changes from defaults COMMAND-LINE: sudo gmake CONFIGURATION-FILE (The most common source of problems): Place the file inline here: Still at installation step DOES THE PROBLEM AFFECT: COMPILATION? Yes LINKING? No EXECUTION? No OTHER (please specify)? Have you run $make cleanall prior to $make ? YES DESCRIPTION: make -C /data/home/tkurdyna/curl-loader-0.53/build/curl; make -C /data/home/tkurdyna/curl-loader-0.53/build/curl install; Making all in lib make all-am if /bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I../include/curl -I../include -I../../../packages/curl/include -I../lib -I../../../packages/curl/lib -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT libcurl_la-file.lo -MD -MP -MF ".deps/libcurl_la-file.Tpo" -c -o libcurl_la-file.lo `test -f 'file.c' || echo '../../../packages/curl/lib/'`file.c; then mv -f ".deps/libcurl_la-file.Tpo" ".deps/libcurl_la-file.Plo"; else rm -f ".deps/libcurl_la-file.Tpo"; exit 1; fi libtool: compile: gcc -DHAVE_CONFIG_H -I../include/curl -I../include -I../../../packages/curl/include -I../lib -I../../../packages/curl/lib -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT libcurl_la-file.lo -MD -MP -MF .deps/libcurl_la-file.Tpo -c ../../../packages/curl/lib/file.c -o libcurl_la-file.o In file included from ../../../packages/curl/lib/cookie.h:34, from ../../../packages/curl/lib/urldata.h:69, from ../../../packages/curl/lib/file.c:74: ../include/curl/curl.h:35:61: error: curlrules.h: No such file or directory *** Error code 1 Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl/lib. *** Error code 1 Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl/lib. *** Error code 1 Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl. 
Making install in lib if /bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I../include/curl -I../include -I../../../packages/curl/include -I../lib -I../../../packages/curl/lib -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT libcurl_la-file.lo -MD -MP -MF ".deps/libcurl_la-file.Tpo" -c -o libcurl_la-file.lo `test -f 'file.c' || echo '../../../packages/curl/lib/'`file.c; then mv -f ".deps/libcurl_la-file.Tpo" ".deps/libcurl_la-file.Plo"; else rm -f ".deps/libcurl_la-file.Tpo"; exit 1; fi libtool: compile: gcc -DHAVE_CONFIG_H -I../include/curl -I../include -I../../../packages/curl/include -I../lib -I../../../packages/curl/lib -I/data/home/tkurdyna/curl-loader-0.53/build/c-ares/c-ares-1.7.4/include -I/usr/include/openssl/include -I/usr/include/openssl/include/openssl -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -O0 -DCURL_MAX_WRITE_SIZE=4096 -g0 -Wno-system-headers -MT libcurl_la-file.lo -MD -MP -MF .deps/libcurl_la-file.Tpo -c ../../../packages/curl/lib/file.c -o libcurl_la-file.o In file included from ../../../packages/curl/lib/cookie.h:34, from ../../../packages/curl/lib/urldata.h:69, from ../../../packages/curl/lib/file.c:74: ../include/curl/curl.h:35:61: error: curlrules.h: No such file or directory *** Error code 1 Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl/lib. *** Error code 1 Stop in /data/home/tkurdyna/curl-loader-0.53/build/curl. gmake: *** [lib/libcurl.a] Error 1 |
From: Ken B. <ken...@ma...> - 2011-05-04 17:44:21
|
We have a server we need to test with close to 100k SSL connections, but in our actual production implementation, each connection will be made with it's own unique cert. We don't have to have that level for our load testing, but we'd like to at least use a cert rather than just connecting as anon. Is this possible with the tool as is? It doesn't appear to be, based on the TODO list. Thanks in advance for any help. --Ken Bowen |
From: Ron P. <ro...@kl...> - 2011-04-15 07:29:19
|
Hi Robert, Awesome! Thanks for the great tool. Ron On Fri, Apr 15, 2011 at 2:44 AM, Robert Iakobashvili <cor...@gm...> wrote: > Hi Ron, > > On Thu, Apr 14, 2011 at 9:31 PM, Ron Panduwana <ro...@kl...> > wrote: >> >> CURL-LOADER VERSION: 0.52, June 13, 2010 >> >> HW DETAILS: CPU/S and memory are must: Athlon64 X2, 2 GB RAM >> >> LINUX DISTRIBUTION and KERNEL (uname -r): Ubuntu 10.10 2.6.35-22-generic >> >> GCC VERSION (gcc -v): gcc version 4.4.5 (Ubuntu/Linaro 4.4.4-14ubuntu5) >> >> COMPILATION AND MAKING OPTIONS (if defaults changed): make >> >> COMMAND-LINE: make >> >> CONFIGURATION-FILE (The most common source of problems): not changed >> >> DOES THE PROBLEM AFFECT: >> COMPILATION? Yes >> LINKING? Yes >> EXECUTION? No >> Have you run $make cleanall prior to $make ? Yes >> >> DESCRIPTION: >> gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 >> -D_FILE_OFFSET_BITS=64 -fomit-frame-pointer -O3 -ffast-math >> -finline-functions -funroll-all-loops -finline-limit=1000 -mmmx -msse >> -foptimize-sibling-calls -I. -I./inc -I/usr//include -c -o >> obj/parse_conf.o parse_conf.c >> parse_conf.c: In function ‘read_callback’: >> parse_conf.c:3894: error: conflicting types for ‘pread’ >> /usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here >> make: *** [obj/parse_conf.o] Error 1 >> >> QUESTION/ SUGGESTION/ PATCH: Thanks :) >> > > Thank you for using the PRF form. > It was fixed in the newer version 1.53. > Take care! > > -- > Regards, > Robert Iakobashvili, Ph.D. > > Home: http://www.ghotit.com > Blog: http://dyslexia-blog.ghotit.com > Twitter: http://twitter.com/ghotit > Facebook: http://facebook.com/ghotit > ...................................................................... > Ghotit Dyslexia > Assistive technology that understands you > ...................................................................... > > ------------------------------------------------------------------------------ > Benefiting from Server Virtualization: Beyond Initial Workload > Consolidation -- Increasing the use of server virtualization is a top > priority.Virtualization can reduce costs, simplify management, and improve > application availability and disaster protection. Learn more about boosting > the value of server virtualization. http://p.sf.net/sfu/vmware-sfdev2dev > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > |
From: Robert I. <cor...@gm...> - 2011-04-14 19:44:26
|
Hi Ron, On Thu, Apr 14, 2011 at 9:31 PM, Ron Panduwana <ro...@kl...>wrote: > CURL-LOADER VERSION: 0.52, June 13, 2010 > > HW DETAILS: CPU/S and memory are must: Athlon64 X2, 2 GB RAM > > LINUX DISTRIBUTION and KERNEL (uname -r): Ubuntu 10.10 2.6.35-22-generic > > GCC VERSION (gcc -v): gcc version 4.4.5 (Ubuntu/Linaro 4.4.4-14ubuntu5) > > COMPILATION AND MAKING OPTIONS (if defaults changed): make > > COMMAND-LINE: make > > CONFIGURATION-FILE (The most common source of problems): not changed > > DOES THE PROBLEM AFFECT: > COMPILATION? Yes > LINKING? Yes > EXECUTION? No > Have you run $make cleanall prior to $make ? Yes > > DESCRIPTION: > gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 > -D_FILE_OFFSET_BITS=64 -fomit-frame-pointer -O3 -ffast-math > -finline-functions -funroll-all-loops -finline-limit=1000 -mmmx -msse > -foptimize-sibling-calls -I. -I./inc -I/usr//include -c -o > obj/parse_conf.o parse_conf.c > parse_conf.c: In function ‘read_callback’: > parse_conf.c:3894: error: conflicting types for ‘pread’ > /usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here > make: *** [obj/parse_conf.o] Error 1 > > QUESTION/ SUGGESTION/ PATCH: Thanks :) > > Thank you for using the PRF form. It was fixed in the newer version 1.53. Take care! -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Ron P. <ro...@kl...> - 2011-04-14 19:38:29
|
CURL-LOADER VERSION: 0.52, June 13, 2010 HW DETAILS: CPU/S and memory are must: Athlon64 X2, 2 GB RAM LINUX DISTRIBUTION and KERNEL (uname -r): Ubuntu 10.10 2.6.35-22-generic GCC VERSION (gcc -v): gcc version 4.4.5 (Ubuntu/Linaro 4.4.4-14ubuntu5) COMPILATION AND MAKING OPTIONS (if defaults changed): make COMMAND-LINE: make CONFIGURATION-FILE (The most common source of problems): not changed DOES THE PROBLEM AFFECT: COMPILATION? Yes LINKING? Yes EXECUTION? No Have you run $make cleanall prior to $make ? Yes DESCRIPTION: gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -fomit-frame-pointer -O3 -ffast-math -finline-functions -funroll-all-loops -finline-limit=1000 -mmmx -msse -foptimize-sibling-calls -I. -I./inc -I/usr//include -c -o obj/parse_conf.o parse_conf.c parse_conf.c: In function ‘read_callback’: parse_conf.c:3894: error: conflicting types for ‘pread’ /usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here make: *** [obj/parse_conf.o] Error 1 QUESTION/ SUGGESTION/ PATCH: Thanks :) |
From: Suviena A. <su...@gm...> - 2011-04-13 21:07:51
|
hi Thanks for helping, i was using the example conf files present in the package.... the files are executed but there are no log files or txt files or anything displayed on the terminal, i am using centos. On Wed, Apr 13, 2011 at 9:49 AM, Suviena Agarwal <su...@gm...> wrote: > hi > i was able to figure that out. when i am running my conf file.. why am i > not able to see the output... the progress of the transfer. I want a log of > that.. I couldnt also ind the text file the output is supposed to be stored > in. > Thanks > Suviena > > On Wed, Apr 13, 2011 at 8:27 AM, Suviena Agarwal <su...@gm...>wrote: > >> hi >> i was trying to build curl loader on my machine by remote login.. but it >> gives me this error.. what can i do to fix it... >> >> bash-3.2$ tar zxfv curl-loader-0.53.tar.bz2 >> >> gzip: stdin: not in gzip format >> tar: Child returned status 1 >> tar: Error exit delayed from previous errors >> >> >> please help. >> >> thanks >> suviena >> > > |
From: Suviena A. <su...@gm...> - 2011-04-13 16:49:49
|
hi i was able to figure that out. when i am running my conf file.. why am i not able to see the output... the progress of the transfer. I want a log of that.. I couldnt also ind the text file the output is supposed to be stored in. Thanks Suviena On Wed, Apr 13, 2011 at 8:27 AM, Suviena Agarwal <su...@gm...> wrote: > hi > i was trying to build curl loader on my machine by remote login.. but it > gives me this error.. what can i do to fix it... > > bash-3.2$ tar zxfv curl-loader-0.53.tar.bz2 > > gzip: stdin: not in gzip format > tar: Child returned status 1 > tar: Error exit delayed from previous errors > > > please help. > > thanks > suviena > |
From: Robert I. <cor...@gm...> - 2011-04-13 15:59:51
|
Hi, use instead z j tar jxfv curl-loader-0.53.tar.bz2 On Wed, Apr 13, 2011 at 6:27 PM, Suviena Agarwal <su...@gm...> wrote: > hi > i was trying to build curl loader on my machine by remote login.. but it > gives me this error.. what can i do to fix it... > > bash-3.2$ tar zxfv curl-loader-0.53.tar.bz2 > > gzip: stdin: not in gzip format > tar: Child returned status 1 > tar: Error exit delayed from previous errors > > > please help. > > thanks > suviena > > > ------------------------------------------------------------------------------ > Forrester Wave Report - Recovery time is now measured in hours and minutes > not days. Key insights are discussed in the 2010 Forrester Wave Report as > part of an in-depth evaluation of disaster recovery service providers. > Forrester found the best-in-class provider in terms of services and vision. > Read this report now! http://p.sf.net/sfu/ibm-webcastpromo > _______________________________________________ > curl-loader-devel mailing list > cur...@li... > https://lists.sourceforge.net/lists/listinfo/curl-loader-devel > > -- Regards, Robert Iakobashvili, Ph.D. Home: http://www.ghotit.com Blog: http://dyslexia-blog.ghotit.com Twitter: http://twitter.com/ghotit Facebook: http://facebook.com/ghotit ...................................................................... Ghotit Dyslexia Assistive technology that understands you ...................................................................... |
From: Suviena A. <su...@gm...> - 2011-04-13 15:27:52
|
hi i was trying to build curl loader on my machine by remote login.. but it gives me this error.. what can i do to fix it... bash-3.2$ tar zxfv curl-loader-0.53.tar.bz2 gzip: stdin: not in gzip format tar: Child returned status 1 tar: Error exit delayed from previous errors please help. thanks suviena |