Re: Trying to simulate browser behavior.
From: Pranav D. <pra...@gm...> - 2008-07-02 01:09:00
On Tue, Jul 1, 2008 at 1:06 PM, Robert Iakobashvili <cor...@gm...> wrote:
> Hi Pranav,
>
> On Tue, Jul 1, 2008 at 10:55 PM, Pranav Desai <pra...@gm...> wrote:
>>
>> Hello,
>>
>> I am trying to simulate browser behavior when accessing a front page
>> (e.g. www.cnn.com). From traces I see that a few requests go over the
>> same TCP connection (persistence), and in general a few TCP
>> connections are made to fetch the whole front page.
>
> You can also see the behavior of major browsers like IE-6, IE-7, FF-2, FF-3
> and Safari-3.1.
>>
>> I am trying to use FRESH_CONNECT to create another TCP connection.
>> What I was expecting was that all URLs before the FRESH_CONNECT tag
>> would go over the same connection, and after the URL that has the
>> FRESH_CONNECT tag a new connection would start. So I was expecting to
>> see 2 GETs on the first TCP connection, 6 on the second, and
>> the rest on the third. Basically, I was thinking of the URL list
>> as sequential, with a connection close in between.
>>
>> But that doesn't seem to be the case. There are 3 TCP connections, but one
>> has most of the GETs and the other 2 have one request each (the ones for
>> which the tag is specified).
>>
>> So it seems like curl-loader loads all the URLs with their associated
>> tags and then accesses them randomly. Is that correct? If so, is there a
>> way to get the behavior described above?
>
> FRESH_CONNECT means that in the next cycle the connection should be closed
> and re-established.
>
> What is the behavior that you see with the major browsers mentioned?

In general, most of them will create a bunch of TCP connections and send multiple requests through each connection. I can send you a trace if you like. I am not trying to simulate any particular browser, just the way a browser normally fetches the main page of a website.
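For reference, here is a sketch of the URL section I have in mind, assuming the standard curl-loader batch-file syntax (the URLs and short names are just placeholders, not from my real trace). Per the explanation above, FRESH_CONNECT=1 on a URL makes the connection close and re-establish for that URL on every cycle; it does not split the list into sequential connection groups:

```
########### URL SECTION ###########
URLS_NUM=3

URL=http://www.cnn.com/
URL_SHORT_NAME="front-page"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=0

URL=http://www.cnn.com/css/main.css
URL_SHORT_NAME="css"
REQUEST_TYPE=GET
# Connection closed and re-established for this URL each cycle:
FRESH_CONNECT=1
TIMER_AFTER_URL_SLEEP=0

URL=http://www.cnn.com/js/main.js
URL_SHORT_NAME="js"
REQUEST_TYPE=GET
TIMER_AFTER_URL_SLEEP=0
```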
To put this in context, I am load-testing a proxy and would like to simulate thousands of users opening the main pages of a number of popular websites. What I do is capture a trace on the browser side for a website, from which I get the URLs and the sequence in which the browser fetched them to load the whole page. I also get the number of connections it used for the entire page. With that information I can create a curl-loader conf file with the same URLs and add a few FRESH_CONNECT tags to emulate the new TCP connections. That's how I thought FRESH_CONNECT would work ...

I could just add a bunch of URLs from somewhere into the curl-loader conf and add a few FRESH_CONNECT and TIMER_AFTER_SLEEP tags to the list, and would probably get similar behavior, but I was hoping to replicate the browser as closely as possible.

Thanks for your help.

-- Pranav

> --
> Truly,
> Robert Iakobashvili, Ph.D.
> ......................................................................
> www.ghotit.com
> Assistive technology that understands you
> ......................................................................
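To show what I mean by generating the conf from a trace, here is a rough helper I could use (a sketch only: the function name and the exact tag set are my assumptions; it emits the URL-section tags I believe curl-loader expects, with FRESH_CONNECT=1 on the requests that opened a new connection in the trace):

```python
# Hypothetical helper: turn an ordered list of (url, new_connection) pairs,
# as recovered from a browser-side trace, into a curl-loader URL section.
def make_url_section(urls):
    """urls: list of (url, fresh) tuples; fresh=True marks requests that
    started a new TCP connection in the trace."""
    lines = ["URLS_NUM=%d" % len(urls)]
    for i, (url, fresh) in enumerate(urls):
        lines.append("")
        lines.append("URL=%s" % url)
        lines.append('URL_SHORT_NAME="req-%d"' % i)
        lines.append("REQUEST_TYPE=GET")
        if fresh:
            # Close and re-open the connection for this URL each cycle.
            lines.append("FRESH_CONNECT=1")
        lines.append("TIMER_AFTER_URL_SLEEP=0")
    return "\n".join(lines)

print(make_url_section([
    ("http://www.cnn.com/", True),
    ("http://www.cnn.com/css/main.css", False),
]))
```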