rabbit-proxy-users Mailing List for RabbIT proxy (Page 41)
From: Dave <tec...@ml...> - 2004-06-23 16:51:44
----- Original Message -----
From: "Samuel Hill" <Sam...@Co...>
To: <rab...@li...>
Sent: Wednesday, June 23, 2004 7:00 AM
Subject: [Rabbit-proxy-users] Failure to log in...

> When going to the following page it is supposed to ask for a login but
> does not prompt for one.
> http://206.159.167.148/jbcintranet/
>
> Avoiding the proxy works fine.
>
> Sam

If you are trying to access the management screens, you need to make sure that the DNS entries match the hostname for the computer, and that the browser you are using is set to use the proxy.

Dave
From: Samuel H. <Sam...@Co...> - 2004-06-23 14:01:23
When going to the following page it is supposed to ask for a login but does not prompt for one.

http://206.159.167.148/jbcintranet/

Avoiding the proxy works fine.

Sam
From: Robert O. <ro...@kh...> - 2004-06-22 16:44:07
Hello! I have released 2.0.32 now. Please report bugs and suggestions for the next version.

You can download it from: http://www.khelekore.org/rabbit/
Changelog available at: http://www.khelekore.org/rabbit/changelog.shtml

Have fun
/robo
From: Robert O. <ro...@kh...> - 2004-06-20 13:38:13
Hello! As I said before, I plan to make the next release soon. There is a beta of 2.0.32 available from http://www.khelekore.org/rabbit/RabbIT2.0.32-pre8.tar.gz

Any help with final testing is appreciated. The 2.0.32 release is mostly memory optimization so that rabbit can handle big caches. There are a few other fixes in there as well, like HTTP/0.9 handling, ssl proxy chaining, rejected client handling and admin page handling.

Have fun.
/robo
From: Robert O. <ro...@di...> - 2004-06-09 15:30:08
Hello. The khelekore.org site is currently down. It will be back up soon (hopefully on Thursday); the ADSL modem seems to have gone kaputt. The sf.net site still works and I read the mailing list, but a bit less often (since I currently only have network access at work).

Rabbit proxy is still in development; the next release will probably come shortly after the site is up again.

Sam: do you have any news about the pre6 version?

/robo
From: Samuel H. <Sam...@Co...> - 2004-06-04 18:19:48
Maybe it is just that his server is down. I could offer hosting if I could get a hold of him. The mail and web server are the same one.

Sam

-----Original Message-----
From: rab...@li... [mailto:rab...@li...] On Behalf Of Tom Grove
Sent: Friday, June 04, 2004 9:16 AM
To: rab...@li...
Subject: Re: [Rabbit-proxy-users] Site/Files

I haven't been able to get to the site either. This is not good because, like you, I am using it in a production environment. If he is reading these and has plans of stopping production, please give someone the okay to continue development. Thanks.

-TG

>I have been unable to reach via email or by the web page the author and
>files for this project. This is a week now.
>
>Is it just me?
>
>Sam
From: Tom G. <tg...@wi...> - 2004-06-04 13:27:01
I haven't been able to get to the site either. This is not good because, like you, I am using it in a production environment. If he is reading these and has plans of stopping production, please give someone the okay to continue development. Thanks.

-TG

========================================
>I have been unable to reach via email or by the web page the author and
>files for this project.
>This is a week now.
>
>Is it just me?
>
>Sam
From: Samuel H. <Sam...@Co...> - 2004-06-04 12:45:42
I have been unable to reach via email or by the web page the author and files for this project. This is a week now.

Is it just me?

Sam
From: Costa T. <kt...@ho...> - 2004-05-23 18:50:11
> Ok, it seems that proxy chaining and ssl did not work
> well in rabbit, I have created a simple fix.
>
> http://www.khelekore.org/rabbit/RabbIT2.0.32-pre5.tar.gz
>
> Please help test it.
> It works fine for me with two rabbit proxies in a chain.
>
> If it fails for you, can you give me the network traffic
> that rabbit sends and receives? (using ethereal or something
> similar).

It seems to work fine now. Thanks!

Costa
From: Robert O. <ro...@kh...> - 2004-05-23 16:57:27
Costa Tsaousis wrote:
> Since you mentioned "my setup", I tested HTTPS with and without a back-end
> proxy (proxyhost, proxyport). It seems that the problem appears only when
> RabbIT2 is configured to use another proxy. I verified that if RabbIT2
> is standalone, HTTPS is working properly.

Ok, it seems that proxy chaining and ssl did not work well in rabbit; I have created a simple fix.

http://www.khelekore.org/rabbit/RabbIT2.0.32-pre5.tar.gz

Please help test it. It works fine for me with two rabbit proxies in a chain.

If it fails for you, can you give me the network traffic that rabbit sends and receives? (Using ethereal or something similar.)

/robo
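As a sketch of one way to capture that traffic (the interface name and the two ports are assumptions; use whatever ports your rabbit instances actually listen on, 9666 being the usual default):

  tcpdump -i eth0 -s 0 -w rabbit-chain.pcap port 9666 or port 9667

The resulting rabbit-chain.pcap file can then be opened in ethereal for inspection.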
From: Robert O. <ro...@kh...> - 2004-05-23 15:19:34
Costa Tsaousis wrote:
> I am testing RabbIT2 for providing my users with an Internet Accelerator.
> I have about 10,000 modems to serve

Ok, bigger than I expected.

> It is not necessary to chain proxy SSL, but how
> can I tell this to RabbIT2?

Currently you can not, but you can tell the clients to use another proxy for ssl. Having a proxy chain for ssl introduces latency, and that is one of the things you want to minimize for a proxy accelerator (that is why you have keepalive/persistent connections in HTTP/1.1). Since none of the proxies in the chain can cache the content, it does not make much sense to have a chain for ssl. Client setup instructions may be simpler and your firewall setup may also be easier. I will look more at rabbit and see if I can fix the problem you have, so hopefully you won't have to tell rabbit anything.

> Up to now, I plan the whole setup as follows: ...
> What do you think?

Well, your setup makes sense. It is a bigger setup than I expected, and for big sites you want redundancy and failover, so having squid for caching and load balancing seems wise. Having 1.5 TB of cache for rabbit is also something I would recommend against (rabbit was not designed for that size; it was designed to handle a couple of GB, and may work for a bit more, but that is untested). So having the netcaches behind rabbit seems like a good idea. And to be honest, I have not handled that many clients myself with any proxy, so you are probably better qualified to design the setup.

Have fun
/robo
From: Costa T. <kt...@ho...> - 2004-05-23 09:01:13
> What proxies are you trying to chain? and why?

I am testing RabbIT2 for providing my users with an Internet Accelerator. I have about 10,000 modems to serve (not all of them will use acceleration, but many will do - it will be a free service). There is a large cluster of Network Appliance "NetCache C6XXX" proxies as the core proxies for the network. I should use these for all outgoing connections, since most probably the objects RabbIT2 will try to fetch will already be there.

It is not necessary to chain proxy SSL, but how can I tell this to RabbIT2?

Up to now, I plan the whole setup as follows:

a. The back-end proxies will be the NetCaches, since these are highly optimized caches and they have a huge cache (1.5 TB).
b. In front of these, there will be an array of RabbIT2 machines in compression-only mode (no cache).
c. In front of RabbIT2, there will be a single cluster of squid servers configured to load balance the RabbIT2 servers; these will provide the cache.

I plan to do it this way because:

1. I trust squid for its reliability. In this setup, if a RabbIT2 server fails, squid is smart enough to re-route its requests to another RabbIT2 server.
2. Squid supports ICP for inter-squid communication about the objects in the cache and the objects being fetched. This will ensure that I will be compressing each object once.
3. I will be able to add as many RabbIT2 machines as needed without worrying about load balancing them. Squid will do it.

What do you think?

Costa
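For reference, a minimal squid.conf sketch of the front-end layer Costa describes in point (c). The hostnames are made up, and the RabbIT listen port (9666 is the usual default) must match the actual setup:

  cache_peer rabbit1.example.net parent 9666 0 no-query round-robin
  cache_peer rabbit2.example.net parent 9666 0 no-query round-robin
  never_direct allow all

Here no-query turns off ICP towards the parents (RabbIT does not speak ICP), round-robin spreads requests across the RabbIT2 machines, and never_direct forces squid to always go through a parent instead of fetching directly.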
From: Robert O. <ro...@kh...> - 2004-05-23 08:32:39
Costa Tsaousis wrote:
> Since you mentioned "my setup", I tested HTTPS with and without a back-end
> proxy (proxyhost, proxyport). It seems that the problem appears only when
> RabbIT2 is configured to use another proxy. I verified that if RabbIT2
> is standalone, HTTPS is working properly.

Ok, so rabbit fails to proxy chain ssl. I will see if I can verify this, and if I can, it ought to be easy to fix.

However, why do you have to proxy chain ssl? None of the proxies can read the data, so all proxies will only tunnel the connection. What proxies are you trying to chain? and why? I would guess that you have a squid proxy. Note that rabbit is better at HTTP/1.1 than squid, so I would expect rabbit to handle the cache _slightly_ better (squid is a _good_ proxy).

/robo
From: Costa T. <kt...@ho...> - 2004-05-22 23:25:43
> Rabbit can proxy https requests, but it can not
> filter the streams (since that requires cracking the
> encryption).

Yes, I know.

> > Now RabbIT2 logs a CONNECT but the browser gives a DNS error.
> > With other proxies, or directly, the URLs tried are working.
>
> I am not sure I understand what error you get
> here. Last time I checked both https proxying and
> proxy chaining worked correctly. It would be
> easier if you told me more about the setup
> you are using.

For this test I am just using a RabbIT 2.0.32-pre4 as proxy. Let's say IE6 is trying to fetch:

https://digitalid.verisign.com/

When IE6 is attempting to load this, I see in the RabbIT2 logs:

X.X.X.X - - 22/May/2004:22:16:02 GMT "CONNECT digitalid.verisign.com:443 HTTP/1.0" 200 -

but instead of delivering the page, IE6 just displays:

---
The page cannot be displayed
The page you are looking for is currently unavailable. The Web site might be experiencing technical difficulties, or you may need to adjust your browser settings.
... blah blah ...
Cannot find server or DNS Error
Internet Explorer
---

In my config there is:

allowSSL=443,444,445

Since you mentioned "my setup", I tested HTTPS with and without a back-end proxy (proxyhost, proxyport). It seems that the problem appears only when RabbIT2 is configured to use another proxy. I verified that if RabbIT2 is standalone, HTTPS is working properly.

Costa
From: Robert O. <ro...@kh...> - 2004-05-22 20:12:08
Costa Tsaousis wrote:
> Is there something to be configured to make HTTPS connections work through
> RabbIT2? I have declared port 443 as HTTPS.

Rabbit can proxy https requests, but it can not filter the streams (since that requires cracking the encryption).

> Now RabbIT2 logs a CONNECT but the browser gives a DNS error.
> With other proxies, or directly, the URLs tried are working.

I am not sure I understand what error you get here. Last time I checked, both https proxying and proxy chaining worked correctly. It would be easier if you told me more about the setup you are using.

/robo
From: Costa T. <kt...@ho...> - 2004-05-22 18:54:15
Is there something to be configured to make HTTPS connections work through RabbIT2? I have declared port 443 as HTTPS.

Now RabbIT2 logs a CONNECT but the browser gives a DNS error. With other proxies, or directly, the URLs tried are working.

Costa
From: Robert O. <ro...@kh...> - 2004-05-20 23:29:04
Costa Tsaousis wrote:
> I am going to have RabbIT2 between two other HTTP caches. How can I make
> it compress but not cache anything at all?
>
> When I use the DontCacheFilter it seems that it does not convert images
> (although it compresses HTMLs).

To recode images rabbit needs a cache. You can set the cache size to 0 though. A 0-sized cache will still create files, and then when the cache cleaner runs it will try to adjust the cache size down to 0 again.

/robo
From: Costa T. <kt...@ho...> - 2004-05-20 23:21:15
Hi, I am going to have RabbIT2 between two other HTTP caches. How can I make it compress but not cache anything at all?

When I use the DontCacheFilter it seems that it does not convert images (although it compresses HTMLs).

Any help is appreciated.

Costa

PS: tested with v2.0.31
From: Sam <Sam...@Co...> - 2004-05-10 01:59:20
Seems to have been a temporary problem. I got a few reports and duplicated it myself, but then it became "solved". Don't forget, I am not using cache with Rabbit. Still cannot for some reason, but I continue to work on the problem.

I still get high, too high, memory usage when using cache. The bigger the cache, the bigger the RAM use (1 GB cache, 256 MB RAM, etc). With cache off and the cleanloop set to 600, I can get a good 24 hours before it reaches 512 MB of RAM. I also tell Rabbit it can use 512 MB of RAM. I could add more RAM, but that would be pointless because I would only gain more time. I have a cron job so that at 4:00 am it will "killall -9 java" and then restart the proxy. Bad, but better than doing it every hour, since it takes a few minutes for the proxy to "go" on startup.

Sam

Robert Olofsson wrote:
> Odd. yahoo mail is one of the things that I always test rabbit with.
> From what I can see, the yahoo.com pages are not cached (at least not when
> mozilla is used). The images on yahoo are cached, but not the main page.
>
> Can you check if yahoo.com is in rabbit's cache? (any page in yahoo).
> Can you check if yahoo.com is in IE's cache? (or remove temporary
> internet files and try again).
>
> It may be that your browser tries a conditional request and does not
> handle the response correctly.
>
> /robo
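A crontab sketch of the nightly restart Sam describes (the start script path is a made-up placeholder; how rabbit is actually launched depends on the installation):

  # restart the proxy at 04:00 every day
  0 4 * * * killall -9 java; sleep 5; /usr/local/rabbit/start-rabbit.sh

As Sam says, this is a blunt instrument: killall -9 java will also take down any other Java process on the box, so on a shared machine you would want to kill only the proxy's own PID.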
From: Robert O. <ro...@kh...> - 2004-05-09 14:23:20
Samuel Hill wrote:
> For example, when you go to yahoo and click on mail I see this in the
> logs...
>
> 63.170.141.40 - - 07/May/2004:15:33:09 GMT "GET http://www.yahoo.com/_ylh=X3oDMTB1M2EzYWFoBF9TAzI3MTYxNDkEdGVzdAMwBHRtcGwDaWUtYmV0YQ--/r/m1 HTTP/1.1" 200 0
>
> But the page never shows up in explorer.
> Before I saw this with images but in this case it is the whole page.

Odd. yahoo mail is one of the things that I always test rabbit with. From what I can see, the yahoo.com pages are not cached (at least not when mozilla is used). The images on yahoo are cached, but not the main page.

I get something like:

"GET http://www.yahoo.com/_ylh=X3oDMTB1c2ZmZzF2BF9TAzI3MTYxNDkEdGVzdAMwBHRtcGwDbnMtYmV0YQ--/r/m1 HTTP/1.1" 200 -

Can you check if yahoo.com is in rabbit's cache? (any page in yahoo).
Can you check if yahoo.com is in IE's cache? (or remove temporary internet files and try again).

It may be that your browser tries a conditional request and does not handle the response correctly.

/robo
From: Samuel H. <Sam...@Co...> - 2004-05-07 15:35:36
For example, when you go to yahoo and click on mail I see this in the logs...

63.170.141.40 - - 07/May/2004:15:33:09 GMT "GET http://www.yahoo.com/_ylh=X3oDMTB1M2EzYWFoBF9TAzI3MTYxNDkEdGVzdAMwBHRtcGwDaWUtYmV0YQ--/r/m1 HTTP/1.1" 200 0

But the page never shows up in explorer. Before I saw this with images, but in this case it is the whole page. There is nothing in the error log.

Sam
From: Luis S. <lso...@gl...> - 2004-05-04 06:22:15
That was it. Setting StrictHTTP=false fixed the problem.

Thanks for your help.

--luis

*********** REPLY SEPARATOR ***********

On 5/4/2004 at 8:10 AM Robert Olofsson wrote:

> Do you run with strict http or not? "StrictHTTP=true"
> I would guess that the web site sends a broken http header that only works if
> you do not use strict http.
> I have not checked yet, but I will; the fact that it works for Sam
> seems to be an indication (he uses StrictHTTP=false).

Luis Soltero, Ph.D., MCS
Director of Software Development
Global Marine Networks, LLC
StarPilot, LLC
Tel: 865-379-8723
Fax: 865-681-5017
E-Mail: lso...@gl...
Web: http://www.globalmarinenet.net
Web: http://www.starpilotllc.com

Wireless E-Mail, Web Hosting, Weather and more...
and StarPilot, the state of the art in navigation computations at your fingertips...
From: Robert O. <ro...@kh...> - 2004-05-04 06:10:31
Luis Soltero wrote:
> any idea why this url fails?
> http://www.ndbc.noaa.gov/station_page.phtml?station=42020

Do you run with strict http or not? "StrictHTTP=true"

I would guess that the web site sends a broken http header that only works if you do not use strict http. I have not checked yet, but I will; the fact that it works for Sam seems to be an indication (he uses StrictHTTP=false).

Using strict http is in some ways better, and it is needed to pass the http test suite, but many web sites send broken http headers...

The normal cause for this is that they have something like a perl program that does this:

print "Content-Type: text/html\n\n";
print "<html......";

This is broken: the http line separator is \r\n, and the above ought to have two line separators after the content type. Both IE and mozilla happily accept the broken format....

/robo
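A corrected sketch of that fragment, for comparison (the second print is just a stand-in for whatever body the script really emits):

  print "Content-Type: text/html\r\n\r\n";
  print "<html>...";

The header section of an HTTP response is terminated by an empty line, and a strict parser expects every line ending there to be \r\n, so the header block should end with \r\n\r\n rather than \n\n.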
From: Sam <Sam...@Co...> - 2004-05-04 03:05:10
What version of everything are you using? What version of Rabbit and what version of java? It works fine in my setup, but I have a pretty new version.

Sam

Luis Soltero wrote:
> any idea why this url fails?
>
> http://www.ndbc.noaa.gov/station_page.phtml?station=42020
>
> thanks,
>
> --luis
From: Luis S. <lso...@gl...> - 2004-05-04 02:58:49
any idea why this url fails?

http://www.ndbc.noaa.gov/station_page.phtml?station=42020

thanks,

--luis

Luis Soltero, Ph.D., MCS
Director of Software Development
Global Marine Networks, LLC
StarPilot, LLC
Tel: 865-379-8723
Fax: 865-681-5017
E-Mail: lso...@gl...
Web: http://www.globalmarinenet.net
Web: http://www.starpilotllc.com

Wireless E-Mail, Web Hosting, Weather and more...
and StarPilot, the state of the art in navigation computations at your fingertips...