Re: [Rabbit-proxy-development] Problems when filtering sites
From: Robert O. <ro...@kh...> - 2006-09-25 19:35:13
Rick Leir wrote:
> Is it not possible for Rabbit to tell the web server that it cannot
> accept compressed html? Then Rabbit just does filtering+repacking.
> That might be faster, if Rabbit has a fast connection and it is caching
> the page. Robo, please correct me.

That is easy to test and should hopefully work. Whether it ends up faster or slower depends on the bandwidth and latency to the real server, so it is hard to say in advance.

Adding a NoZipFilter that checks the accept-encoding headers and removes the gzip and compress values should be almost trivial to write. Maybe I will add that in a day or two...

Adding unzip+filtering+zipping is also easy; I will try to add that later this week. RabbIT is a spare-time project, so some development is slow.

If any of you have patches available and care to share, then please do.

/robo
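
PS: a rough sketch of the NoZipFilter idea, just to show the header change being discussed. The class and method names below are made up for illustration and are not RabbIT's real filter hooks; a real filter would plug into the proxy's request-header filtering instead of being a standalone helper.

    import java.util.ArrayList;
    import java.util.List;

    // Rough sketch only: strips the gzip and compress codings from an
    // Accept-Encoding header value so the origin server sends plain content.
    // Names are illustrative, not RabbIT's actual filter API.
    public class NoZipSketch {

        public static String stripCompressionTokens (String acceptEncoding) {
            if (acceptEncoding == null)
                return null;
            List<String> kept = new ArrayList<String> ();
            for (String token : acceptEncoding.split (",")) {
                // A token may carry a quality value, e.g. "gzip;q=1.0".
                String coding = token.trim ().split (";")[0].trim ();
                if (coding.equalsIgnoreCase ("gzip")
                    || coding.equalsIgnoreCase ("x-gzip")
                    || coding.equalsIgnoreCase ("compress")
                    || coding.equalsIgnoreCase ("x-compress"))
                    continue;
                kept.add (token.trim ());
            }
            if (kept.isEmpty ())
                return null; // null here means: drop the header entirely
            StringBuilder sb = new StringBuilder ();
            for (int i = 0; i < kept.size (); i++) {
                if (i > 0)
                    sb.append (", ");
                sb.append (kept.get (i));
            }
            return sb.toString ();
        }

        public static void main (String[] args) {
            // prints "deflate"
            System.out.println (stripCompressionTokens ("gzip, deflate, compress;q=0.5"));
        }
    }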
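And a similarly rough sketch of the unzip+filtering+zipping step, using the standard java.util.zip classes. The filterHtml call is a placeholder for whatever HTML filtering the proxy does, and a real proxy would stream the data rather than buffer whole pages in memory.

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    // Rough sketch of unzip + filter + re-zip for a gzip-compressed response body.
    public class RezipSketch {

        // Placeholder for the proxy's normal HTML filtering.
        static byte[] filterHtml (byte[] html) {
            return html;
        }

        static byte[] unzipFilterZip (byte[] gzipped) throws IOException {
            // 1. Decompress the response body.
            InputStream in = new GZIPInputStream (new ByteArrayInputStream (gzipped));
            ByteArrayOutputStream plain = new ByteArrayOutputStream ();
            byte[] buf = new byte[4096];
            int read;
            while ((read = in.read (buf)) != -1)
                plain.write (buf, 0, read);
            in.close ();

            // 2. Run the filters on the uncompressed data.
            byte[] filtered = filterHtml (plain.toByteArray ());

            // 3. Re-compress before sending the page on to the client.
            ByteArrayOutputStream out = new ByteArrayOutputStream ();
            GZIPOutputStream gzOut = new GZIPOutputStream (out);
            gzOut.write (filtered);
            gzOut.close ();
            return out.toByteArray ();
        }
    }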