Re: [Rabbit-proxy-development] Problems when filtering sites
From: Robert O. <ro...@kh...> - 2006-09-30 19:17:51
Robert Olofsson wrote:
> Rick Leir wrote:
>> Is it not possible for Rabbit to tell the web server that it cannot
>> accept compressed html? Then Rabbit just does filtering+repacking.
>> That might be faster, if Rabbit has a fast connection and it is caching
>> the page. Robo, please correct me.

I did write one such filter: NoGZipEncoding. I have only tested it lightly,
but it seems to work; at least it makes google return non-gzipped data.
That filter is not on by default, so remember to add it to httpinfilters.

There is a first pre-release of 3.6 on the site, please help test it.

Have fun.
/robo
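
For anyone trying it out, enabling the filter means appending it to the
httpinfilters list in rabbit.conf. A rough sketch of what that could look
like follows; the section name and the rabbit.filter package path are
assumptions based on how the other bundled filters are usually listed, so
check your own rabbit.conf for the exact names:

    # rabbit.conf: add NoGZipEncoding to the end of the existing
    # httpinfilters chain (assumed class path rabbit.filter.NoGZipEncoding)
    [rabbit.proxy.HttpProxy]
    httpinfilters=rabbit.filter.HttpBaseFilter,rabbit.filter.BlockFilter,rabbit.filter.NoGZipEncoding

After restarting the proxy, responses that would otherwise come back
gzip-compressed should arrive uncompressed, which lets the other filters
see and rewrite the HTML.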