Re: [Rabbit-proxy-development] Problems when filtering sites
From: Rick L. <ri...@le...> - 2006-09-25 18:46:57
On Fri, 2006-09-22 at 15:04 -0700, rab...@li... wrote:

> Rabbit will not filter pages that are already compressed. This is
> probably your problem, but since you do not give any example site it is
> hard to say.
> Adding unpacking+filtering+repacking to FilterHandler is easy, but
> it is not part of rabbit, at least not yet.

Is it not possible for Rabbit to tell the web server that it cannot accept compressed HTML? Then Rabbit only has to do filtering+repacking. That might even be faster, if Rabbit has a fast connection and is caching the page. Robo, please correct me.

> I guess I have to figure out what direction I want rabbit to go:
> full filtering proxy or web accelerator proxy. For the first I
> really ought to add that unpack+filtering to FilterHandler.
> If I take number 2 instead, I am not sure that I want such features,
> since they would go directly against rabbit's goal (introducing extra
> latency is not making surfing faster).

The internet is a big place, and I am all in favour of filtering out the 'bad' bits. Let's not discuss which bits are bad right now; those of us who are parents will know what I mean. Maybe we should run DansGuardian upstream of Rabbit, but it might be simpler to do the filtering in Rabbit itself.

cheers
--
Rick
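
P.S. To make the header idea concrete, here is a minimal sketch in Java. This is not actual rabbit code; the Map-based header representation and the class/method names are just assumptions for illustration. Forcing Accept-Encoding to "identity" asks the origin server for an uncompressed body that the proxy can filter directly (RFC 2616, section 14.3):

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class AcceptEncodingStripper {
        // Force the identity encoding on an outgoing request so the
        // origin server replies with uncompressed HTML. The Map here
        // is only a stand-in for however the proxy stores headers.
        static void forceIdentity(Map<String, String> headers) {
            headers.put("Accept-Encoding", "identity");
        }

        public static void main(String[] args) {
            Map<String, String> headers = new LinkedHashMap<String, String>();
            headers.put("Host", "example.com");
            headers.put("Accept-Encoding", "gzip, deflate");
            forceIdentity(headers);
            System.out.println(headers);
            // prints: {Host=example.com, Accept-Encoding=identity}
        }
    }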
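And a rough sketch of the unpack+filter+repack alternative that robo described, using java.util.zip; here filterHtml is a hypothetical stand-in for the real filtering that FilterHandler would apply:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    public class GzipRefilter {
        // Hypothetical stand-in for rabbit's real HTML filters.
        static byte[] filterHtml(byte[] html) {
            return html; // no-op placeholder
        }

        // Unpack a gzipped body, filter the HTML, and pack it again.
        static byte[] refilter(byte[] gzipped) throws IOException {
            // 1. Unpack.
            GZIPInputStream in =
                new GZIPInputStream(new ByteArrayInputStream(gzipped));
            ByteArrayOutputStream plain = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1)
                plain.write(buf, 0, n);
            in.close();
            // 2. Filter the uncompressed HTML.
            byte[] filtered = filterHtml(plain.toByteArray());
            // 3. Repack.
            ByteArrayOutputStream packed = new ByteArrayOutputStream();
            GZIPOutputStream out = new GZIPOutputStream(packed);
            out.write(filtered);
            out.close();
            return packed.toByteArray();
        }
    }

The two extra gzip passes are where the added latency comes from, which I take to be robo's worry for the web-accelerator case.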