From: Fabian G. <fa...@ya...> - 2005-09-21 04:20:34
Hi,

I want to block requests from some aggressive bots and spiders to my site. So far I've tried subclassing AbstractHttpHandler and providing the following handle() method:

    public void handle(String pathInContext, String pathParams,
                       HttpRequest request, HttpResponse response)
        throws HttpException, IOException
    {
        if (ignoreAddresses.contains(request.getRemoteAddr())) {
            response.sendError(HttpResponse.__403_Forbidden);
            request.setHandled(true);
            return;
        }

        String agent = request.getField(HttpFields.__UserAgent);
        if (agent == null) {
            return; // no User-Agent header, nothing to match against
        }
        for (int i = 0; i < ignoreAgents.length; i++) {
            if (agent.contains(ignoreAgents[i])) {
                response.sendError(HttpResponse.__403_Forbidden);
                request.setHandled(true);
                return;
            }
        }
    }

This way I can filter by IP address or by user agent. I then add the handler to the web application context by adding the following to my XML configuration file:

    <Call name="getContext">
      <Arg>/</Arg>
      <Call name="addHandler">
        <Arg>
          <New class="com.goldengateimages.photobase.PhotobaseHTAccessHandler">
            <Set name="ignoreAgents">
              <Array type="String">
                <Item>BecomeBot</Item>
                <Item>larbin</Item>
                <Item>NG/2.0</Item>
              </Array>
            </Set>
          </New>
        </Arg>
      </Call>
    </Call>

However, this doesn't seem to do anything. My guess is that a previously registered handler handles the request first, so my filter never runs. Any ideas on how I can get this working properly?

Thanks,
- Fabian

Fine Art Prints & Stock Images
http://www.goldengateimages.com/
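The blocking decision itself can be checked outside the container. Below is a minimal plain-Java sketch of the matching logic from the handler above; the class, method, and constant names (BotFilterSketch, shouldBlock, BLOCKED_ADDRESSES, BLOCKED_AGENT_SUBSTRINGS) are illustrative only, not Jetty API, and the blocked address is a documentation-range example:

```java
import java.util.Arrays;
import java.util.List;

// Standalone sketch of the filter's decision logic, independent of Jetty.
// Names here are illustrative, not part of any Jetty API.
public class BotFilterSketch {

    // Hypothetical example address (TEST-NET-1 documentation range).
    static final List<String> BLOCKED_ADDRESSES = Arrays.asList("192.0.2.1");

    // Agent substrings taken from the XML config in the message above.
    static final String[] BLOCKED_AGENT_SUBSTRINGS = { "BecomeBot", "larbin", "NG/2.0" };

    // Returns true if a request from this address/agent should be refused.
    static boolean shouldBlock(String remoteAddr, String userAgent) {
        if (BLOCKED_ADDRESSES.contains(remoteAddr)) {
            return true;
        }
        if (userAgent == null) {
            // Requests may omit the User-Agent header entirely.
            return false;
        }
        for (String s : BLOCKED_AGENT_SUBSTRINGS) {
            if (userAgent.contains(s)) {
                return true;
            }
        }
        return false;
    }
}
```

Testing this predicate in isolation helps separate "the matching logic is wrong" from "the handler never runs", which is the ordering question raised in the message.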