[mod-security-users] Re: Using Dshield Data to Block Well-Known Web Attackers
From: Ryan B. <rcb...@gm...> - 2006-04-19 20:08:11
Oops, I had to update the subject line - it is attackers and not attacks. -Ryan

On 4/19/06, Ryan Barnett <rcb...@gm...> wrote:
>
> For those of you who are interested in creating ACLs (with Apache or Mod_Security) to block access from well-known web attackers, I thought I would present this small section of info from my book - Preventing Web Attacks with Apache (http://www.amazon.com/gp/product/0321321286/ref=sr_11_1/104-3385017-8973538?%5Fencoding=UTF8)
>
> This is a complementary method to those presented by the GotRoot blacklist data. The data below shows how to use the Apache Deny directive; however, similar Mod_Security rules could be created to block access from these hosts.
>
> I hope this is useful.
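As a rough illustration of what those Mod_Security rules could look like - a sketch only, assuming ModSecurity 1.x SecFilterSelective syntax, with two addresses borrowed from the example report further down and an example action list:

SecFilterEngine On
# Hypothetical hand-written equivalents of "Deny from ..." for two of the
# example Dshield addresses; adjust the action list as needed.
SecFilterSelective REMOTE_ADDR "^218\.83\.155\.79$" "deny,log,status:403"
SecFilterSelective REMOTE_ADDR "^206\.123\.216\.23$" "deny,log,status:403"

A file of such rules could be kept next to the Deny list described below and pulled in with Apache's Include directive in the same way.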
> *Blocking Well-Known Offenders*
>
> Utilization of IP-based block lists has been commonplace for years in combating email abusers. There are many community project sites that make block lists available to the public so that people can download them and then implement access control lists to deny access attempts from these IP addresses/network blocks to their SMTP servers. The data in these lists is effective; however, the lists need to be constantly updated as the spammers shift to new IP addresses.
>
> The Dshield.org web site (www.dshield.org) tracks Internet traffic and calls itself a distributed intrusion detection system. Dshield gathers its information by allowing anyone to submit their firewall and intrusion detection logs. There are client programs for the various security applications that will convert the logs into the correct Dshield format and forward them to the web site. One of the resources available from Dshield is their own block list of the top twenty network blocks that have exhibited suspicious scanning activity - http://feeds.dshield.org/block.txt. While this data does illustrate the fact that these network blocks are conducting suspicious network connections, it does not provide the type of fidelity required to accurately categorize their activities. Are they spammers, or are they brute forcing password-protected sites? We just don't know.
>
> It was this issue that prompted me to contact Johannes Ullrich of Dshield and the SANS Internet Storm Center. I asked him if it would be possible to generate a list of only HTTP/port 80 attackers. At first, he was a bit skeptical of the true value of this information, as web attackers are constantly changing their IP addresses as they compromise more systems or loop through proxies. I agreed that any sort of port 80 block list would have to be dynamic and the hosts identified would only be valid for a short period of time; however, I still believed there was value in this list. I expressed to Johannes that I was looking for a list of web attackers that I could import daily into my Apache server and then create deny rules for these hosts. The real value of using the Dshield information is that they have a much larger view of the Internet than most other individual organizations would have. A Dshield block list would be based on information gathered from across the globe. Think of it as a cyber-based community watch program.
>
> It wasn't until I gave this analogy to Johannes that he finally agreed with me on this concept. I said to imagine that you were in charge of security at a bank. You had the option of posting up the FBI's Top Ten Most Wanted Criminals posters or the FBI's Top Ten Most Wanted Bank Robbers posters. Which one would you choose? Most people would choose the latter, as the bank robbers present the greater threat to the bank. With regards to web security, a block list of port 80 attackers would be more relevant than a block list of generic Internet hooligans. After this exchange, Johannes went ahead and created a PHP web page that would extract the information I desired. Here is the URL - www.dshield.org/topportsource.php?port=80&num=20. You can change the port number if you are interested in services other than HTTP, and you can also change the number of records returned. In the link above, I am querying for the top twenty port 80 attackers. Here is an example report returned by the link.
>
> # Port 80 top 20 records ordered by number of targets hit.
> #
> # compiled Fri, 20 May 2005 03:02:51 +0000
> #
> # columns:
> # Source IP <tab> Targets Hit <tab> Total Records
> #
> # enjoy.
> 218.083.155.079    71199    193929
> 206.123.216.023    65011    118102
> 148.245.122.012    64071    116805
> 064.080.123.138     7724      8262
> 064.080.123.122     4897      5102
> 061.222.211.118     3370      3370
> 219.140.162.215     2192      2192
> 221.230.192.152     1341      1729
> 084.244.002.104     1331      1331
> 062.002.157.178      759      5575
> 213.202.216.156      757       807
> 219.159.102.184      612       627
> 207.044.142.115      586       808
> 063.151.041.210      546       902
> 066.193.175.084      531      1554
> 065.078.035.101      508      1014
> 193.146.045.103      436       870
> 221.201.184.165      421       421
> 216.167.232.087      408      1222
> 217.160.188.180      314       530
>
> We are interested in the first column, as that lists the specific client IP address of the web attacker. I created a quick shell script that automatically downloads an updated list daily using wget and then converts that data into the appropriate Apache Deny directive format. Here is an example of manually running the script, called dshield_blocklist.sh.
>
> *# cat dshield_blocklist.sh*
> #!/bin/sh
>
> /usr/bin/wget "http://www.dshield.org/topportsource.php?port=80&num=20"
>
> for f in `cat topport* | grep -v "#" | awk '{print $1}' | head -20 | sed -e 's/^0//g' -e 's/\.0/\./g' -e 's/\.0/\./g'`
> do
>     echo "Deny from $f" >> /usr/local/apache/conf/blocklist.txt
> done
>
> exit
>
> *# ./dshield_blocklist.sh*
> *# cat /usr/local/apache/conf/blocklist.txt*
> Deny from 218.83.155.79
> Deny from 206.123.216.23
> Deny from 148.245.122.12
> Deny from 64.80.123.138
> Deny from 64.80.123.122
> Deny from 61.222.211.118
> Deny from 219.140.162.215
> Deny from 221.230.192.152
> Deny from 84.244.02.104
> Deny from 62.2.157.178
> Deny from 213.202.216.156
> Deny from 219.159.102.184
> Deny from 207.44.142.115
> Deny from 63.151.41.210
> Deny from 66.193.175.84
> Deny from 65.78.35.101
> Deny from 193.146.45.103
> Deny from 221.201.184.165
> Deny from 216.167.232.87
> Deny from 217.160.188.180
>
> The script places the converted data into a file called blocklist.txt in the Apache conf directory.
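For the ModSecurity users on this list, the same pipeline could emit ModSecurity rules instead of (or alongside) the Deny directives. Here is a minimal sketch under a few stated assumptions: ModSecurity 1.x SecFilterSelective syntax, an illustrative output file name (modsec_blocklist.txt), and an example action list.

#!/bin/sh
# Sketch: build ModSecurity 1.x rules from the same Dshield port 80 data.
# modsec_blocklist.txt and the "deny,log,status:403" action list are
# illustrative choices, not part of the original script.

/usr/bin/wget "http://www.dshield.org/topportsource.php?port=80&num=20"

for f in `cat topport* | grep -v "#" | awk '{print $1}' | head -20 | sed -e 's/^0//g' -e 's/\.0/\./g' -e 's/\.0/\./g'`
do
    # Escape the dots so each address is matched literally by the regex.
    ip_regex=$(echo "$f" | sed 's/\./\\./g')
    printf '%s\n' "SecFilterSelective REMOTE_ADDR \"^${ip_regex}\$\" \"deny,log,status:403\""
done > /usr/local/apache/conf/modsec_blocklist.txt

exit

Unlike the Deny version, this sketch overwrites its output file on each run; the resulting rules could then be pulled into the configuration with the same Include mechanism used for blocklist.txt below.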
> I then reference the blocklist.txt file with an Include statement in my DocumentRoot <Directory> directive, like this:
>
> <Directory "/usr/local/apache/htdocs">
>     Options -Indexes -Includes -FollowSymLinks -MultiViews
>     AllowOverride None
>     Order allow,deny
>     Allow from all
>     *Include conf/blocklist.txt*
>
>     <LimitExcept GET POST>
>         Order allow,deny
>         Deny from all
>     </LimitExcept>
> </Directory>
>
> This blocklist is reactivated every night at midnight when I conduct my normal log rotation and restart Apache. This technique proves extremely easy to implement and does provide protection from web clients who are up to no good.
>
> --
> Ryan C. Barnett
> Web Application Security Consortium (WASC) Member
> CIS Apache Benchmark Project Lead
> SANS Instructor: Securing Apache
> GCIA, GCFA, GCIH, GSNA, GCUX, GSEC
> Author: Preventing Web Attacks with Apache

--
Ryan C. Barnett
Web Application Security Consortium (WASC) Member
CIS Apache Benchmark Project Lead
SANS Instructor: Securing Apache
GCIA, GCFA, GCIH, GSNA, GCUX, GSEC
Author: Preventing Web Attacks with Apache