From: Peter B. <be...@an...> - 2009-06-11 05:56:03
|
A common attack these days is to try a few thousand HTTP URLs looking for scripts, code, or pages running open source software with some sort of vulnerability or "feature" that sends email out. Often these attacks are run from servers that can generate thousands of requests per second, overwhelming systems that usually handle hundreds of connections per minute. I use lighttpd, but this could be extended to Apache.

For example (actual IP and hostname replaced to protect, well, something):

198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:31 +0000] "GET /modules/jinzora/backend/classes.php?include_path=../lib/jinzora.js%00 HTTP/1.1" 404 5069 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:31 +0000] "GET /cgi-bin//plugins/db/mysql/mysql.inc.php HTTP/1.1" 404 5039 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:31 +0000] "GET /cgi-bin/index.php?blog=1&title='&more=1&c=1&tb=1&pb=1 HTTP/1.1" 404 5074 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:31 +0000] "GET /scripts/ideabox/include.php?ideaDir=http://xxxxxxxx HTTP/1.1" 404 5051 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:31 +0000] "GET //plugins/db/mysql/mysql.inc.php HTTP/1.1" 404 5031 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:32 +0000] "GET /cgi-bin/ideabox/include.php?ideaDir=http://xxxxxxxx HTTP/1.1" 404 5051 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:32 +0000] "GET /ideabox/include.php?ideaDir=http://xxxxxxxx HTTP/1.1" 404 5043 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:32 +0000] "GET /include.php?ideaDir=http://xxxxxxxx HTTP/1.1" 404 5035 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"
198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:32 +0000] "GET /ideabox/include.php?ideaDir=http://xxxxxxxx HTTP/1.1" 404 5043 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"

My concern is that there are potentially two IP addresses in each log line: the client address (REMOTE_ADDR) and the server address (HTTP_HOST).

There isn't really much of an "error" here. Nothing is wrong, other than that the end user generated over 8,000 404 (Not Found) responses in a matter of a few minutes. I realize there are bandwidth limiters and other sorts of software to block stuff like this, but I really like sshguard, and this seems like the kind of thing it can do well. Would this work?

    tail -n0 -F httpd.log | grep ' 404 ' | sshguard -a 100 -s 60 -p 1200

That would strip out everything but the 404's from the log, so only those would be passed to sshguard, which would block an address after more than 100 404 messages in 60 seconds. Thoughts? What happens when there are multiple IP addresses in a log line?

---------------------------------------------------------------------------
Peter Beckman                                                  Internet Guy
be...@an...                                        http://www.angryox.com/
---------------------------------------------------------------------------
|
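[Editor's note: the multiple-address question above can be sidestepped by reducing each matching log line to a single address before it reaches sshguard. Below is a minimal sketch using awk. The field positions are an assumption read off the excerpt above ($2 taken as REMOTE_ADDR, $9 as the HTTP status) and must be adjusted to the actual accesslog.format; the "404 access denied" output format simply mirrors the rewrite Peter later proposes with sed, and whether sshguard's parser accepts it is not verified here.]

```shell
# Sketch: keep only 404 entries and reduce each to the client address,
# so exactly one IP per line reaches the next stage.
# Assumptions: $2 is REMOTE_ADDR and $9 is the HTTP status code,
# as in the combined-format excerpt above -- adjust to taste.
filter_404() {
  awk '$9 == 404 { print $2, "404 access denied" }'
}

# Demonstration on one line from the excerpt:
sample='198.6.1.1 4.3.2.1 - [01/Jan/2009:13:31:32 +0000] "GET /include.php?ideaDir=http://xxxxxxxx HTTP/1.1" 404 5035 "-" "Mozilla/4.75 [en] (X11, U; Nessus)"'
printf '%s\n' "$sample" | filter_404
```

In live use this would sit as the single stage between `tail -n0 -F httpd.log` and sshguard, replacing the grep.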
From: Peter B. <be...@an...> - 2009-06-26 18:23:40
|
On Thu, 11 Jun 2009, Peter Beckman wrote:

> Would this work?
>
>   tail -n0 -F httpd.log | grep ' 404 ' | sshguard -a 100 -s 60 -p 1200

Unfortunately this doesn't work. The problem, however, is not SSHguard, but pipes. Once you run

    tail -n0 -F httpd.log | grep ' 404 '

it outputs as expected to stdout. However, when you add pipe number two, piping to sshguard, the output doesn't continue as tail processes. I'm not sure if it gets buffered somewhere or what, but SOMETHING prevents the output you can see from grep from getting to sshguard. Try it out:

    tail -n0 -F httpd.log | grep ' 200 ' | cat

If you just do:

    cat httpd.log | grep ' 200 ' | cat

it works just fine. But there is something about tail that screws up multiple pipes. Anyone know what's up here?

I tried installing gtail (didn't work), and tried to figure out how to configure lighttpd to send only 404's to a certain local0 syslog facility so I could pipe them to sshguard. I even googled "'tail -f' multiple pipes" and read a bunch of stuff, and I've looked for unbuffering functionality in grep, egrep, sed, tail, gtail and others. Most solutions I did find simply worked around the issue by combining commands into a single pipe after tail -F. People doing:

    tail -n0 -F httpd.log | grep 'foo' | grep -v 'bar'

were told to use awk and a single pipe. So what's the deal? Why does tail not play nice with multiple pipes?

In theory, something like this would work like a charm:

    tail -n0 -F httpd.log | sed -n -E 's/^(.+?) .+ 404 .+$/\1 404 access denied/p' | sshguard -a 100 -s 60 -p 1200

(if it only worked) to get only the 404's out of the log file, and then rewrite each log entry to meet sshguard's criteria for blocking. The '-n' flag and the trailing 'p' flag on s// let me NOT pipe non-replaced lines to sshguard, for efficiency. But this doesn't work (tested with sshguard -d).

If you can think of how to use SSHguard to block people who attempt brute-force HTTP scans for 404 links, and how to get around the multiple-pipes issue, I'd love to hear it. Lighttpd doesn't log 404 errors to the error log, and it doesn't seem to be able to send only 404 errors to a file other than the configured access log. I'd still need to pipe access log entries sent to syslog through sed and then sshguard, which MIGHT work, but then I lose my access logs, which are kinda important. Plus I'm not sure what kind of overhead that might generate.

Beckman
|
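[Editor's note: the culprit here is almost certainly stdio buffering, not tail or the pipes themselves. When grep's stdout is a terminal, the C library line-buffers it; when stdout is a pipe, it switches to block buffering, so matches sit in a multi-kilobyte buffer instead of reaching sshguard promptly — with a slow trickle of 404s, nothing appears to come through at all. A sketch of the common workarounds; `--line-buffered` is a GNU grep extension and `stdbuf` is GNU coreutils, so availability varies by platform:]

```shell
# GNU grep can be told to flush its output after every matching line:
#
#   tail -n0 -F httpd.log | grep --line-buffered ' 404 ' | sshguard -a 100 -s 60 -p 1200
#
# GNU coreutils' stdbuf forces line buffering on any stdio-based filter:
#
#   tail -n0 -F httpd.log | stdbuf -oL grep ' 404 ' | sshguard -a 100 -s 60 -p 1200

# Finite demonstration that --line-buffered passes matches through a
# second pipe immediately (the live pipelines above never terminate):
printf 'a 404 b\na 200 b\n' | grep --line-buffered ' 404 ' | cat
```

GNU sed has an analogous unbuffered mode, `-u`, which would serve the same purpose in the sed variant of the pipeline.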
From: Kacper W. <ka...@gm...> - 2009-06-26 23:36:09
|
On Fri, Jun 26, 2009 at 8:22 PM, Peter Beckman <be...@an...> wrote:

> On Thu, 11 Jun 2009, Peter Beckman wrote:
>
>> Would this work?
>>
>>   tail -n0 -F httpd.log | grep ' 404 ' | sshguard -a 100 -s 60 -p 1200
>
> Unfortunately this doesn't work. The problem, however, is not SSHguard,
> but pipes. Once you run

I believe grep sends an untimely EOF that ends the tail stream. I haven't tried it in this particular case, but maybe you could try using socat:

http://www.dest-unreach.org/socat/doc/socat.html

which is a pretty useful tool for cases like this.

HTH,

--
http://kacper.doesntexist.org
http://windows.dontexist.net

Employ no technique to gain supreme enlightenment. - Mar pa Chos kyi blos gros
|
From: Mij <mi...@bi...> - 2009-07-03 15:21:58
|
Always nice to see quests for collateral applications of SSHGuard. Thanks.

Please submit these lines to http://sshguard.sourceforge.net/newattackpatt.php — we look in there to decide what to support in the next releases. (I'm not saying the post is off topic; we just want a reference in there.)

Grep: what for? The parser is already a grep itself. Some users have hinted at voodoo beliefs about the performance load of the parser. I don't know where they come from, but forget it. The parser boils down to a state machine: at the first character that does not comply with some pattern, the deal is over. When you filter with regular expressions or grep, they have to scan through the entire line instead.

Multiple addresses in a log line: nothing happens. A regular expression could be confused; the context-free parser SSHGuard uses is not.
|