Thread: [mod-security-users] Include rules file per Directory
From: Justin G. <web...@sw...> - 2006-04-17 18:40:22
Hi,

I'm looking into using gotroot's blacklist.conf, but I would like to restrict processing of the rules in this file to only the specific scripts that need them, rather than loading it like any other rules file, since the load gets very high on a busy server.

Thanks,
Justin
From: Ivan R. <iva...@gm...> - 2006-04-19 09:42:38
On 4/17/06, Justin Grindea <web...@sw...> wrote:
> I'm looking into using gotroot's blacklist.conf but would like to restrict
> processing rules in this file only to specific scripts that need it, not load
> it like any other rules file, since the load goes very high on a busy server.

You can do that; simply do something like:

    <Location /xyz>
    Include conf/blacklist.conf
    </Location>

But using blacklist.conf is not a good idea (that's the one with many IP addresses in it?), because ModSecurity needs to test for each IP address individually, and that's slow when you have thousands of IP addresses to check. From what I've heard, Mike (GotRoot) will be maintaining a proper RBL to replace blacklist.conf (and ModSecurity 2.x already supports RBLs). The combination will be an order of magnitude faster.

--
Ivan Ristic, Technical Director
Thinking Stone, http://www.thinkingstone.com
ModSecurity: Open source Web Application Firewall
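Since the original question was about limiting the rules to specific scripts rather than a whole path, a slightly narrower variant of Ivan's example is a regex-scoped container; a minimal sketch, where /cgi-bin, guestbook.cgi and comments.cgi are placeholders rather than anything shipped with the gotroot rules:

    # Only requests matching these scripts pay the cost of the heavy
    # blacklist rules; everything else skips the Include entirely.
    <LocationMatch "^/cgi-bin/(guestbook|comments)\.cgi$">
    Include conf/blacklist.conf
    </LocationMatch>

Requests that do not match the expression never touch the included rules, which is the whole point of scoping the Include instead of loading the file globally.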
From: Michael S. <mi...@go...> - 2006-04-19 19:03:50
On Wed, 2006-04-19 at 10:42 +0100, Ivan Ristic wrote:
> But using blacklist.conf is not a good idea (that's the one with many
> IP addresses in it?)

blacklist.conf has all the spammer URLs in it.

> because ModSecurity needs to test for each IP address individually
> (and that's slow when you have thousands of IP addresses to check).
> From what I've heard Mike (GotRoot) will be maintaining a proper RBL
> to replace blacklist.conf (and ModSecurity 2.x already supports RBLs).
> The combination will be an order of magnitude faster.

Yep. badips.conf has the IPs, and it is no longer maintained since it's now in RBL form. I just haven't published the root zone yet for outside use. :-) I'll try to get it published this week.

--
Michael T. Shinn  KeyID: 0xDAE2EC86
Key Fingerprint: 1884 E657 A6DF DF1B BFB9 E2C5 DCC6 5297 DAE2 EC86
http://pgp.mit.edu:11371/pks/lookup?op=get&search=0xDAE2EC86
Got Root? http://www.gotroot.com
modsecurity rules: http://www.modsecurityrules.com
Troubleshooting Firewalls: http://troubleshootingfirewalls.com
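Once the zone is public, the IP check Michael describes comes down to a single rule in the ModSecurity 2.x rule language (still in development at the time of this thread); a minimal sketch, with rbl.gotroot.example standing in for the zone name that has not been published yet:

    # One DNS lookup per client IP instead of thousands of individual
    # IP comparisons; the zone name is a placeholder.
    SecRule REMOTE_ADDR "@rbl rbl.gotroot.example" \
        "phase:1,deny,status:403,log,msg:'Client IP listed on RBL'"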
From: Ivan R. <iva...@gm...> - 2006-04-19 19:56:22
On 4/19/06, Michael Shinn <mi...@go...> wrote:
> > But using blacklist.conf is not a good idea (that's the one with many
> > IP addresses in it?)
>
> blacklist.conf has all the spammer URLs in it.

The next dev release of ModSecurity will have SURBL support. You should be able to use that to replace blacklist.conf, right (i.e. just do a single DNS lookup to verify a URI instead)?

--
Ivan Ristic, Technical Director
Thinking Stone, http://www.thinkingstone.com
ModSecurity: Open source Web Application Firewall
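For context on how the single DNS lookup works: the domain pulled out of the request is prepended to the SURBL zone (a check of spam-domain-example.com becomes a lookup of spam-domain-example.com.multi.surbl.org), and any 127.0.0.x answer means the domain is listed. What the eventual rule will look like is not settled yet; the line below is a hypothetical sketch only, and applying @rbl to request arguments like this is an assumption, not released syntax:

    # HYPOTHETICAL: assumes the dev release lets @rbl test domains found
    # in request arguments against a SURBL zone.
    SecRule ARGS "@rbl multi.surbl.org" \
        "phase:2,deny,status:403,log,msg:'URI domain listed on SURBL'"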
From: Michael S. <mi...@go...> - 2006-04-19 20:33:27
On Wed, 2006-04-19 at 20:56 +0100, Ivan Ristic wrote:
> The next dev release of ModSecurity will have SURBL support. You
> should be able to use that to replace blacklist.conf, right (i.e. just
> do a single DNS lookup to verify a URI instead)?

Yep. Will I be able to extract multiple URIs from a POST?

As an aside, that zone is going to grow fast soon. I'm already at over 10K unique SLDs used by spammers in the test ruleset, the production ruleset is well over 7K at last count, and that's all manually added by me. So getting rid of that ruleset will help with any performance issues. Once I bring the autofeeder out of testing, the number of new URIs will probably grow by 100-1000 a day, so lookups will be the only way to support that in the future, and I may have the autofeeder online this month. :-)

Trying to keep up with the spammers manually is just not practical for me anymore, so it's time to put those sneaky honeypots to work automatically feeding the SURBLs. Ah... robots fighting robots. Now all we need is a cool anime theme song to go with it.

--
Michael T. Shinn  KeyID: 0xDAE2EC86
Key Fingerprint: 1884 E657 A6DF DF1B BFB9 E2C5 DCC6 5297 DAE2 EC86
http://pgp.mit.edu:11371/pks/lookup?op=get&search=0xDAE2EC86
Got Root? http://www.gotroot.com
modsecurity rules: http://www.modsecurityrules.com
Troubleshooting Firewalls: http://troubleshootingfirewalls.com
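For comparison, this is roughly what the manually maintained list amounts to in ModSecurity 1.x terms: a regex over the POST payload that needs a new alternation for every spammer domain (the domains below are made up for illustration):

    # Grows without bound as new spammer SLDs are added by hand,
    # which is why it slows down on busy servers.
    SecFilterSelective POST_PAYLOAD \
        "https?://(www\.)?(spam-pills-example|casino-spam-example)\.com"

A DNS lookup per extracted domain scales much better than adding tens of thousands of alternations to a pattern like this.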
From: Ivan R. <iva...@gm...> - 2006-04-19 20:45:27
On 4/19/06, Michael Shinn <mi...@go...> wrote:
> > The next dev release of ModSecurity will have SURBL support. You
> > should be able to use that to replace blacklist.conf, right (i.e. just
> > do a single DNS lookup to verify a URI instead)?
>
> Yep. Will I be able to extract multiple URIs from a POST?

If not in 2.0, then in 2.1 for sure (I have a very tight deadline for 2.0). Were you thinking of having ModSecurity extract the URIs from the request parameters?

--
Ivan Ristic, Technical Director
Thinking Stone, http://www.thinkingstone.com
ModSecurity: Open source Web Application Firewall
From: Michael S. <mi...@go...> - 2006-04-19 21:15:57
On Wed, 2006-04-19 at 21:45 +0100, Ivan Ristic wrote:
> > Yep. Will I be able to extract multiple URIs from a POST?
>
> If not in 2.0, then in 2.1 for sure (I have a very tight deadline for
> 2.0). Were you thinking of having ModSecurity extract the URIs from
> the request parameters?

That would be ideal. Did you have another thought about a better way to do this?

--
Michael T. Shinn  KeyID: 0xDAE2EC86
Key Fingerprint: 1884 E657 A6DF DF1B BFB9 E2C5 DCC6 5297 DAE2 EC86
http://pgp.mit.edu:11371/pks/lookup?op=get&search=0xDAE2EC86
Got Root? http://www.gotroot.com
modsecurity rules: http://www.modsecurityrules.com
Troubleshooting Firewalls: http://troubleshootingfirewalls.com
From: Ivan R. <iva...@gm...> - 2006-04-20 11:25:04
On 4/19/06, Michael Shinn <mi...@go...> wrote:
> > If not in 2.0, then in 2.1 for sure (I have a very tight deadline for
> > 2.0). Were you thinking of having ModSecurity extract the URIs from
> > the request parameters?
>
> That would be ideal. Did you have another thought about a better way to
> do this?

No (though it's not as if I've spent a lot of time thinking about it :). That, plus a check of the referrer URI, should do it.

--
Ivan Ristic, Technical Director
Thinking Stone, http://www.thinkingstone.com
ModSecurity: Open source Web Application Firewall
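The referrer half of that is already expressible in the 2.x rule language as a plain pattern match; a minimal sketch, with a made-up domain standing in for a real blacklist entry:

    # Reject requests whose Referer points at a known spammer domain;
    # the domain here is a placeholder for illustration only.
    SecRule REQUEST_HEADERS:Referer "https?://(www\.)?casino-spam-example\.com" \
        "phase:1,deny,status:403,log,msg:'Referrer spam domain'"

Extracting every URI from the POST body and resolving each one against the SURBL zone is the part that needs the new code discussed above.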