Aredridel wrote:
> > Jon Åslund wrote:
> > > My Shakespeare Programming Language web page (built with PhpWiki)
> > > just got slashdotted. The web server died hard, very quickly. :)
> >
> > How exactly? Any symptoms? What hardware?
> >
> > I would temporarily block them with ipchains, as I did last month
> > with Code Red:
> > http://screaming-penguin.com/phorum/read.php?f=1&i=686&t=625
> >
> > I really have to write the dynamic robot blocking code now...
> > Every user has to be counted; PHP4 sessions would be best. They work
> > fine for my PHP4 e-commerce project (both file-based and MySQL-backed
> > sessions).
>
> What I'm going to do (NBTSWikiWiki took a hit tonight from a search
> engine) is make robots.txt a script, have it log the IP/user-agent
> pair, and have the wiki code dump the raw wiki pages, with only the
> links rendered instead of the full markup, for any IP/user-agent
> combination that has hit robots.txt in the last 24 hours. Should be
> doable with a minimum of fuss... about 25 lines of code, I'd think.
Yes, I already wrote this:
http://xarch.tu-graz.ac.at/home/rurban/acadwiki-1.3.6pre/viewsrc.php?show=lib/robots.php#src
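
Roughly, the scheme quoted above comes down to two small pieces. This
is just a minimal sketch, assuming PHP4, not the code behind the link
above; the log path, the robots.txt body, and the helper name
is_known_robot() are all made up:

  <?php
  // robots.txt served by a script (e.g. via an Apache rewrite rule):
  // log who asked for it, then emit a normal robots.txt body.
  $pair = $_SERVER['REMOTE_ADDR'] . ' ' . $_SERVER['HTTP_USER_AGENT'];
  $fp = fopen('/var/log/wiki-robots.log', 'a');   // path is made up
  if ($fp) {
      fputs($fp, time() . ' ' . $pair . "\n");
      fclose($fp);
  }
  header('Content-Type: text/plain');
  echo "User-agent: *\nDisallow: /phpwiki/index.php/edit\n";

  // In the wiki code, before rendering a page:
  function is_known_robot() {
      $pair  = $_SERVER['REMOTE_ADDR'] . ' ' . $_SERVER['HTTP_USER_AGENT'];
      $limit = time() - 24 * 3600;            // the 24-hour window
      $lines = @file('/var/log/wiki-robots.log');
      if (!$lines) return false;
      foreach ($lines as $line) {
          list($when, $who) = explode(' ', rtrim($line), 2);
          if ((int)$when >= $limit && $who == $pair)
              return true;                    // dump the raw page instead
      }
      return false;
  }
  ?>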
I'm constantly hit by search engines. Some behave okay, some get caught
in nested edit and diff links. Those I blocked statically.
Search engines and specific misbehaving agents are very easy to block
(see the robot blocking code).
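
For illustration, the static variant is just a user-agent check at the
top of each request. A sketch only; the agent substrings below are
placeholders, not the actual list from my robots.php:

  <?php
  // Static blocking: refuse known misbehaving agents outright.
  $bad_agents = array('EmailCollector', 'WebCopier');  // placeholders
  $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
  foreach ($bad_agents as $bad) {
      if (strstr($ua, $bad)) {
          header('HTTP/1.0 403 Forbidden');
          exit('Forbidden: misbehaving robot');
      }
  }
  ?>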
The hard technical problem is dynamic blocking of clients that can take
a site down. thttpd, for example, does this on the server side with
bandwidth throttling; Apache is too stupid for that.
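
At the application level, counting per client with PHP4 sessions (as
mentioned in the quoted mail) could start out like this; the window and
threshold numbers are made up:

  <?php
  // Dynamic throttling: count requests per session and back off
  // above a threshold.
  session_start();
  $now = time();
  if (!isset($_SESSION['hits']) || $now - $_SESSION['since'] > 60) {
      $_SESSION['hits']  = 0;                 // reset every minute
      $_SESSION['since'] = $now;
  }
  $_SESSION['hits']++;
  if ($_SESSION['hits'] > 30) {               // > 30 hits/minute
      header('HTTP/1.0 503 Service Unavailable');
      header('Retry-After: 60');
      exit('Too many requests, slow down.');
  }
  ?>

One caveat: robots that ignore cookies get a fresh session on every
request, so a real version would probably have to key the counter on
the client IP instead.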
mod_php4 should also work with thttpd, but PHP compilation is tricky
enough that I wouldn't try it just for fun. I just broke a German
PhpWiki of mine merely by installing a new PHP binary (gettext issues).
--
Reini Urban
http://xarch.tu-graz.ac.at/home/rurban/