

#826 What do we call "securecomputing dot com" ?


They aren't really a search engine; they look at pages to check for security problems like viruses. I could add them to a list of bots to ignore and that would work, but if there are more like them it might be a good idea to have a dedicated list for them. At this point AWStats only counts them as one hit per day (I think), so this isn't a huge problem yet. They load lots of pages, so they show up high on some lists. They aren't too abusive right now, about one page per minute, but it never ends. If there were 10 of these it could get annoying; I hope they would all get together and share databases somehow.

Browser string: "page_verifier http://www.securecomputing.com/goto/pv"
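
As a sketch of how this could be handled on the AWStats side (this assumes the stock `SkipUserAgents` directive in `awstats.conf`; check your version's documentation), the bot can be excluded from the reports entirely by its User-Agent:

```
# In awstats.conf: skip any hit whose User-Agent matches the pattern.
# REGEX[] makes it a substring match rather than an exact one.
SkipUserAgents="REGEX[securecomputing]"
```

This only hides the bot from the statistics; it doesn't stop the crawling or save any bandwidth, which is why blocking at the Apache level (as below) is a separate step.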


  • Jim

    If I block them using the code below, Apache returns a 403 error for every hit and sends zero bytes. So far they haven't stopped, but at least they are using less bandwidth.
    This is not a solution for AWStats, though, which should let us configure how to handle these types of bots. In my situation, I think they know what my site is all about by now and should go away, and also respect the 403 - go away error :)

    This is what I put in my ".htaccess" file to block them; anyone can modify it to do the same for other bots, just change the site name. It acts like a keyword match with a wildcard on each side. This code may come in handy (more examples in the Apache documentation).

    # Flag any request whose User-Agent contains "securecomputing"
    # (each unescaped "." is a single-character wildcard in the regex)
    SetEnvIfNoCase User-Agent .securecomputing. spammer=yes

    Order Allow,Deny
    Allow from all
    Deny from env=spammer
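
To make the "keyword with wildcards" behaviour concrete, here is a small sketch in Python of what the `.securecomputing.` pattern actually matches (the UA string is the one reported above; the second string is just a made-up normal browser UA for contrast):

```python
import re

# Same pattern as in the SetEnvIfNoCase rule: each unescaped "."
# matches any single character, so this behaves like a case-insensitive
# substring match with one wildcard character required on each side.
pattern = re.compile(r".securecomputing.", re.IGNORECASE)

bot_ua = "page_verifier http://www.securecomputing.com/goto/pv"
browser_ua = "Mozilla/5.0 (Windows NT 10.0)"

print(bool(pattern.search(bot_ua)))      # True  - the bot is flagged
print(bool(pattern.search(browser_ua)))  # False - normal traffic passes
```

Note the pattern requires a character before and after the keyword; in the bot's UA those are the literal dots of "www.securecomputing.com", so it still matches.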