From: Neal R. <ne...@ri...> - 2003-04-17 18:06:06

Hey all,
I'm looking through this function and I have a couple questions.
1) The function reparses 'bad_extensions' & 'valid_extensions' each time
through. This seems wasteful. Any good reason to do this?
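For what it's worth, here's a rough Python sketch of what I mean by parsing once and reusing the result. The attribute names ('bad_extensions' as a space-separated config string) are just borrowed from the config names above, not taken from the actual code:

```python
from urllib.parse import urlparse
import os.path

# Cache of already-parsed extension config strings, keyed by the raw
# string, so each distinct config value is split exactly once.
_ext_cache = {}

def parse_extensions(raw):
    """Split a space-separated config string like '.gif .jpg .png'
    into a frozenset, caching the result for reuse."""
    if raw not in _ext_cache:
        _ext_cache[raw] = frozenset(e.lower() for e in raw.split())
    return _ext_cache[raw]

def has_bad_extension(url, bad_raw):
    """True if the URL's path ends in one of the bad extensions.
    Repeated calls reuse the cached parse instead of re-splitting."""
    path = urlparse(url).path
    ext = os.path.splitext(path)[1].lower()
    return ext in parse_extensions(bad_raw)
```

With a cache like this the per-URL cost drops to a set lookup instead of a string split on every call.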
2) Toward the end of the function, just before we test the URL against
'limits' & 'limit_normalized', we check the server's robots.txt file.
Wouldn't it make sense to do the robots.txt check AFTER the limits
check, so as not to waste network connections on servers that will get
rejected by the next two tests?
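To illustrate the ordering I'm suggesting, here's a hypothetical sketch (the 'limits' prefix-match and the can_fetch callback are stand-ins for the real tests, not the actual function): the purely local checks run first, and the network-dependent robots.txt check only fires for URLs that survive them.

```python
def is_valid_url(url, limits, can_fetch):
    """Run cheap local filters before any network-dependent test.

    limits    -- list of URL prefixes the crawl is restricted to
                 (stand-in for the 'limits'/'limit_normalized' tests)
    can_fetch -- callable doing the robots.txt check, which may need
                 a network fetch; called only for surviving URLs
    """
    # Local prefix test first: no network traffic for rejected URLs.
    if not any(url.startswith(prefix) for prefix in limits):
        return False
    # Robots.txt check last, only for URLs that passed the filters.
    return can_fetch(url)

# Demonstration that the robots.txt check is skipped for URLs the
# limits test rejects:
fetched = []
def fake_robots_check(url):
    fetched.append(url)
    return True

is_valid_url("http://other.example/x", ["http://mysite.example/"], fake_robots_check)
# fetched is still empty -- no robots.txt lookup was wasted
```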
Thanks.
Neal Richter
Knowledgebase Developer
RightNow Technologies, Inc.
Customer Service for Every Web Site
Office: 406-522-1485