I'm working with PGV 4.1.5rel CORE... without the sitemap module, what are
spiders like Googlebot going to see?
What will other spiders (e.g. MSN, Yahoo, etc.) see?
How many spiders does PGV actually recognize?
The reason I ask is that my site(s) are regularly "gobbled" by all of the major
SE spiders and probably half of the second-tier spiders... I'm not at the point
of being ready to let any of them in, but I want to protect myself from them,
beyond robots.txt and .htaccess, and then be able to grant specific access to
certain ones and not all of them...
So, where do I stand, and how can I make sure my 24000+ IND, 9800+ MAR site is
protected from being ransacked??? This is especially important as I've not yet
found a way to (internally) direct SELECT queries to server-a and
UPDATE/INSERT/REPLACE queries to server-b...
I hope to be filing an "interface" or possibly a "why can't I" "bug" report... I
just don't know how to write it up for a setup with two slave database servers
that I want/need to distribute the queries over :? :(
Ideally, my setup would send queries to one server and then look up, on a
round-robin basis, where to make UPDATE/REPLACE/initial INSERT entries...
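In code terms, what I'm picturing is something like this little sketch (Python
pseudo-code rather than PGV's actual PHP database layer, and the host and table
names are just placeholders):

    import itertools

    class QueryRouter:
        # send SELECTs to one server; hand writes out round-robin
        def __init__(self, read_host, write_hosts):
            self.read_host = read_host
            # itertools.cycle gives a simple round-robin over the write servers
            self.write_rotation = itertools.cycle(write_hosts)

        def route(self, sql):
            verb = sql.lstrip().split(None, 1)[0].upper()
            if verb == "SELECT":
                return self.read_host
            # UPDATE / REPLACE / initial INSERT entries rotate across servers
            return next(self.write_rotation)

    router = QueryRouter("server-a", ["server-b", "server-c"])
    router.route("SELECT * FROM individuals")    # -> server-a
    router.route("INSERT INTO individuals ...")  # -> server-b
    router.route("UPDATE individuals SET ...")   # -> server-c

That is the behaviour I would want PGV's database layer to offer, however it
ends up being configured.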
So, is there any help for me at this point, or is this something for v5 or v6?? ;)
--
NOTE: NEW EMAIL ADDRESS!!
_\/
(@@) Waldo Kitty, Waldo's Place USA
__ooO_( )_Ooo_____________________ telnet://bbs.wpusa.dynip.com
_|_____|_____|_____|_____|_____|_____ http://www.wpusa.dynip.com
____|_____|_____|_____|_____|_____|____ ftp://ftp.wpusa.dynip.com
_|_Eat_SPAM_to_email_me!_YUM!__|_____ wkitty42 -at- windstream.net