From: Geoff H. <ghu...@ws...> - 2002-06-06 20:49:53
> * what is the maximum indexing speed?
> * what's the peak query rate?

What's your hardware? How large are the documents that you're indexing? How fast is your network, or will you be indexing the local server rather than across the network? For queries, will you be running other services on the server while queries are performed? How many documents are likely to be returned? Are you talking about queries on a single server or a server farm?

No offense, but I can't give you any reasonable number here. Suffice it to say, ht://Dig is quite fast and well within the realm of commercial products. Personally, I'd be wary of anyone giving you an actual number based on your query alone.

> * what are the rough disk space requirements for the software (not the
>   index; I mean for gcc/g++, ht://Dig, and any other extra software
>   downloads I may need)?

For the software? That depends on what sort of server you're running. Most UNIX servers already have gcc/g++ installed. The size of the ht://Dig binaries varies a bit by platform, but is probably in the realm of 2-3 MB.

> * what are the webserver requirements? Hardware? Software?

For indexing, ht://Dig can index any webserver that understands HTTP (i.e. all of them), though there have been reports of strange quirks with Lotus Notes webservers. For serving results, you simply need the htsearch CGI and a CGI-capable webserver (i.e. just about any of them). UNIX-based servers are preferred, but there are users who run ht://Dig on Windows as well, though it's flakier.

--
-Geoff Hutchison
Williams Students Online
http://wso.williams.edu/
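[Editor's sketch, not part of the original message: the setup described above boils down to pointing ht://Dig at a start URL and installing the htsearch CGI. A minimal `htdig.conf` fragment looks roughly like this; the paths and URL below are illustrative placeholders, while the attribute names (`database_dir`, `start_url`, `limit_urls_to`) come from ht://Dig's standard configuration.]

```
# Minimal htdig.conf sketch -- paths and URL are placeholders, not from
# the original message.
database_dir:   /var/lib/htdig/db
start_url:      http://www.example.com/
# Restrict the crawl to pages under the start URL.
limit_urls_to:  ${start_url}
```

Indexing is then typically kicked off with the bundled `rundig` script (which runs the digger and database-build steps), and queries are served by dropping the `htsearch` binary into the webserver's cgi-bin directory.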