I'm new to ht://Dig, but I've seen the results of using it on one
website. The thing is, the results on that site are uneven: pages with
similar word counts and other factors get reproducible but hugely
divergent scores (e.g., 6.9344338 vs. 2.2333322) with no visually
recognizable differences in the pages or keywords.
In past projects, one of the ways I've checked where changes produce
effects was to calculate a few results by hand and compare them to the
output of the compiled program.
However, while there is a page that outlines the weighting factors
[Ranking pages and the use of Meta tags with ht://Dig], all it says
about the actual algorithm is that "htsearch... uses a complex rule to
rank the pages."
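To make the hand-check concrete, here is a minimal sketch of what I imagine such a scoring rule might look like. The attribute names below are modeled on the htdig.conf weighting factors (text_factor, title_factor, and so on), but the weight values and the simple linear combination are purely my guess at what the "complex rule" might resemble, not ht://Dig's actual implementation:

```python
# Hypothetical ranking sketch for hand-checking scores.
# The factor names imitate htdig.conf weighting attributes; the weight
# values and the linear-sum formula are illustrative guesses only.

WEIGHTS = {
    "text_factor": 1.0,      # match in body text
    "title_factor": 100.0,   # match in <title>
    "heading_factor": 5.0,   # match in <h1>-<h6>
    "keywords_factor": 10.0, # match in meta keywords
}

def score(match_counts):
    """Score a page from per-location match counts, e.g.
    {"text_factor": 12, "title_factor": 1}."""
    return sum(WEIGHTS[loc] * n for loc, n in match_counts.items())

page_a = {"text_factor": 12, "title_factor": 1}
page_b = {"text_factor": 14, "heading_factor": 2}
print(score(page_a))  # 112.0
print(score(page_b))  # 24.0
```

With something like this, two pages that look similar on screen could still score very differently if one has a keyword in its title and the other does not; that is the kind of effect I want to be able to trace by hand.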
Is there some source for that "complex rule" in pseudocode or some other
simplified form that explains, or better yet demonstrates, how it works?
Thanks for any help. If the only answer is to download the htsearch
source and read it, I'm about to do that...