We have identified a number of new spider IP addresses from Google and other indexers that are vastly inflating our stats. I've created a local spider filter list with those IP addresses, and I am running the stats updater:
dspace stats-util -m
to reprocess the stats and mark spider hits appropriately; I will then remove them via:
dspace stats-util -f
However, the marking pass is taking hours. The same is true if I skip marking and just delete hits matching the new rules directly, via:
dspace stats-util -i
Is that normal? We only have about 200,000 views to process.
We're on version 1.6.2 but about to roll out an upgrade. If performance has improved in 1.8.2, we can wait a week or so.
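For what it's worth, one way to gauge how far the marking pass has gotten is to query the Solr statistics core directly and compare total hits against hits already flagged as bots (the `isBot` field in the statistics schema). A sketch, assuming the default Tomcat port and core path (`localhost:8080/solr/statistics`); adjust the host/port for your install:

```shell
# Total usage events in the statistics core (rows=0: we only want numFound).
# NOTE: the host, port, and core path below are assumptions for a default install.
curl -s 'http://localhost:8080/solr/statistics/select?q=*:*&rows=0&wt=json'

# Events already marked as spider traffic.
curl -s 'http://localhost:8080/solr/statistics/select?q=isBot:true&rows=0&wt=json'
```

Comparing the `numFound` values from the two responses shows how much of the core has been marked so far, which at least tells you whether `stats-util -m` is making steady progress or has stalled.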