This idea is simple: use a filter to exclude visitor traffic that stays for less than 1 second.
Why is this feature important? Many people criticize AWStats numbers as being bloated with bot spam. If I were programming a robot, I would want it to get on and off the pages as fast as possible, faster than a human could. So, my idea is to have an option similar to the "SkipHosts" directive that lets you filter out traffic staying less than a specified time. Making the threshold configurable is important, because some users may want a different number than the 1 second that I propose.
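To illustrate the proposal, here is a minimal sketch of the filtering logic in Python. This is not AWStats code; the input format (a list of host/timestamp pairs from a parsed access log) and the function name are assumptions made for illustration. The idea is simply: group hits by host, measure the span between a host's first and last hit, and drop hosts whose total on-site time falls below the configured threshold.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the proposed filter, not actual AWStats internals.
# The threshold would be configurable, like the proposed 1-second default.
MIN_VISIT_DURATION = timedelta(seconds=1)

def filter_short_visits(hits):
    """hits: list of (host, timestamp) tuples from a parsed access log."""
    by_host = {}
    for host, ts in hits:
        by_host.setdefault(host, []).append(ts)
    kept = {}
    for host, times in by_host.items():
        times.sort()
        # Keep only visitors whose first-to-last hit span meets the threshold.
        if times[-1] - times[0] >= MIN_VISIT_DURATION:
            kept[host] = times
    return kept

hits = [
    ("1.2.3.4", datetime(2024, 1, 1, 12, 0, 0)),
    ("1.2.3.4", datetime(2024, 1, 1, 12, 0, 5)),          # 5-second visit: kept
    ("5.6.7.8", datetime(2024, 1, 1, 12, 0, 0)),
    ("5.6.7.8", datetime(2024, 1, 1, 12, 0, 0, 300000)),  # 0.3-second visit: dropped
]
print(sorted(filter_short_visits(hits)))  # → ['1.2.3.4']
```

A real implementation would need to handle single-hit visits (span of zero) and distinguish visits by session rather than raw host, but this shows the core time-based exclusion.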
This idea may be risky (it assumes robots visit pages faster than humans do) and may filter out real people clicking through your pages. However, in all honesty, even when I am researching via Google on the web, it takes me AT LEAST 1 second to decide that I don't like a website and to hit BACK.