Version 1.6 fixed a tokeniser bug that produced incorrect tokens when runs of repeated whitespace delimiting characters appeared in the input. This inadvertently changed token-based metric scores in a few rare cases.
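As a rough illustration of this class of bug (a hypothetical sketch, not the project's actual tokeniser code): splitting on a single whitespace character yields spurious empty tokens when delimiters repeat, whereas collapsing runs of whitespace does not, so the two approaches can disagree on token counts and therefore on token-based metrics.

```python
text = "a  b"  # note the double space between the tokens

# Buggy behaviour: splitting on one space produces an empty token
naive_tokens = text.split(" ")   # ['a', '', 'b']

# Fixed behaviour: split() treats a run of whitespace as one delimiter
fixed_tokens = text.split()      # ['a', 'b']

print(naive_tokens, fixed_tokens)
```

The extra empty string in the naive result changes the token count, which is how repeated whitespace could shift a token-based metric score.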