Saxon makes an incorrect inference for the static type
of the min() and max() functions: it assumes that the
result type will be the same as the item type of the
argument. Under the current spec this is no longer the
case: when the argument is a sequence of untyped
atomic values, the result will be an xs:double.
This affects an expression such as
a/b/c = max(d/e/f)
It is only likely to affect Saxon-B, because the
inference that nodes will be untyped cannot usually be
made under Saxon-SA.
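The corrected inference rule can be sketched as follows. This is a minimal illustrative model with hypothetical names, not Saxon's actual source: it just encodes the spec rule that min()/max() over untyped atomic values yields xs:double, while other item types carry through.

```python
UNTYPED_ATOMIC = "xs:untypedAtomic"
DOUBLE = "xs:double"

def minmax_result_type(arg_item_type: str) -> str:
    """Static result type of min()/max() for a given argument item type.

    The buggy inference simply returned arg_item_type unchanged.
    Under the current spec, untyped atomic input is cast to xs:double,
    so the result type must be xs:double in that case.
    """
    if arg_item_type == UNTYPED_ATOMIC:
        return DOUBLE          # spec: untyped values are cast to double
    return arg_item_type       # otherwise the item type carries through

# max(d/e/f) where d/e/f selects untyped nodes:
print(minmax_result_type(UNTYPED_ATOMIC))  # xs:double, not xs:untypedAtomic
```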
As a consequence of this incorrect inference, when the
result of max() is compared to a value that is known to
be untyped atomic, Saxon allocates a comparator that
expects to compare two untyped atomic values. At
run-time the software correctly decides that it
actually has to convert both values to doubles, but the
comparator is not equipped to do this comparison, and
complains with the (curious) message that it cannot
compare a double to a double.
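The failure mode can be modelled in a few lines. Again this is a hypothetical sketch, not Saxon's real classes: a comparator is allocated from the statically inferred operand types, but the run-time values have already been (correctly) converted to doubles, which that comparator does not handle.

```python
def allocate_comparator(lhs_type: str, rhs_type: str):
    """Choose a comparator based on statically inferred operand types."""
    if lhs_type == rhs_type == "xs:untypedAtomic":
        # Comparator that only handles two untyped (string-like) values.
        def untyped_compare(a, b):
            if not (isinstance(a, str) and isinstance(b, str)):
                raise TypeError("cannot compare xs:double to xs:double")
            return a == b
        return untyped_compare
    # A general comparator would convert operands as needed.
    return lambda a, b: float(a) == float(b)

# Because of the faulty inference, max(d/e/f) is statically typed as
# untyped atomic, so the untyped-only comparator is allocated:
cmp = allocate_comparator("xs:untypedAtomic", "xs:untypedAtomic")

# At run time both operands are correctly converted to xs:double first...
lhs, rhs = float("10"), float("9")
try:
    cmp(lhs, rhs)  # ...and the comparator complains
except TypeError as e:
    print(e)  # cannot compare xs:double to xs:double
```

With the corrected inference the comparison would be allocated a numeric comparator in the first place, and the conversion to double would succeed.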
Source fix: the fix affects module
net.sf.saxon.style.StandardNames, and will be uploaded