On Fri, Feb 3, 2012 at 12:37 AM, Raymond Toy <firstname.lastname@example.org> wrote:
(I was surprised to see ecl choose the smallest string type to fit a literal string.)
Why? I mean, an ordinary Unicode string without UTF-8 packing takes four times the space of a base string. Since ECL reads objects through a string buffer anyway, why shouldn't it pack them into the smallest representation when possible?
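
For example, you can see the effect at the REPL (a rough sketch, assuming an ECL build with Unicode support; the exact printed types may differ):

  (type-of "hello")    ; a pure-ASCII literal may be read as (SIMPLE-BASE-STRING 5)
  (type-of "λambda")   ; a literal with a non-base character needs (SIMPLE-ARRAY CHARACTER (6))

The first literal fits in one byte per character, the second needs the full character representation.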