On Fri, Feb 3, 2012 at 12:37 AM, Raymond Toy <toy.raymond@gmail.com> wrote:
> (I was surprised to see ecl choose the smallest string type to fit a literal string.)

Why? I mean, an ordinary Unicode string without UTF-8 packing takes four times the space of a base string. Since ECL uses a string buffer to read objects, why shouldn't it pack them into the smallest representation when possible?
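A minimal REPL sketch of what this looks like in practice (the printed type names are illustrative and depend on the ECL build; the λ example just needs any character outside the base character set):

    ;; Every character fits in the base character set, so the reader
    ;; can return a compact base string (one byte per character).
    > (type-of "hello")
    (SIMPLE-BASE-STRING 5)

    ;; A character outside the base set forces the wide string
    ;; representation (four bytes per character).
    > (type-of "hello λ")
    (SIMPLE-ARRAY CHARACTER (7))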

Juanjo

--
Instituto de Física Fundamental, CSIC
c/ Serrano, 113b, Madrid 28006 (Spain)
http://juanjose.garciaripoll.googlepages.com