On Thu, Feb 2, 2012 at 3:49 PM, Juan Jose Garcia-Ripoll <email@example.com> wrote:
Why? I mean, an ordinary Unicode string without UTF-8 packing takes four times the space of a base string. Since ECL uses a string buffer to read objects, why shouldn't it pack them into the smallest representation when possible?
On Fri, Feb 3, 2012 at 12:37 AM, Raymond Toy <firstname.lastname@example.org> wrote:
(I was surprised to see ECL choose the smallest string type that fits a literal string.)
I was surprised for several reasons:
1. You have to scan the string after reading it to see what it contains. (I guess that's a very small compile-time cost.)
2. I didn't think any Lisp did that, though it's not illegal to do so.
3. It's a burden on the user if the type of a constant string depends on what's in it. Being illiterate, I only know ASCII, so perhaps this isn't a problem in practice.
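To make point 3 concrete, here is a minimal sketch of the behavior being discussed, assuming an ECL build with Unicode support; the exact type names reported may vary between builds:

```lisp
;; In ECL (as described above), a string literal may be read into the
;; smallest representation that holds it.  An ASCII-only literal can
;; come back as a base-string, while a literal containing a non-ASCII
;; character needs the full (simple-array character (*)) representation.
(typep "hello" 'base-string)    ; likely T in a Unicode-enabled ECL
(typep "hëllo" 'base-string)    ; likely NIL: the ë forces the wide string
```

So code declared to accept only one of the two string types can break depending on the characters that happen to appear in a literal.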
(Getting crufty old f2cl code to convert declarations like (simple-array character (*)) to just string is a pain, but that's my problem, not yours.)