From: Hoehle, Joerg-C. <Joe...@t-...> - 2004-12-08 09:38:56
Hi,

Sam wrote:
>I don't see how encoding_min/max_bytes can help with ENCODING-ZEROES,
>but we can certainly call an equivalent of ENCODING-ZEROES in
>MAKE-ENCODING and set encoding_min/max_bytes there.

Encoding_min tells us the length of the smallest conversion. If
0-termination makes sense everywhere, one can expect that to be the
number of zero bytes. At least, I don't find this assumption worse than
the one currently built into the ENCODING-ZEROES function. It's a
different assumption than expecting a NUL character to be present in
every encoding.

I'm not sure changing MAKE-ENCODING is enough. What about non-internal
encodings? IIRC, 0-termination does not make sense with UTF-7 (neither
would it with Base64 encoding), but who cares?

>UTF-8 is, I think, much more valuable.

UTF-8 is not interesting, because it needs only a single zero
termination byte. What's the problem?

Regards,
Jörg
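P.S. A quick sketch of the assumption above, namely that the length of
the zero terminator equals the encoding's minimum bytes per character.
This uses Python's codecs as a stand-in for CLISP's internal encodings
(so the names here are not CLISP's); it also shows why UTF-7 falls
outside the scheme, since it has no zero byte at all in the encoded NUL:

```python
# Hypothesis: terminator length == minimum conversion length,
# i.e. the size of an encoded NUL character.
for enc in ("utf-8", "utf-16-le", "utf-32-le"):
    nul = "\x00".encode(enc)
    # UTF-8 -> 1 zero byte, UTF-16 -> 2, UTF-32 -> 4
    print(enc, len(nul), nul)

# UTF-7 encodes NUL as a base64 run ("+AAA-"), containing
# no zero byte at all, so 0-termination is meaningless there.
print("utf-7", "\x00".encode("utf-7"))
```

So for the internal fixed- and variable-width encodings the hypothesis
holds; only transfer encodings like UTF-7 (or Base64) break it, as
noted above.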