>> Change foreign.d to not assume that O(foreign_encoding) is 1:1. Yes, this
>> implies some changes to the FFI. char[n] means n bytes, maybe fewer than n
>> characters.
> I think Joerg said that he was working on this...
Nope. I mentioned the problem and probably stated that I don't see a solution, that's all.
Please define usable behaviour for (c-array character N) for non-1:1 encodings.
How much buffer space shall CLISP allocate?
What buffer size (int) shall it pass to the foreign function?
How many characters shall it get out of it?
Think hard about :out and :in-out parameter modes and about :allocation modes.
Then consider that we only have partial information about encodings, i.e. encoding_min and encoding_max_bytes_per_char are not even correct for all encodings.
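To see why no single buffer size answers those questions, look at a variable-width encoding at the REPL (a sketch; EXT:CONVERT-STRING-TO-BYTES and the CHARSET:UTF-8 external format are standard CLISP):

```lisp
;; 3 characters, but 5 bytes under UTF-8 (a=1, o-umlaut=2, sharp-s=2).
;; For (c-array character 3), neither "3 bytes" nor "3 characters"
;; yields a well-defined C buffer size or int length argument.
(length (ext:convert-string-to-bytes "aöß" charset:utf-8))
;; => 5
```

For an :out parameter the situation is worse: the byte count coming back from C does not even determine the character count without decoding the buffer first.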
The only thing that's doable is fixed-width 2:1 and 4:1 encodings, like UTF-16, maybe using a new D-STRING or W-STRING type, and D/WCHAR or something like that. That's easy copy&paste work.
I believe that people beg for UTF-16 because that's what some APIs use. It doesn't require a general solution: just add 16-bit character types, even though this partial solution may feel ugly.
Using with-foreign-string and c-pointer, any encoding is already doable today, with a little code overhead. I think I've provided examples in some e-mail.
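For the record, that route looks roughly like this (a sketch, not tested here; process_utf8 is a hypothetical C function taking a byte pointer and a byte count):

```lisp
;; Hypothetical C side: int process_utf8(const char *buf, int len);
(ffi:def-call-out process-utf8
  (:name "process_utf8")
  (:arguments (buf ffi:c-pointer) (len ffi:int))
  (:return-type ffi:int))

(defun call-process-utf8 (string)
  ;; Encode STRING as UTF-8 into a foreign buffer; BYTES is the exact
  ;; byte count, so the n:m problem never reaches the FFI type system.
  (ffi:with-foreign-string (fv chars bytes string
                               :encoding charset:utf-8
                               :null-terminated-p nil)
    (declare (ignore chars))
    (process-utf8 (ffi:foreign-address fv) bytes)))
```

The point is that the Lisp side computes the byte count itself, so c-pointer plus an explicit length replaces any (c-array character N) guesswork.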
There could be other solution paths: just make c-string :in work (the most common case) and have other uses handled via with-foreign-object (admittedly ugly).
Or provide a macro that provides an n:m encoding def-call-out facility and expands to variable-array and string code (using with-foreign-object/with-foreign-string).