The manpages do not clearly state that when binary data
is read in, e.g. with [fconfigure -encoding binary],
the string representation is such that each character
in the string corresponds to exactly one octet of the
original data. The same applies to [binary format] and
anything else that generates binary data.
The reason this is not self-evident is that one might
equally assume that when binary data is accessed as a
string, no such one-to-one mapping is made, and that
two octets could instead combine into a single Unicode
character.
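A minimal Tcl sketch of the one-character-per-octet mapping described above (the octet values are arbitrary, chosen only so that a UTF-8 reading would give a different answer):

```tcl
# Build three octets of binary data: 0xC3 0xA9 0x41.
set data [binary format c3 {0xC3 0xA9 0x41}]

# One character per octet, so the string length is 3 --
# not 2, which a UTF-8 interpretation (0xC3 0xA9 = "é",
# followed by "A") would yield.
puts [string length $data]            ;# prints 3

# Each character's code point equals the octet value.
puts [scan [string index $data 0] %c] ;# prints 195 (0xC3)
```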
I am not quite sure where the correct place to document
this behaviour is, since it applies to all binary data
(or anything that is internally a ByteArray object).
Still, this relationship should be made clear.
Logged In: YES
user_id=79902
Already fixed in HEAD. I can't be bothered to backport. :)