From: <je...@fo...> - 2025-07-08 12:42:46
On 2025-07-08 03:41, Enno Rehling wrote:
> I believe the \u escape sequence is strictly defined for 16-bit
> unicode characters only. Try using \U:
>
> char *emoji = "\U0001F92A";
>
> Enno.

Be aware that FXString::unescape(), at this time, does not interpret \UXXXXXXXX, only \uXXXX. It does, however, know about surrogate pairs, so it will convert a pair of escaped surrogates into the proper UTF-8 representation. unescape() remembers the previously unescaped character; when it processes the trailing half of a surrogate pair and the prior character was the leading half, the emitted UTF-8 is the combined wide character.

Maybe I should add the \UXXXXXXXX very-wide-character decode capability as well, since it makes sense to add it now...

--
JVZ