| Summary: | Can't render characters with codepoints greater than 2^16 | ||
|---|---|---|---|
| Product: | SDL_ttf | Reporter: | heinrietman |
| Component: | misc | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | ||
| Priority: | P2 | ||
| Version: | 2.0.13 | ||
| Hardware: | x86_64 | ||
| OS: | Linux | ||
| Attachments: | Test case | ||
I looked into this, and FreeType itself is rejecting the character:

```
frame #0: 0x00000001002d6991 libfreetype.6.dylib`tt_cmap4_char_index(cmap=0x0000000100609970, char_code=128513) at ttcmap.c:1465
   1462                         FT_UInt32  char_code )
   1463    {
   1464      if ( char_code >= 0x10000UL )
-> 1465        return 0;
```
Even if I remove that check, the lookup still fails: the character really isn't in that font character map.
If I open the font in Font Book, I can see the character, so I'm not sure how to get FreeType to see it.

I found the right character map to find the character; a fix is on the way!

Fixed, thanks! https://hg.libsdl.org/SDL_ttf/rev/31a3181ae289
Created attachment 2866 [details]
Test case

I was trying to create a program to render emoji using SDL_ttf, but it didn't work. After investigating the source code of SDL_ttf I figured out that codepoints are represented using Uint16, which cannot represent the code points in the range used by emoji. I attached a small test program which illustrates the problem.