(If this topic already exists, I am sorry for repeating it.)
Hi, look at the following code:
CEGUI::utf8* insertGlyphs = (CEGUI::utf8*)"CEGUI世界你好";
CEGUI::Font* font = CEGUI::System::getSingleton().getDefaultFont();
try
{
    CEGUI::String glyphSets(font->getAvailableGlyphs());
    glyphSets += insertGlyphs;
    font->defineFontGlyphs(glyphSets);
}
catch (CEGUI::Exception& e)
{
    // skip the exception and go on
}
The program does not work nicely: characters get lost on the widget, and the exception says:
Exception: Imageset::defineImage - An image with the name 'C' already exists in Imageset 'SimHei-10_auto_glyph_images'.
........
So when some characters already exist, how can I insert glyphs into the Font dynamically, for example while chatting?
Error : using Font
- CrazyEddie
- CEGUI Project Lead
Re: Error : using Font
Each codepoint passed in the supplied string must be unique. There's nothing to be gained from specifying the same codepoint twice for the same font.
CE.
Re: Error : using Font
If there are only a few codepoints, I can define a set of glyphs up front, but with many more codepoints that no longer works, so I must cache the codepoints dynamically.
That is when the duplicates appear. If I initialise all the codepoints at once, it takes a long time and wastes a lot of space. It is a pain.
What can I do? Could you give me some suggestions?
Thanks.
- CrazyEddie
- CEGUI Project Lead
Re: Error : using Font
Before adding a codepoint, just check whether it already exists; if it does, do not add it again.
CE.
Re: Error : using Font
So I should build a manager to handle this, right?
By the way, when I turn on the input method, the character I catch arrives as char data occupying two bytes (16 bits, such as '好'). I have not found a method to transform it into a codepoint; I would appreciate it if you could help me.
Anyway, thanks.
- CrazyEddie
- CEGUI Project Lead
Re: Error : using Font
A full-blown manager might be overkill; all you really need is a single method that takes a CEGUI::Font pointer and a CEGUI::String as input. The method gets the currently available glyphs from the font, compares each codepoint in the string you're adding with what's currently available, adds any codepoint not already in the available string, and then submits the final set back to the font for creation:
Code:
void addFontGlyphs(CEGUI::Font* font, const CEGUI::String& new_glyphs)
{
    using namespace CEGUI;

    // get glyphs currently available.
    String glyphset(font->getAvailableGlyphs());

    // for each glyph in the given set which we may have to add
    for (String::size_type i = 0; i < new_glyphs.length(); ++i)
    {
        // see if the glyph is already defined for the font
        if (glyphset.find(new_glyphs[i]) == String::npos)
        {
            // glyph was not defined, so add it to the set which we
            // will submit back to the font.
            glyphset.append(new_glyphs[i]);
        }
    }

    // submit the new glyph set back to the font for creation.
    font->defineFontGlyphs(glyphset);
}
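For example, a hypothetical call for the snippet from the original post, assuming the default font is set and the string literal is UTF-8 encoded:
Code:
CEGUI::Font* font = CEGUI::System::getSingleton().getDefaultFont();
addFontGlyphs(font, CEGUI::String((CEGUI::utf8*)"CEGUI世界你好"));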
This is not the only approach you could use for this, but it is just a simple example.
Where are you getting your input from? You did not say. If it is via Win32 messages, then the following needs to be considered.
Unicode in Win32 is not as wonderful as they'd have you believe, since you're mostly stuck with using a single UTF-16 code unit to represent any codepoint; this immediately chucks a whole load of supplementary-plane glyphs straight out of the window.
If you want to use Unicode at all, you must set up and compile your application as a Unicode application, as opposed to an ANSI application. The difference is that with Unicode you'll get proper UTF-16 (or UTF-32, more on that in a minute) standard codepoint values, but with ANSI the values you get will come from some locale-determined code page and so will require translation into UTF-32 (which is beyond the scope of what I can go into here).
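A minimal sketch of one way to do that translation on Win32; ansiToUtf32 here is an illustrative helper, not CEGUI API, and it assumes the bytes form exactly one character in the active ANSI code page, with a result that fits in a single UTF-16 code unit (true for the CJK code pages):
Code:
#include <windows.h>

// Illustrative helper (not part of CEGUI): translate one ANSI/DBCS
// character into a UTF-32 codepoint via the active code page (CP_ACP).
CEGUI::utf32 ansiToUtf32(const char* bytes, int len)
{
    WCHAR wc = 0;
    MultiByteToWideChar(CP_ACP, 0, bytes, len, &wc, 1);
    return (CEGUI::utf32)wc;
}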
Once you have the app set up properly, you have two options for getting these inputs: WM_CHAR and WM_UNICHAR.
WM_CHAR will give you a UTF-16 codepoint. I'm not entirely sure whether they use the low 16 bits only, or whether they use the full 32 bits, making use of surrogate pairs where required (the MS docs are particularly sketchy on this, so I assume you're stuck with the single 16-bit code unit). If surrogate pairs are used, though, then you'll need to detect and decode them (see http://www.unicode.org for info).
WM_UNICHAR on Windows XP gives you a UTF-32 codepoint, which can be passed directly to CEGUI.
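As a rough sketch, here is how both messages could be handled in a window procedure; it assumes CEGUI::System::injectChar(utf32) as the injection point, and the surrogate-pair decoding covers the case where WM_CHAR does deliver pairs:
Code:
case WM_CHAR:
{
    static WCHAR high_surrogate = 0;
    const WCHAR cu = (WCHAR)wParam;

    if (cu >= 0xD800 && cu <= 0xDBFF)      // high surrogate: remember it
    {
        high_surrogate = cu;
    }
    else if (cu >= 0xDC00 && cu <= 0xDFFF) // low surrogate: combine the pair
    {
        if (high_surrogate != 0)
        {
            const CEGUI::utf32 cp = 0x10000 +
                (((CEGUI::utf32)(high_surrogate - 0xD800)) << 10) +
                (CEGUI::utf32)(cu - 0xDC00);
            CEGUI::System::getSingleton().injectChar(cp);
            high_surrogate = 0;
        }
    }
    else                                   // ordinary BMP codepoint
    {
        CEGUI::System::getSingleton().injectChar((CEGUI::utf32)cu);
    }
    return 0;
}

case WM_UNICHAR:
    if (wParam == UNICODE_NOCHAR)          // probe: report that we handle it
        return TRUE;
    CEGUI::System::getSingleton().injectChar((CEGUI::utf32)wParam);
    return 0;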
CE.
Re: Error : using Font
I have made it work, thanks.
The only thing left is how to use WM_CHAR; for other reasons, I will not use WM_UNICHAR.
I'll go to http://www.unicode.org for some info. Thanks.
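For reference, a rough sketch of that WM_CHAR route in an ANSI build; this is an illustration with assumptions, not from the thread itself: in ANSI windows a DBCS character arrives as two consecutive WM_CHAR messages (lead byte, then trail byte), and injectChar is again assumed as the CEGUI entry point:
Code:
#include <windows.h>

// Hypothetical WM_CHAR handler for an ANSI build: buffer the DBCS lead
// byte, then convert the completed byte sequence via the active code page.
case WM_CHAR:
{
    static BYTE lead_byte = 0;          // pending DBCS lead byte, if any
    const BYTE b = (BYTE)wParam;
    char mb[2];
    int mb_len;

    if (lead_byte)                      // second half of a DBCS pair
    {
        mb[0] = (char)lead_byte;
        mb[1] = (char)b;
        mb_len = 2;
        lead_byte = 0;
    }
    else if (IsDBCSLeadByte(b))         // first half: wait for the trail byte
    {
        lead_byte = b;
        return 0;
    }
    else                                // plain single-byte character
    {
        mb[0] = (char)b;
        mb_len = 1;
    }

    // translate from the active ANSI code page to a UTF-16 code unit,
    // which for BMP characters is also the UTF-32 codepoint value.
    WCHAR wc = 0;
    if (MultiByteToWideChar(CP_ACP, 0, mb, mb_len, &wc, 1) == 1)
        CEGUI::System::getSingleton().injectChar((CEGUI::utf32)wc);
    return 0;
}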