How can I display 2-byte (DBCS) characters?
Korean, Japanese, and Chinese characters are encoded with 2-byte codes, but OpenGL seems to display only single-byte characters. Is that right? Is it impossible to display 2-byte characters on Windows?
Thanks for reading, and sorry for my unskilled English.
If you find no other way and are desperate, you could always just create the characters in a paint program and make a texture out of them... That way you can at least have them on screen somehow.
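In case it is useful, here is a rough immediate-mode sketch of that texture route. It assumes the painted glyph image has already been uploaded as an OpenGL texture; the glyphTexture handle and the quad coordinates are just placeholders for illustration.

#include <windows.h>
#include <GL/gl.h>

// Draw one pre-painted glyph image as a textured quad.
void DrawGlyphQuad(GLuint glyphTexture)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, glyphTexture);

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(-0.1f, -0.1f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f( 0.1f, -0.1f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f( 0.1f,  0.1f);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(-0.1f,  0.1f);
    glEnd();

    glDisable(GL_TEXTURE_2D);
}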
S.
October 19, 2000 07:55 PM
Under NT, you can use UNICODE strings with the wgl font functions... you can't use DBCS, and you can't use either in Win98.
Soooo....to get DBCS strings in NT, use MultiByteToWideChar() to convert the DBCS string to UNICODE. Then use the UNICODE string in calls to wgl font funcs and the associated glCallLists() call.
So:
1) Convert DBCS character set to UNICODE using MultiByteToWideChar.
2) Build the display lists using wglUseFontWhatever. You need to know the UNICODE index range for the DBCS character set for the "first" and "count" parameters of wglUseFontOutlines.
3) Set the display list base to the same base used in step 2 using glListBase.
4) Convert the output string from DBCS to UNICODE using MultiByteToWideChar.
5) Call glCallLists with the UNICODE string. (A rough code sketch of these steps follows below.)
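Under NT/2000 that boils down to something like the following. This is only a minimal, untested sketch: it assumes an HDC with a DBCS-capable TrueType font already selected into it and a current OpenGL rendering context, and it uses the Unicode Hangul syllable range (0xAC00-0xD7A3) purely as an example for "first" and "count"; substitute whatever range your character set actually needs. wglUseFontBitmapsW is shown, but wglUseFontOutlinesW works the same way.

#include <windows.h>
#include <GL/gl.h>

void DrawDbcsString(HDC hdc, const char* dbcsText)
{
    // Step 1: pick the UNICODE range your DBCS set maps to.
    // The Hangul syllable block is used here only as an example.
    const int first = 0xAC00;
    const int count = 0xD7A3 - 0xAC00 + 1;

    // Step 2: build one display list per glyph from the selected font.
    GLuint base = glGenLists(count);
    wglUseFontBitmapsW(hdc, first, count, base);

    // Step 3: offset the list base so each UNICODE value indexes its own list.
    glListBase(base - first);

    // Step 4: convert the DBCS output string to UNICODE
    // (CP_ACP = the system ANSI/DBCS code page).
    wchar_t wide[256];
    int len = MultiByteToWideChar(CP_ACP, 0, dbcsText, -1, wide, 256) - 1;

    // Step 5: draw; each 16-bit character selects the display list built for it.
    if (len > 0)
    {
        glRasterPos2f(-0.9f, 0.0f);
        glCallLists(len, GL_UNSIGNED_SHORT, wide);
    }

    glDeleteLists(base, count);
}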