Tags: c#, unicode, c++-cli, unicode-literals

Unicode string literals in C# vs C++/CLI


C#:
char z = '\u201D';
int i = (int)z;

C++/CLI:
wchar_t z = '\u201D';
int i = (int)z;

In C#, "i" becomes 8221 (0x201D), just as I expect. In C++/CLI, on the other hand, it becomes 65428 (0xFF94). Can some kind soul explain this to me?

EDIT: The size of wchar_t cannot be the issue here, because:

C++/CLI:
wchar_t z = (wchar_t)8221;
int i = (int)z;

Here too, i becomes 8221, so wchar_t is indeed capable of holding a 16-bit value on my system.
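A quick native C++ sketch makes that check concrete (the same holds when the file is compiled with MSVC's /clr, where wchar_t is a 16-bit unsigned type):

#include <cstdio>

int main() {
    // Under MSVC, wchar_t is 2 bytes, so 0x201D (8221) fits without loss.
    std::printf("sizeof(wchar_t) = %u\n", (unsigned)sizeof(wchar_t));

    wchar_t z = (wchar_t)8221;
    int i = (int)z;
    std::printf("i = %d\n", i);  // prints 8221, matching the EDIT above
}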


Solution

  • You want:

    wchar_t z = L'\x201D';
    

    (L'\u201D' works too.) The real problem is the missing L prefix: without it, '\u201D' is a narrow character literal, so the compiler converts U+201D into the implementation-defined execution character set instead of storing the code point itself.
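
    Where does 65428 come from? It is consistent with MSVC using a Windows-1252 execution character set: U+201D (the right double quotation mark) is 0x94 in CP1252, char is signed there, so the narrow literal evaluates to -108, and converting -108 to a 16-bit unsigned wchar_t wraps modulo 2^16 to 65536 - 108 = 65428 (0xFF94). A hedged native C++ sketch of both the failure and the fix (it assumes MSVC on Windows with a CP1252 execution charset, which matches the values in the question):

    #include <cstdio>

    int main() {
        // Narrow literal: U+201D is mapped into the execution charset.
        // In CP1252 the right double quote is 0x94; char is signed here,
        // so the stored value is -108.
        char narrow = '\u201D';
        std::printf("narrow = %d\n", (int)narrow);                // -108

        // Converting -108 to an unsigned 16-bit wchar_t wraps mod 2^16:
        // 65536 - 108 = 65428 (0xFF94), the value from the question.
        wchar_t bad = '\u201D';
        std::printf("bad    = %d (0x%X)\n", (int)bad, (unsigned)bad);

        // Wide literals store the code point directly.
        wchar_t good1 = L'\x201D';
        wchar_t good2 = L'\u201D';
        std::printf("good   = %d, %d\n", (int)good1, (int)good2); // 8221, 8221
    }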