Tags: c++, string, c++11, unicode, wstring

Convert from std::wstring to std::string


I'm converting a wstring to a string with std::codecvt_utf8 as described in this question, but when I try it with Greek or Chinese alphabet symbols the result is corrupted. I can see it in the debugger's Locals window; for example, 日本 became "æ—¥æœ¬".

#include <codecvt>  // std::codecvt_utf8
#include <locale>   // std::wstring_convert

std::wstring_convert<std::codecvt_utf8<wchar_t>> myconv; // also tried codecvt_utf8_utf16
std::string str = myconv.to_bytes(wstr);

What am I doing wrong?


Solution

  • std::string simply holds an array of bytes. It does not hold information about the encoding in which these bytes are supposed to be interpreted, nor do the standard library functions or std::string member functions generally assume anything about the encoding. They handle the contents as just an array of bytes.

    Therefore when the contents of a std::string need to be presented, the presenter needs to make some guess about the intended encoding of the string, if that information is not provided in some other way.
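
    A minimal sketch of that point, assuming pre-C++20 (where u8 literals yield plain char data): std::string reports its length in bytes, so the UTF8 encoding of 日本 occupies six bytes even though it represents only two characters.

    #include <iostream>
    #include <string>

    int main() {
        // The u8 literal stores the UTF8 bytes; std::string carries
        // them without knowing (or caring) that they are UTF8.
        std::string s = u8"日本";          // six bytes: E6 97 A5 E6 9C AC
        std::cout << s.size() << '\n';    // prints 6, not 2
    }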

    I am assuming that the encoding you intend to convert to is UTF8, given that you are using std::codecvt_utf8.

    But if you are using Visual Studio, the debugger simply assumes one specific encoding, at least by default. That encoding is not UTF8; it is most likely Windows code page 1252.

    As verification, Python gives the following:

    >>> '日本'.encode('utf8').decode('cp1252')
    'æ—¥æœ¬'
    

    Your string does seem to be the UTF8 encoding of 日本 interpreted as if it were cp1252 encoded.

    Therefore the conversion seems to have worked as intended.
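
    One way to check this on your side is to dump the raw bytes that to_bytes produces instead of trusting the debugger display. A minimal sketch under the same setup as your snippet (std::codecvt_utf8 is deprecated since C++17 but still available):

    #include <codecvt>
    #include <iomanip>
    #include <iostream>
    #include <locale>
    #include <string>

    int main() {
        std::wstring wstr = L"日本";
        std::wstring_convert<std::codecvt_utf8<wchar_t>> myconv;
        std::string str = myconv.to_bytes(wstr);
        // A correct conversion prints the UTF8 sequence: E6 97 A5 E6 9C AC
        for (unsigned char c : str)
            std::cout << std::hex << std::uppercase << std::setw(2)
                      << std::setfill('0') << int(c) << ' ';
        std::cout << '\n';
    }

    If those bytes come out as expected, the conversion itself is fine and only the display of the result is misleading.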


    As mentioned by @MarkTolonen in the comments, the encoding the Visual Studio debugger assumes for a string variable can be set to UTF8 with the s8 format specifier, as explained in the documentation.
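
    For example, appending the format specifier to the variable name in a Watch window tells the debugger to decode the bytes as UTF8, so the string displays as 日本 again:

    str,s8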