I'm handling a lot of Unicode file paths in my C++ project. In my code I check whether a path can be represented as a multibyte string; if it can, I keep it in a normal std::string variable, otherwise I keep it as a wide-character string.
My question is whether I can use the paths entirely as wstrings. Would it affect performance? I have to do some string manipulation, and open, create, rename and delete files with the wstring. So rather than checking multibyte vs. wide-character each time, I would like to use the path directly as a wstring, which would save me a lot of if/else (a rough sketch of what I mean follows the code below).
#include <cstdlib>   // wcstombs
#include <cstring>   // memset
#include <string>

// Returns true when the path cannot be represented in the current
// ANSI/multibyte code page, i.e. it has to stay a wide string.
bool IsUnicodeWString(const std::wstring &_WStr)
{
    const wchar_t* posUnicodePath = _WStr.c_str();

    // Dry run: ask wcstombs for the required buffer size.
    // It returns (size_t)-1 if any character has no multibyte mapping.
    size_t multiByteLen = wcstombs(NULL, posUnicodePath, 0);
    if (multiByteLen == (size_t)-1)
    {
        return true;   // Unicode-only path
    }

    // Real conversion into a narrow (char) buffer as a double check.
    char* tmpBuffer = new char[multiByteLen + 1];
    memset(tmpBuffer, '\0', multiByteLen + 1);
    size_t tempLength = wcstombs(tmpBuffer, posUnicodePath, multiByteLen + 1);
    delete[] tmpBuffer;

    return tempLength == (size_t)-1;   // true => Unicode-only path
}
if (IsUnicodeWString(path))   // 'path' is the std::wstring file path being tested
{
    // Use wstring [operations: string manipulation; file path used for
    // Open, Read, Write, Create, Delete, Rename, etc.]
}
else
{
    // Use string [same operations, on the multibyte path]
}
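For illustration, here is a minimal sketch of the "keep it only as wstring" approach I have in mind, assuming Windows and the wide-character CRT; AppendSuffixAndOpen is just a made-up name for this example:

#include <cstdio>
#include <string>

// Minimal sketch, assuming the path is always kept as std::wstring:
// both the string manipulation and the file open stay in wide characters.
// 'AppendSuffixAndOpen' is a hypothetical helper, not real code from my project.
FILE* AppendSuffixAndOpen(std::wstring path)
{
    path += L".bak";                       // wide string manipulation
    return _wfopen(path.c_str(), L"rb");   // wide-character CRT open
}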
Please share your thoughts ...
In Windows, try to use wchar_t as much as possible, because it is the default character representation in Windows; the kernel also uses wchar_t internally. All of the ANSI ("A") APIs are wrappers around the Unicode ("W") APIs; if you disassemble an ANSI API you will see this for yourself.
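For example, here is a minimal sketch (the wrapper names RenamePath and RemovePath are made up) of calling the wide-character "W" APIs directly with std::wstring, so no ANSI conversion is ever involved:

#include <windows.h>
#include <string>

// Rename a file using the wide-character Win32 API directly.
bool RenamePath(const std::wstring &oldPath, const std::wstring &newPath)
{
    // MoveFileW takes const wchar_t* paths, which c_str() provides.
    return MoveFileW(oldPath.c_str(), newPath.c_str()) != FALSE;
}

// Delete a file using the wide-character Win32 API directly.
bool RemovePath(const std::wstring &path)
{
    return DeleteFileW(path.c_str()) != FALSE;
}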
Also, use ATL::CString instead of std::(w)string if possible, because it uses reference counting and the size of the class equals the pointer size (4 bytes on 32-bit, 8 bytes on 64-bit). That means you can return an ATL::CString directly from a function without a performance penalty.
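A minimal sketch of returning a CString by value (BuildLogPath is a hypothetical helper); per the reference-counting behaviour described above, and return-value optimization in any case, the return does not deep-copy the character data:

#include <atlstr.h>

// Join a directory and a file name and return the result by value.
CStringW BuildLogPath(const CStringW &directory, const CStringW &fileName)
{
    CStringW path = directory;
    if (path.GetLength() > 0 && path[path.GetLength() - 1] != L'\\')
    {
        path += L'\\';   // ensure a trailing backslash before appending
    }
    path += fileName;
    return path;   // cheap return: the internal buffer is shared, not copied
}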