When allocating an empty BSTR, either by SysAllocString(L"") or by SysAllocStringLen(str, 0), you always get a new BSTR (at least in the test I made). BSTRs aren't typically shared (unlike Java/.NET string interning) since they are mutable, but an empty string is, for all intents and purposes, immutable.
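
For reference, a minimal check along those lines might look like this (assuming a Windows build linked against OleAut32); it just compares the pointers returned by two empty allocations:

```cpp
#include <windows.h>
#include <oleauto.h>
#include <cstdio>

int main()
{
    BSTR a = SysAllocString(L"");
    BSTR b = SysAllocStringLen(L"", 0);

    // If empty BSTRs were interned, both calls would hand back the same pointer.
    std::printf("a = %p, b = %p -> %s\n",
                static_cast<void*>(a), static_cast<void*>(b),
                a == b ? "shared" : "distinct allocations");

    SysFreeString(a);
    SysFreeString(b);
    return 0;
}
```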
My question (at long last) is: why doesn't COM use the trivial optimization of always returning the same string when creating an empty BSTR (and ignoring it in SysFreeString)? Is there a compelling reason not to do so (because my reasoning is flawed), or is it just that it wasn't thought to be important enough?
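
For concreteness, here is a hedged sketch of what such an interning scheme could look like. SharedEmptyBstr, AllocString, and FreeString are hypothetical names used purely for illustration; this is not how OleAut32 actually behaves:

```cpp
#include <windows.h>
#include <oleauto.h>

namespace hypothetical {

// One process-wide empty BSTR laid out the way a real BSTR is:
// a 4-byte length prefix holding 0, followed by the terminating L'\0'.
struct EmptyBstrStorage {
    UINT length = 0;            // length prefix, in bytes
    OLECHAR terminator = L'\0'; // empty payload
};
static EmptyBstrStorage g_empty;

inline BSTR SharedEmptyBstr()
{
    // A BSTR points just past its length prefix, i.e. at the characters.
    return &g_empty.terminator;
}

// Hypothetical replacements for SysAllocString/SysFreeString that intern
// the empty string.
inline BSTR AllocString(const OLECHAR* psz)
{
    if (psz == nullptr || *psz == L'\0')
        return SharedEmptyBstr();   // hand out the shared empty instance
    return ::SysAllocString(psz);   // fall back to the real allocator
}

inline void FreeString(BSTR bstr)
{
    if (bstr == SharedEmptyBstr())
        return;                     // ignore frees of the shared instance
    ::SysFreeString(bstr);
}

} // namespace hypothetical
```

The sketch assumes every caller releases through the matching FreeString; code that passes the shared instance to other cleanup paths would have to tolerate it as well.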
I'd guess (and yes, it's only a guess) that this optimization wasn't deemed important enough to perform.
While memory consumption was a major factor in API design for much of Windows' history (cf. Raymond Chen's articles), the benefits here are rather small compared with Java's or .NET's string interning, since they apply only to a single string that is a mere six bytes long. And how many empty strings does a program have to keep in memory at any single point in time? Is that number large enough to warrant the optimization, or is it actually negligible?
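
(A quick sketch of the arithmetic behind the six-byte figure, heap allocator overhead not included: an empty BSTR is a 4-byte length prefix holding 0 plus a 2-byte terminating L'\0'.)

```cpp
#include <windows.h>
#include <oleauto.h>
#include <cstdio>

int main()
{
    BSTR empty = SysAllocString(L"");
    // SysStringByteLen reads the length prefix: 0 bytes of payload.
    std::printf("payload bytes: %u\n", SysStringByteLen(empty));
    // Minimum storage: 4-byte length prefix + 2-byte terminator.
    std::printf("minimum footprint: %zu bytes\n",
                sizeof(UINT) + sizeof(OLECHAR)); // prints 6
    SysFreeString(empty);
    return 0;
}
```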