string y = null;
string x = y + null; // empty string
and
string x = null + null; // doesn't compile
Question: why this inconsistent behaviour, and why doesn't null + null result in null?
In the expression string x = null + null; the type of the null literals is unknown, so the compiler can't choose the correct overload of operator + to apply, and it reports an ambiguity error (CS0034: "Operator '+' is ambiguous on operands of type '<null>' and '<null>'").
To get the equivalent of the first example from the question, you should write string x = (string)null + null; instead. The cast gives the compiler the type it needs, so this compiles and returns the empty string.
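A minimal sketch showing all three cases side by side (variable names are just for illustration):

```csharp
using System;

class Program
{
    static void Main()
    {
        string y = null;
        string x1 = y + null;            // compiles: x1 is the empty string
        // string x2 = null + null;      // CS0034: operator '+' is ambiguous
        string x3 = (string)null + null; // compiles: the cast fixes the operand type

        Console.WriteLine(x1.Length);          // 0
        Console.WriteLine(x3.Length);          // 0
        Console.WriteLine(x1 == string.Empty); // True
    }
}
```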
As for why (string)null + null is not null but the empty string: string concatenation uses the String.Concat() method under the hood, and the docs for that method explicitly say that it always returns a non-null string, substituting String.Empty for any null arguments. So Concat(null, null) becomes Concat(String.Empty, String.Empty) and returns String.Empty.
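You can observe the same behaviour by calling String.Concat directly (the casts are needed only to pick the string overload over the object one):

```csharp
using System;

class ConcatDemo
{
    static void Main()
    {
        string result = string.Concat((string)null, (string)null);

        Console.WriteLine(result == null);         // False
        Console.WriteLine(result == string.Empty); // True

        // String interpolation follows the same rule: null renders as empty.
        Console.WriteLine($"[{(string)null}]");    // []
    }
}
```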
Personally, I think this is a good design decision, since we don't need to check for null after every string operation. It is much the same as the guidance for APIs that return collections: the best-practice recommendation is to never return null, but to return an empty collection instead.
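For comparison, here is that collection guideline applied to a small, hypothetical service (the names are invented for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class OrderService
{
    private readonly Dictionary<int, List<string>> _orders = new();

    // Returns an empty sequence instead of null when the customer is unknown,
    // so callers can always iterate without a null check.
    public IEnumerable<string> GetOrders(int customerId) =>
        _orders.TryGetValue(customerId, out var orders)
            ? orders
            : Enumerable.Empty<string>();
}

class Demo
{
    static void Main()
    {
        var service = new OrderService();

        // Safe to enumerate even for an unknown customer:
        // zero iterations, no NullReferenceException.
        foreach (var order in service.GetOrders(42))
            Console.WriteLine(order);

        Console.WriteLine(service.GetOrders(42).Count()); // 0
    }
}
```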