In my own code, and in numerous mailing-list postings, I've noticed confusion caused by `Nothing` being inferred as the least upper bound of two other types (a sketch of the sort of thing I mean follows below).
The answer may be obvious to you*, but I'm lazy, so I'm asking you*:
Under what conditions is inferring `Nothing` in this way the most desirable outcome?
Would it make sense to have the compiler throw an error in these cases, or a warning unless overridden by some kind of annotation?
* Plural
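
A hypothetical sketch of the sort of code where I see `Nothing` inferred (assuming Scala 2; `NothingConfusion` and the quoted error text are illustrative, not from any specific thread):

```scala
import scala.collection.mutable

object NothingConfusion {
  // With no elements to constrain them, the compiler infers the bottom
  // type Nothing for both type parameters.
  val m = mutable.Map() // m: mutable.Map[Nothing, Nothing]

  // Trying to add an entry then fails with a puzzling error, roughly:
  //   type mismatch; found: (Int, String), required: (Nothing, Nothing)
  // m += (1 -> "one")
}
```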
It's impossible to infer `Nothing` as the least upper bound of two types unless those two types are also both `Nothing`. When you infer the least upper bound of two types that have nothing in common, you'll get `Any`. (In most such cases you'll get `AnyRef`, though, because you'll only get `Any` when a value type like `Int` or `Long` is involved.)
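
To make this concrete, here's a minimal sketch of the behavior described above (assuming Scala 2, where an actual least upper bound is computed rather than a union type as in Scala 3; `Cat`, `Dog`, and `LubDemo` are made-up names):

```scala
import scala.util.Random

object LubDemo extends App {
  class Cat
  class Dog

  val flag = Random.nextBoolean()

  // Cat and Dog share no supertype other than AnyRef, so the least
  // upper bound of the two branches is inferred as AnyRef.
  val pet = if (flag) new Cat else new Dog // pet: AnyRef

  // Mixing in a value type (Int) pushes the least upper bound up to Any.
  val mixed = if (flag) 42 else new Dog // mixed: Any

  // Nothing appears as a least upper bound only when both sides are
  // already Nothing, e.g. when both branches throw.
  def boom: Nothing = if (flag) throw new Exception else throw new Error

  println(pet.getClass)   // e.g. class LubDemo$Cat
  println(mixed.getClass) // e.g. class java.lang.Integer
}
```

In the Scala 2 REPL, `:type if (flag) new Cat else new Dog` prints the inferred type directly, which is a quick way to check least upper bounds like these.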