I ran into an interesting situation. I wanted to implement something that looked like the following.
object Test {
  abstract class Key[A]

  class Constraint[-A] {
    def doSomething(a: A): String = ""
  }

  object DesiredKeyConstraints {
    case class KeyConstraint[A](key: Key[A], constraint: Constraint[A])
    val data: Map[Key[_], KeyConstraint[_]] = Map()
  }

  def useTheKeyConstraints[A](key: Key[A], value: A): String = {
    DesiredKeyConstraints.data.get(key).fold[String]("") {
      case DesiredKeyConstraints.KeyConstraint(_, constraint) => constraint.doSomething(value)
    }
  }

  def main(args: Array[String]): Unit = {
    println("hi")
  }
}
Unfortunately, when I pull a KeyConstraint out of the map, I no longer know its type, so when I try to call doSomething, the types don't check out. This all seems to behave as expected. What was interesting is that elsewhere in the codebase, we have something that looks like the following (replacing DesiredKeyConstraints with WorkingKeyConstraints):
object Test {
  abstract class Key[A]

  class Constraint[-A] {
    def doSomething(a: A): String = ""
  }

  object WorkingKeyConstraints {
    sealed trait SuperTrait[A, B] {
      val key: Key[A]
    }
    case class KeyConstraint[A](key: Key[A], constraint: Constraint[A]) extends SuperTrait[A, Unit]
    val data: Map[Key[_], SuperTrait[_, _]] = Map()
  }

  def useTheKeyConstraints[A](key: Key[A], value: A): String = {
    WorkingKeyConstraints.data.get(key).fold[String]("") {
      case WorkingKeyConstraints.KeyConstraint(_, constraint) => constraint.doSomething(value)
    }
  }

  def main(args: Array[String]): Unit = {
    println("hi")
  }
}
This one compiles and runs just fine. For some reason, having the super-type means that when we extract the KeyConstraint from the Map, the compiler treats it as a KeyConstraint[Any] rather than a KeyConstraint[_]. Because Constraint is contravariant, we can treat a Constraint[Any] as a Constraint[A], and so the code compiles. The key question here is: why does having the super-type cause the type-checker to treat it as a KeyConstraint[Any]?

Also, as further information, I played around with this some more, and it is specific to having a super-type with two generic type parameters. If I give the child class two type parameters, or the parent a single type parameter, it still fails. See my other failed attempts below:
object AnotherCaseThatDoesntWorkKeyConstraints {
  case class KeyConstraint[A, B](key: Key[A], constraint: Constraint[A])
  val data: Map[Key[_], KeyConstraint[_, _]] = Map()
}

object AThirdCaseThatDoesntWorkKeyConstraints {
  sealed trait SuperTrait[A] {
    val key: Key[A]
  }
  case class KeyConstraint[A](key: Key[A], constraint: Constraint[A]) extends SuperTrait[A]
  val data: Map[Key[_], SuperTrait[_]] = Map()
}
I assume this is some sort of bug in the Scala type checker, but perhaps I am missing something.
tl;dr Type erasure and pattern matching
Typing the Map with SuperTrait concealed information about the type, and caused the pattern matching to assume a broad type for your extractor.

Here is a similar example, but using Any instead of your SuperTrait. This example also shows how to produce a runtime exception out of it.
case class Identity[A : Manifest]() {
  def apply(a: A) = a match { case a: A => a } // seemingly safe no-op
}

val myIdentity: Any = Identity[Int]()
myIdentity match {
  case f @ Identity() => f("string") // uh-oh, passed String instead of Int
}
This throws an exception:

scala.MatchError: string (of class java.lang.String)
  at Identity.apply(...)

The pattern f @ Identity() matches the Any as an Identity[Any], and due to type erasure, this matched the Identity[Int], which turned into the error.
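As an aside, the erased check can be made real by keeping the runtime class around. This is a hypothetical variant (SafeIdentity and safeApply are my names, not from the example above) where the ClassTag context bound makes `case a: A` compile into an actual runtime class test instead of an erased, always-true one:

```scala
import scala.reflect.ClassTag

case class SafeIdentity[A : ClassTag]() {
  // With a ClassTag in scope, `case a: A` performs a real runtime
  // class check, so a mistyped argument falls through to None
  // instead of slipping past erasure.
  def safeApply(a: Any): Option[A] = a match {
    case a: A => Some(a)
    case _    => None
  }
}

val id = SafeIdentity[Int]()
id.safeApply(42)       // Some(42)
id.safeApply("string") // None
```

Note the trade-off: safeApply accepts Any, so the compiler no longer rejects the bad call; the check has simply moved to runtime, where it fails gracefully.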
In contrast, if we change Any to Identity[_],

case class Identity[A : Manifest]() {
  def apply(a: A) = a match { case a: A => a }
}

val myIdentity: Identity[_] = Identity[Int]()
myIdentity match {
  case f @ Identity() => f("string")
}
this correctly fails to compile:

found   : String("string")
required: _$1 where type _$1
  case f @ Identity() => f("string")
It knows that f is the existential type Identity[T] forSome { type T }, and it can't show that String conforms to the wildcard type T.
In the first example, you were effectively pattern matching as

DesiredKeyConstraints.KeyConstraint[Any](_, constraint)

In the second, there was more information, and you were matching as

DesiredKeyConstraints.KeyConstraint[T](_, constraint) forSome { type T }

(This is just illustrative; you can't actually write type parameters in a pattern match.)
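If the goal is simply to make the DesiredKeyConstraints version work without leaning on this inference quirk, one common workaround is to hide a single asInstanceOf behind a typed get, justified by the invariant that the map only ever pairs a Key[A] with a KeyConstraint[A]. This is a sketch under that assumption, not part of the question's code (TypedConstraints, put, and get are my names):

```scala
abstract class Key[A]
class Constraint[-A] {
  def doSomething(a: A): String = ""
}
case class KeyConstraint[A](key: Key[A], constraint: Constraint[A])

object TypedConstraints {
  // Invariant: a Key[A] is only ever mapped to a KeyConstraint[A],
  // because `put` is the sole way entries are added.
  private var data: Map[Key[_], KeyConstraint[_]] = Map()

  def put[A](kc: KeyConstraint[A]): Unit =
    data += (kc.key -> kc)

  // The cast is the one place the Key/KeyConstraint type connection
  // is reasserted; it is safe by the invariant above.
  def get[A](key: Key[A]): Option[KeyConstraint[A]] =
    data.get(key).map(_.asInstanceOf[KeyConstraint[A]])
}

def useTheKeyConstraints[A](key: Key[A], value: A): String =
  TypedConstraints.get(key).fold("")(_.constraint.doSomething(value))
```

The existential never escapes: callers only see Option[KeyConstraint[A]], so doSomething type-checks without the compiler having to guess a broad type for the extractor.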