
using jmock in Scala with type parameterized function


I want to test a function that writes output from an RDD in Scala Spark.

Part of this test involves mocking a map on an RDD, using jMock:

val customerRdd = mockery.mock(classOf[RDD[Customer]], "rdd1")
val transformedRddToWrite = mockery.mock(classOf[RDD[TransformedCustomer]], "rdd2")

mockery.checking(new Expectations() {{
  // ...
  oneOf(customerRdd).map(
    `with`(Expectations.any(classOf[Customer => TransformedCustomer]))
  )
  will(Expectations.returnValue(transformedRddToWrite))
  // ...
}})

However, whenever I try to run this test, I get the following error: "not all parameters were given explicit matchers: either all parameters must be specified by matchers or all must be specified by values, you cannot mix matchers and values", despite the fact that I have specified matchers for all parameters to .map.

How do I fix this? Can jMock support matching on Scala functional arguments with implicit ClassTags?


Solution

  • I thought jMock had been abandoned since 2012, but if you like it, more power to you. One of the issues is that map requires a ClassTag[U], according to its signature:

    def map[U: ClassTag](f: T => U): RDD[U] where U is the return type of your function.

    If you want to make this work with a Java mocking framework, go under the assumption that after type erasure map's signature is effectively public <U> RDD<U> map(scala.Function1<T, U> f, scala.reflect.ClassTag<U> ct); so the mock sees two parameters, not one.
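    You can see this two-parameter erasure without Spark at all. The sketch below uses a hypothetical stand-in class (FakeRdd is my own name, not a Spark type) whose map is declared exactly like RDD's, and inspects it with Java reflection:

    ```scala
    import scala.reflect.ClassTag

    // Hypothetical stand-in for RDD: a Scala method declared as
    // map[U: ClassTag](f: T => U) erases to a Java method with TWO
    // parameters -- the Function1 and the implicit ClassTag.
    class FakeRdd[T](val data: Seq[T]) {
      def map[U: ClassTag](f: T => U): FakeRdd[U] = new FakeRdd(data.map(f))
    }

    object Demo {
      // Number of erased parameters on FakeRdd.map, via Java reflection.
      val mapParamCount: Int =
        classOf[FakeRdd[_]].getMethods.find(_.getName == "map").get.getParameterCount

      def main(args: Array[String]): Unit =
        println(mapParamCount) // prints 2: Function1 plus the ClassTag
    }
    ```

    This is exactly why jMock complains about mixed matchers: your expectation supplies a matcher for one parameter, while the implicit ClassTag arrives as a plain value.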

    Hope that works.
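    Concretely, a sketch of the fixed expectation (untested against Spark and jMock, so treat it as a starting point) would pass the ClassTag matcher through the second, implicit parameter list so that both parameters are specified by matchers:

    ```scala
    import scala.reflect.ClassTag
    import org.jmock.Expectations

    mockery.checking(new Expectations() {{
      oneOf(customerRdd).map(
        // matcher for the explicit Function1 parameter
        `with`(Expectations.any(classOf[Customer => TransformedCustomer]))
      )(
        // matcher for the implicit ClassTag parameter, passed explicitly
        `with`(Expectations.any(classOf[ClassTag[TransformedCustomer]]))
      )
      will(Expectations.returnValue(transformedRddToWrite))
    }})
    ```

    The key point is that every erased parameter, including the implicit one, gets its own with(...) matcher.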