My code compiles against Scala 2.12.10 but has to run on Scala 2.12.15. My code:
import scala.tools.nsc.Settings
val settings = new Settings
settings.usejavacp.value = false
The usejavacp call is throwing
java.lang.NoSuchMethodError: scala.tools.nsc.Settings.usejavacp()Lscala/tools/nsc/settings/AbsSettings$AbsSetting;
because Settings is a StandardScalaSettings, where the definition changed like so (only the relevant API is included):
2.12.10:
public interface scala.tools.nsc.settings.StandardScalaSettings {
  // other APIs
  public abstract scala.tools.nsc.settings.MutableSettings$BooleanSetting usejavacp();
}
to
2.12.15:
public interface scala.tools.nsc.settings.StandardScalaSettings {
  // other APIs
  public abstract scala.tools.nsc.settings.AbsSettings$AbsSetting usejavacp();
}
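For reference, the changed descriptor can be inspected at runtime with plain Java reflection (a minimal check; it only mirrors the dumps above and is not part of a fix):
val usejavacpReturnType = classOf[scala.tools.nsc.settings.StandardScalaSettings]
  .getMethod("usejavacp")   // looked up by name and parameter types; the return type plays no part in the lookup
  .getReturnType
println(usejavacpReturnType.getName)
// scala.tools.nsc.settings.MutableSettings$BooleanSetting on 2.12.10,
// scala.tools.nsc.settings.AbsSettings$AbsSetting on 2.12.15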
Is there any way I can make this work without upgrading my dependencies? Can I use reflection?
Yes, runtime reflection works in both Scala 2.12.10 and 2.12.15: the method is looked up by name at runtime, whereas your compiled call site was linked against the full JVM descriptor, return type included, which no longer matches.
You can replace
settings.usejavacp.value = false
with
import scala.reflect.runtime
import scala.reflect.runtime.universe._
import scala.tools.nsc.settings.MutableSettings

val rm = runtime.currentMirror
// look the method up by name at runtime instead of linking against the compile-time descriptor
val method = typeOf[MutableSettings].member(TermName("usejavacp")).asMethod
rm.reflect(settings).reflectMethod(method)().asInstanceOf[settings.BooleanSetting].value = false
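If you prefer to avoid scala-reflect, plain Java reflection should also do the job here, since only the method lookup is affected by the changed return type; a minimal sketch, assuming (as the cast above already does) that settings.BooleanSetting is unchanged between the two versions:
val setting = settings.getClass
  .getMethod("usejavacp")                  // resolved by name and parameter types at runtime, not by the compile-time descriptor
  .invoke(settings)
  .asInstanceOf[settings.BooleanSetting]
setting.value = false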
See also: Reflection to call method that had its name changed in an upgrade?
By the way, regarding your deleted question IncompatibleClassChangeError: org.apache.spark.sql.catalyst.plans.logical.LeafNode: you can try runtime compilation. It seems to work in both Spark 3.1.2 and 3.3.0.
// Spark 3.1.2
import scala.reflect.runtime.universe._
import scala.reflect.runtime
import scala.tools.reflect.ToolBox // libraryDependencies += scalaOrganization.value % "scala-compiler" % scalaVersion.value exclude("org.scala-lang.modules", "scala-xml_2.12")

object App {
  val rm = runtime.currentMirror
  val tb = rm.mkToolBox()

  val catalyst        = q"org.apache.spark.sql.catalyst"
  val Attribute       = tq"$catalyst.expressions.Attribute"
  val PredicateHelper = tq"$catalyst.expressions.PredicateHelper"
  val LeafNode        = tq"$catalyst.plans.logical.LeafNode"

  // the class is compiled by the toolbox at runtime, against the Spark version actually on the classpath
  val sym = tb.define(
    q"""
      case class MyClass() extends $LeafNode with $PredicateHelper {
        override def output: Seq[$Attribute] = Seq()
      }
    """.asInstanceOf[ClassDef]
  ).asClass
}
// Spark 3.3.0
import scala.reflect.runtime.universe._

object Main {
  def main(args: Array[String]): Unit = {
    println(App.tb.eval(q"new ${App.sym}()")) // prints: MyClass()
  }
}
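To double-check that the runtime-compiled MyClass really is a LeafNode of whatever Spark version is on the runtime classpath, you can look LeafNode up by name so that nothing about it is linked at compile time (a sketch; Check is just a hypothetical extra entry point):
import scala.reflect.runtime.universe._

object Check {
  def main(args: Array[String]): Unit = {
    val instance = App.tb.eval(q"new ${App.sym}()")
    // resolved by name at runtime, so no compile-time reference to LeafNode is needed
    val leafNodeClass = Class.forName("org.apache.spark.sql.catalyst.plans.logical.LeafNode")
    println(leafNodeClass.isAssignableFrom(instance.getClass)) // true
  }
}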