I'm following the documentation example "Example: Estimator, Transformer, and Param" from the Spark ML guide, and I get this error message:
15/09/23 11:46:51 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
    at SimpleApp$.main(hw.scala:75)
Line 75 is the sqlContext.createDataFrame() call:
import java.util.Random
import org.apache.log4j.Logger
import org.apache.log4j.Level
import scala.io.Source
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd._
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.mllib.recommendation.{ALS, Rating, MatrixFactorizationModel}
import org.apache.spark.sql.Row
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._
object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    val training = sqlContext.createDataFrame(Seq(
      (1.0, Vectors.dense(0.0, 1.1, 0.1)),
      (0.0, Vectors.dense(2.0, 1.0, -1.0)),
      (0.0, Vectors.dense(2.0, 1.3, 1.0)),
      (1.0, Vectors.dense(0.0, 1.2, -0.5))
    )).toDF("label", "features")
  }
}
And my build.sbt is as below:
lazy val root = (project in file(".")).
  settings(
    name := "hello",
    version := "1.0",
    scalaVersion := "2.11.4"
  )

libraryDependencies ++= {
  Seq(
    "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
    "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
    "org.apache.spark" % "spark-hive_2.11" % "1.4.1",
    "org.apache.spark" % "spark-mllib_2.11" % "1.4.1" % "provided",
    "org.apache.spark" %% "spark-streaming" % "1.4.1" % "provided",
    "org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.4.1" % "provided"
  )
}
I searched around and found this post, which is very similar to my issue, and I tried changing the Spark versions in my sbt settings (spark-mllib_2.11 to _2.10, and Spark 1.4.1 to 1.5.0), but that only produced more dependency conflicts.
My intuition is that it's some kind of version mismatch, but I can't figure it out myself. Could anyone please help? Thanks a lot.
It's working for me now. Just for the record (referencing @MartinSenne's answer), what I did is as below.
Note:
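The NoSuchMethodError on scala.reflect.api.JavaUniverse.runtimeMirror is the classic symptom of a Scala binary-version mismatch: here the project is built with Scala 2.11.4, while the prebuilt Spark 1.4.1 binaries are typically compiled against Scala 2.10. Assuming that prebuilt 2.10 distribution is what runs the job, a build.sbt aligned to Scala 2.10 would look roughly like this (same modules as above, all resolved through %% so sbt picks the matching _2.10 artifacts):

lazy val root = (project in file(".")).
  settings(
    name := "hello",
    version := "1.0",
    // must match the Scala version the Spark binaries were built against
    scalaVersion := "2.10.4"
  )

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-sql"       % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-hive"      % "1.4.1",
  "org.apache.spark" %% "spark-mllib"     % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.4.1" % "provided"
)

After changing the Scala version, run sbt clean before rebuilding (sbt package) so no classes compiled against 2.11 are left over, then resubmit the jar.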