The SparkContext in SparkR (v1.5.1) is a "Java ref type" of class org.apache.spark.api.java.JavaSparkContext.
However, when instantiating my class with:

.jnew("com.example.MyClass", "sc")

where the Scala class is declared as class TableReader(sc: JavaSparkContext), I get a java.lang.NoSuchMethodError.
What is this "Java ref type", and how can I extract the actual context from it to pass through rJava?
SparkR seems to implement its own Java interoperability layer in backend.R. Calls take the form SparkR:::newJObject(className, args), though I can't find any specific documentation for it other than the tests in the same project.
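As a rough sketch, constructing your class through SparkR's backend (rather than rJava) might look like the following. This is a hedged example: newJObject and callJMethod are unexported internals of SparkR 1.5.x, com.example.MyClass is the class from the question, and "read" is a hypothetical method name.

```r
library(SparkR)

# sc here is the "Java ref type" handle returned by SparkR itself,
# already wrapping a JavaSparkContext on the JVM side.
sc <- sparkR.init(master = "local[*]")

# Construct the JVM object through SparkR's own backend, passing the
# Java ref directly instead of the string "sc" (which is what makes
# rJava's .jnew look for a String constructor and fail).
reader <- SparkR:::newJObject("com.example.MyClass", sc)

# Methods on the resulting Java ref are invoked the same way;
# "read" is a hypothetical method on the example class.
# result <- SparkR:::callJMethod(reader, "read")
```

Because these functions are internal (hence the ::: access), they can change between Spark releases without notice.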
The sqlContext needs to be initialized, and the relevant jars must be loaded at startup using --jars {csv jars} or --packages, as noted in the documentation.
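For example, a launch command along these lines loads the CSV support at startup (a sketch only; the spark-csv coordinates and version are an assumption for the Spark 1.5.x era):

```shell
# Resolve and load the package from Maven coordinates at startup
# (com.databricks:spark-csv_2.10:1.2.0 is an assumed example version).
./bin/sparkR --packages com.databricks:spark-csv_2.10:1.2.0

# Alternatively, point --jars at locally available jar files:
# ./bin/sparkR --jars /path/to/spark-csv.jar
```

Jars loaded this way are visible to the JVM backend, so classes in them can then be constructed via SparkR:::newJObject.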