Tags: scala, apache-spark-sql, scala-ide

Scala IDE Calls Spark but the Executor Does Not Start


I have a problem with Scala IDE. The code below

val conf = new SparkConf().setAppName("xxx").setMaster("local[*]")

runs fine in Scala IDE, but

val conf = new SparkConf().setAppName("xxx").setMaster("spark://ipOfMyPC:7077")

does not work. The error message is:

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

I have checked with spark-shell: its web UI uses port 4040 and works fine. That is how I noticed that the executor does not start for the IDE job.

When run from Scala IDE, the SparkUI automatically uses port 4041, and there I found that no executor starts; only the driver exists. I have tried the code below, but it does not work either:

val conf = new SparkConf()
  .setAppName("xxx")
  .set("spark.executor.instances", "2")
  .set("spark.executor.memory", "1g")
  .setMaster("spark://ipOfMyPC:7077")
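For completeness, here is the whole test program I run from the IDE, as a minimal sketch; the object name SparkIdeTest and the small count job are just illustrative. Note that, as far as I know, spark.executor.instances is only honored on YARN, so the standalone master likely ignores it, which is why it is omitted here:

import org.apache.spark.{SparkConf, SparkContext}

object SparkIdeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("xxx")
      .set("spark.executor.memory", "1g")
      .setMaster("spark://ipOfMyPC:7077")
    val sc = new SparkContext(conf)
    try {
      // Any small action would do; this is where the "not accepted any
      // resources" warning appears when no executor registers.
      println(sc.parallelize(1 to 100).count())
    } finally {
      sc.stop()
    }
  }
}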

How can I solve this issue in Scala IDE?

My platform is Windows 8.1 and the firewall is disabled. Thank you very much.


Solution

  • Although the SparkUI launched from Scala IDE automatically uses port 4041 instead of 4040, the port itself is not the problem. After stopping spark-shell (which holds port 4040), the Scala IDE job runs successfully, presumably because the shell had been holding the cluster's resources (see the sketch below).
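A possible explanation, for the record: in standalone mode an application claims every available worker core by default (spark.cores.max is effectively unbounded), so while spark-shell is attached, a second application gets no executors at all. If both applications need to run at once, a sketch of an alternative, assuming the worker has enough cores and memory for both, is to cap each application's share:

val conf = new SparkConf()
  .setAppName("xxx")
  .set("spark.cores.max", "2")        // cap this app's total cores so others can run
  .set("spark.executor.memory", "1g")
  .setMaster("spark://ipOfMyPC:7077")

The same cap can be applied to spark-shell by starting it with the --total-executor-cores option.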