
Minimal PySpark in AWS EMR fails to create a spark context


I am trying to use PySpark on a fresh AWS EMR Spark cluster, and it fails with the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
    at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:151)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 20 more

What I am doing:

  1. On the EMR console, create a new cluster (emr-5.12.1 with Spark, m4.large, 2 instances)
  2. SSH into its master node
  3. Create a new virtualenv with Python 3, install pyspark, and create a context:

    export PROJECT=example
    python3 -m venv ~/.virtualenvs/$PROJECT
    source ~/.virtualenvs/$PROJECT/bin/activate
    pip install pyspark
    export PYSPARK_PYTHON=/home/hadoop/.virtualenvs/$PROJECT/bin/python
    export YARN_CONF_DIR=/etc/hadoop/conf
    python -c "import pyspark; conf = pyspark.SparkConf().setMaster('yarn-client').setAppName('testing'); sc = pyspark.SparkContext(conf=conf)"
    

which raises the exception above.

What I also tried:

  1. Using Python 2 instead (virtualenv ~/.virtualenvs/$PROJECT)
  2. Using the default Python installation, without creating any virtualenv.

The exception is closely related to this issue in Spark, but that one was caused by an old Hadoop version (2.6, 2.7), which is not the case for emr-5.12.1, which uses Hadoop 2.8.

I reproduced the same setup on Google Cloud Platform, and it works there.

Note that if I call pyspark from the shell, it works. However, the master is set to local:

 pyspark
 >>> sc
 <SparkContext master=local[*] appName=PySparkShell>

which is useless for distributed jobs.
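For an interactive shell against YARN, the master can also be set explicitly when launching the EMR-provided pyspark (a sketch assuming it is on the PATH of the master node; run on the cluster itself):

```shell
# Launch the pyspark shell against YARN instead of local mode
# (only meaningful on the EMR master node, where YARN config is available).
pyspark --master yarn
```

This avoids the local[*] default, though it does not help when the context must be created programmatically, as in the script above.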


Solution

  • The solution is to also export SPARK_HOME; i.e., in step 3, use:

    export SPARK_HOME=/usr/lib/spark/
    export PYSPARK_PYTHON=/home/hadoop/.virtualenvs/$PROJECT/bin/python
    export YARN_CONF_DIR=/etc/hadoop/conf
    python -c "import pyspark; conf = pyspark.SparkConf().setMaster('yarn-client').setAppName('testing'); sc = pyspark.SparkContext(conf=conf)"
    

    This addresses the problem in both Python 2 and 3.
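A plausible explanation, based on how pip-installed pyspark locates a Spark distribution: without SPARK_HOME, the launcher falls back to the jars bundled with the pip package, which do not ship the com.sun.jersey client classes that YARN's TimelineClient loads; with SPARK_HOME pointing at EMR's Spark installation, the cluster's own jars (which include them) are used instead. A minimal Python sketch of that lookup logic (jar_source is a hypothetical helper for illustration, not pyspark's actual code):

```python
import os

def jar_source(env):
    """Sketch of the SPARK_HOME effect (illustrative, not pyspark's real code):
    if SPARK_HOME is set, jars come from the cluster's Spark installation;
    otherwise the pip package falls back to its own bundled jars, which
    lack the jersey classes that YARN needs."""
    home = env.get("SPARK_HOME")
    if home:
        return os.path.join(home, "jars")
    return "bundled pip jars"  # missing com.sun.jersey on the YARN code path

print(jar_source({"SPARK_HOME": "/usr/lib/spark/"}))  # /usr/lib/spark/jars
print(jar_source({}))                                 # bundled pip jars
```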