Tags: hadoop, apache-spark, version

How to use two versions of spark-shell?


I have Spark 1.6.2 and Spark 2.0 installed on my Hortonworks cluster.

Both versions are installed on one node of the 5-node Hadoop cluster.

Each time I start the spark-shell I get:

$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default

When I check the version I get:

scala> sc.version
res0: String = 1.6.2

How can I start the other version (the spark-shell for Spark 2.0)?


Solution

  • export SPARK_MAJOR_VERSION=2 
    

    You just need to set SPARK_MAJOR_VERSION to the major version you want, 2 or 1.

    $ export SPARK_MAJOR_VERSION=2
    $ spark-submit --version
    SPARK_MAJOR_VERSION is set to 2, using Spark2
    Welcome to
       ____              __
      / __/__  ___ _____/ /__
     _\ \/ _ \/ _ `/ __/  '_/
    /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.0.0-1245
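
    If you only need Spark 2 for a single session, a minimal alternative (assuming the HDP wrapper scripts read the variable from the environment, as the message above suggests) is to set it just for that one command and confirm the version inside the shell:

        # Set SPARK_MAJOR_VERSION for this invocation only, without exporting it
        $ SPARK_MAJOR_VERSION=2 spark-shell
        SPARK_MAJOR_VERSION is set to 2, using Spark2
        ...
        scala> sc.version   // should now report a 2.0.x version string

    To make Spark 2 the default for every new session, you can add the export line to your ~/.bashrc (or your shell's equivalent startup file).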