scala | apache-spark | sbt | spark-shell

Run spark-shell from sbt


The default way of getting spark-shell seems to be to download the distribution from the website. Yet this Spark issue mentions that it can be installed via sbt, and I could not find any documentation on this. In an sbt project that uses spark-sql and spark-core, no spark-shell binary was found.

How do you run spark-shell from sbt?
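
For reference, a minimal build.sbt for such a project might look like the following; the project name, Scala version, and Spark version are illustrative assumptions, not details taken from the question:

    // build.sbt -- hypothetical minimal project depending on spark-core and spark-sql
    name := "spark-sbt-demo"

    scalaVersion := "2.11.12"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.4.8",
      "org.apache.spark" %% "spark-sql"  % "2.4.8"
    )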


Solution

  • From the following URL:

    https://bzhangusc.wordpress.com/2015/11/20/use-sbt-console-as-spark-shell/

If you are already using sbt for your project, it's very simple to set up the sbt console to replace the spark-shell command. Let's start from the basic case. Once the project is set up with sbt, you can simply start the console by running sbt console.

Within the console, you just need to initialize a SparkContext and a SQLContext to make it behave like spark-shell:

    scala> val sc = new org.apache.spark.SparkContext("local[*]", "shell")
    scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
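
To avoid typing those two lines on every start, one sketch of the same idea is to have sbt run them for you through the initialCommands setting; the app name "sbt-console" and the local[*] master below are assumptions for a purely local setup:

    // build.sbt -- run this snippet automatically whenever `sbt console` starts
    initialCommands in console := """
      val sc = new org.apache.spark.SparkContext("local[*]", "sbt-console")
      val sqlContext = new org.apache.spark.sql.SQLContext(sc)
      import sqlContext.implicits._
    """

    // stop the SparkContext cleanly when the console exits
    cleanupCommands in console := "sc.stop()"

With this in place, sbt console drops you straight into a REPL with sc and sqlContext already defined, which is essentially what spark-shell gives you.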