1.2.0-bin-hadoop2.4 and my Scala version is 2.11.7. I am getting an error, so I can't use sbt:
~/sparksample$ sbt
Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)
> compile
[info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn]   /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11.7/1.2.0/spark-core_2.11.7-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[error] Total time: 2 s, completed Oct 15, 2015 11:30:47 AM
Any suggestions? Thanks.
There is no spark-core_2.11.7 artifact. You have to drop the maintenance version number (.7) from the Spark dependency, because Spark artifacts are published against the Scala binary version: spark-core_2.11 exists, and all Scala 2.11.x releases are binary compatible with it.
A minimal build.sbt could look like this:
name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
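The %% operator tells sbt to append the project's Scala binary version (here _2.11) to the artifact name, so you never hard-code the maintenance version yourself. As a minimal sketch of the equivalence, assuming Spark 1.5.1:

// These two lines resolve the same artifact, spark-core_2.11-1.5.1.jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.1"

After updating build.sbt, run sbt compile again and the dependency should resolve from Maven Central.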