Simple code following the official doc:
public static void main(String[] args) throws Exception {
    SparkConf conf = new SparkConf().setAppName("MyApp")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
    JavaSparkContext sc = new JavaSparkContext(conf);
    Configuration cfg = HBaseConfiguration.create();
    // cfg.set("hbase.zookeeper.quorum", "localhost");
    JavaHBaseContext hc = new JavaHBaseContext(sc, cfg);
    JavaRDD<String> rdd = sc.parallelize(Arrays.asList("Tom", "Jerry"));
    System.out.println(rdd.collect());
}
And the dependency in my Maven pom:
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>2.0.0-alpha-1</version>
</dependency>
I get an error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
How can I fix that?
org.apache.spark.Logging exists in Spark 1.5.2 and earlier, but it was removed in later releases. The hbase-spark 2.0.0-alpha-1 artifact still references that class, so when you run against a newer Spark the class is missing at runtime, which is why you are getting this error.
Try putting spark-core_2.11-1.5.2.logging.jar on your application's classpath (for example, under your project's jar directory) and rerun your application.
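If you would rather have Maven manage that jar instead of copying it around by hand, one option is a system-scoped dependency. This is only a sketch: the `lib/` path and the groupId/artifactId/version coordinates below are assumptions about where you saved the jar, not coordinates published on Maven Central.

```xml
<!-- Hypothetical coordinates; systemPath assumes you downloaded the jar into lib/ -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core-logging</artifactId>
    <version>1.5.2</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/lib/spark-core_2.11-1.5.2.logging.jar</systemPath>
</dependency>
```

Alternatively, if you launch with spark-submit, passing the jar via its `--jars` option puts it on both the driver and executor classpaths without touching the pom.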