java, eclipse, apache-spark, cassandra, spark-cassandra-connector

Need some help setting up Spark for Cassandra in Java


Setting up Spark to access Cassandra from Java throws a NoClassDefFoundError:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(Unknown Source)
    at java.security.SecureClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.access$100(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at Client.main(Client.java:22)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 13 more

Two JAR files are on the build path: spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar and spark-core_2.10-0.9.0-incubating.jar. The connector assembly is built against Scala 2.10, while typing scala -version at the command prompt reports Scala code runner version 2.11.6. Accessing Spark from spark-shell works without issue; even reading a Cassandra column family from spark-shell works fine.
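
Note that scala -version reports the Scala installation on the system PATH, which is unrelated to the JARs on the Eclipse project's build path. As a quick diagnostic, a plain-JDK sketch like the following (ScalaClasspathCheck is a hypothetical helper class) shows whether the application classpath can resolve scala.Cloneable at all, and from which JAR:

public class ScalaClasspathCheck {
    public static void main(String[] args) {
        try {
            // scala.Cloneable is the class the stack trace failed to resolve.
            Class<?> c = Class.forName("scala.Cloneable");
            // The CodeSource reveals which JAR the class was actually loaded from.
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            System.out.println("scala.Cloneable loaded from: "
                    + (src != null ? src.getLocation() : "bootstrap classpath"));
        } catch (ClassNotFoundException e) {
            System.out.println("scala-library is not on the application classpath");
        }
    }
}

If this prints the "not on the application classpath" branch, the missing scala-library JAR, not the Spark or connector JARs, is the problem.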

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import com.datastax.spark.connector.*;
import com.datastax.spark.connector.cql.*;
import com.datastax.spark.*;

public class Client {
    public static void main(String[] a)
    {
        // Client.java:22 in the stack trace is this line: constructing SparkConf
        // pulls in Scala runtime classes such as scala.Cloneable.
        // Note: setMaster expects a master URL ("local[*]", "spark://host:7077"),
        // not a bare IP address.
        SparkConf conf = new SparkConf().setAppName("MTMPNLTesting").setMaster("192.168.1.15");
    }
}

What might be the reason for this error?


Solution

  • Also include the Scala library JAR (scala-library) on your classpath, matching the Scala version your dependencies were built against (here 2.10, not the installed 2.11.6). If you do not use Maven, download the JAR and add it in the project's build path properties.
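
With a matching scala-library 2.10 JAR on the classpath, a minimal client along the following lines should get past the class-loading error. This is a sketch, not the poster's exact setup: the master URL spark://192.168.1.15:7077 and the keyspace/table names "test"/"kv" are assumptions to adjust for your cluster, and connector 1.4.x generally expects a matching Spark 1.4.x core rather than the 0.9.0-incubating JAR listed above.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import com.datastax.spark.connector.japi.CassandraRow;
import com.datastax.spark.connector.japi.rdd.CassandraJavaRDD;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

public class Client {
    public static void main(String[] args) {
        // setMaster takes a master URL, not a bare IP address.
        SparkConf conf = new SparkConf()
                .setAppName("MTMPNLTesting")
                .setMaster("spark://192.168.1.15:7077")          // assumed master URL
                .set("spark.cassandra.connection.host", "192.168.1.15");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read a Cassandra table through the connector's Java API;
        // "test" and "kv" are placeholder keyspace/table names.
        CassandraJavaRDD<CassandraRow> rows = javaFunctions(sc).cassandraTable("test", "kv");
        System.out.println("row count: " + rows.count());

        sc.stop();
    }
}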