Tags: java, apache-spark, guava, datastax-enterprise, spring-data-cassandra

Detected Guava issue #1635 when using Spark and Cassandra Java Driver


I am using spring-data-cassandra 1.5.1 (which uses the Cassandra Java driver 3.x) in our Spark application. When running the spark-submit command, I get the error below.

Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use.  This introduces codec resolution issues and potentially other incompatibility issues in the driver.  Please upgrade to Guava 16.01 or later.
    at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
    at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
    at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:68)
    ... 71 more

It seems the Cassandra driver requires Guava 16.0.1 or later and is failing because it found an older version. I made sure that the Spark uber jar I built contains only Guava 19.0, but I still get the same error when I execute spark-submit.

After further analysis, I found that spark-2.0.1-bin-hadoop2.7/jars contains Guava v14.0.1, and this is the version that gets loaded when I execute spark-submit, taking precedence over Guava v19.0 in the Spark application jar.
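One way to confirm which Guava jar is actually winning at runtime is to print the code source of a Guava class from inside the job. This is just a diagnostic sketch (the class name `GuavaLocator` is my own, not from the application):

```java
// Prints the jar that Guava was loaded from, so you can tell whether
// Spark's bundled Guava 14.x or the application's Guava 19.0 is in use.
public class GuavaLocator {
    public static void main(String[] args) {
        try {
            Class<?> guavaClass = Class.forName("com.google.common.base.Charsets");
            java.net.URL location = guavaClass.getProtectionDomain()
                                              .getCodeSource()
                                              .getLocation();
            System.out.println("Guava loaded from: " + location);
        } catch (ClassNotFoundException e) {
            System.out.println("Guava not on classpath");
        }
    }
}
```

Running this inside the Spark job (e.g. from the driver code before creating the Cluster) shows the path of the offending jar.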

Then I replaced v14.0.1 with v19.0 in spark-2.0.1-bin-hadoop2.7/jars, and the application now runs without any error. But I don't think this is a good approach and do not want to do it in production.

If I run the same Spark job in Eclipse (by setting the conf master=local in code and running it as a Java program), it works fine.

I found similar issues on SO but did not find a resolution. Let me know if anyone has faced the same issue and has a resolution for it.

Using DataStax Enterprise Cassandra 5.x

Thank You!!!


Solution

  • Spark 2.0.1 ships with a Guava 14.x jar, while cassandra-java-driver requires Guava 16.0.1 or later. When we submit the Spark job using spark-submit, the Guava version bundled with Spark takes precedence over the one in our Spark application jar, which results in the error in question. The issue is resolved by overriding Spark's Guava 14.x jar with the guava-19.0 jar:

    1. Override Spark's Guava 14.x jar by passing the configuration below in the spark-submit command:

           --conf spark.driver.extraClassPath=/path/to/guava-19.0.jar --conf spark.executor.extraClassPath=/path/to/guava-19.0.jar

    2. Make sure our Spark application jar does not contain any Guava dependency with a version below 16.0.1 (exclude transitive dependencies as well), or include the latest version in pom.xml so that that version is packaged into the final jar/war.
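As an illustration of step 2, a transitive Guava dependency can be excluded and a driver-compatible version pinned in pom.xml. This is a sketch: the artifact that actually pulls in the old Guava may differ in your project, and the spring-data-cassandra version suffix is an assumption based on the version mentioned in the question.

```xml
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-cassandra</artifactId>
    <version>1.5.1.RELEASE</version>
    <exclusions>
        <!-- Drop any transitive Guava so only the version we declare is packaged -->
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- Pin Guava explicitly to a version the driver accepts (16.0.1 or later) -->
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>19.0</version>
</dependency>
```

Check the result with `mvn dependency:tree` to confirm that no Guava below 16.0.1 remains in the final jar.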