maven, tomcat, apache-spark, hbase, resteasy

Spark application throws java.lang.NoSuchMethodError: javax.ws.rs.core.Response.readEntity(Ljava/lang/Class;)Ljava/lang/Object


I have a Java application that uses Spark and HBase. We need to hit a URL deployed in Tomcat (Jersey), so we used the RESTEasy client to do that.

When I execute the code as a standalone Java program to hit the URL using the RESTEasy client, it works fine.

However, when I use the same code in another application that uses Spark for some processing, it throws the error shown in the title. I am using Maven as the build tool in Eclipse. After building, I create a runnable JAR and select the option "extract required libraries into generated jar". To execute the application I use the command:

nohup spark-submit --master yarn-client myWork.jar myProperties 0 &

The dependency for the RESTEasy client code:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.jboss.resteasy</groupId>
    <artifactId>resteasy-client</artifactId>
    <version>3.0.11.Final</version>
  </dependency>
</dependencies>

What I am unable to figure out is why there is no error at compile time, yet at runtime, although the JAR has every library packed in (including those of Spark and HBase), it throws an error saying there is no such method. Please help.
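For reference, a minimal sketch of the kind of client call involved (the URL and class name are placeholders, not taken from the question). Response.readEntity(Class) only exists in the JAX-RS 2.0 API, which RESTEasy 3.x implements, so code like this compiles against the 2.0 API but fails with NoSuchMethodError if an older javax.ws.rs jar is found first at runtime:

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

public class RestCall {
    public static void main(String[] args) {
        // Standard JAX-RS 2.0 client API, implemented by RESTEasy 3.x
        Client client = ClientBuilder.newClient();
        Response response = client.target("http://localhost:8080/myapp/resource") // placeholder URL
                .request()
                .get();
        // readEntity(Class) is a JAX-RS 2.0 method; this is the call that fails
        // when an older javax.ws.rs API wins on the runtime classpath
        String body = response.readEntity(String.class);
        System.out.println(body);
        response.close();
        client.close();
    }
}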


Solution

  • I have tried changing the version of resteasy-client, but it didn't help. At compile time I can see the class; how come it is missing at runtime?

    Possible reasons could be:

    1) If you are using Maven, the scope might be provided, so your JAR won't be copied into your distribution.

    This is ruled out by the configuration you have mentioned above.

    2) You are not pointing to the correct location from your execution script (maybe a shell script).

    3) You are not passing this JAR with the --jars option, or via --driver-class-path / spark.executor.extraClassPath, etc.

    I suspect the issue is due to the second or third reason (see the example command after the link below).

    Also have a look at https://spark.apache.org/docs/1.4.1/submitting-applications.html
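    For instance (a sketch only; the jar names and paths are placeholders, not taken from the question), the JAX-RS/RESTEasy jars could be shipped explicitly instead of relying only on the uber-jar:

    spark-submit --master yarn-client \
      --jars libs/resteasy-client-3.0.11.Final.jar,libs/javax.ws.rs-api-2.0.1.jar \
      --driver-class-path libs/resteasy-client-3.0.11.Final.jar:libs/javax.ws.rs-api-2.0.1.jar \
      myWork.jar myProperties 0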

    EDIT:

    Question: spark-submit --conf spark.driver.extraClassPath=surfer/javax.ws.rs-api-2.0.1.jar:surfer/jersey-client-2.25.jar:surfer/jersey-common-2.25.jar:surfer/hk2-api-2.5.0-b30.jar:surfer/jersey-guava-2.25.jar:surfer/hk2-utils-2.5.0-b30.jar:surfer/hk2-locator-2.5.0-b30.jar:surfer/javax.annotation-api-1.2.jar artifact.jar againHere.csv

    Now it throws a different exception: Exception in thread "main" java.lang.AbstractMethodError: javax.ws.rs.core.UriBuilder.uri(Ljava/lang/String;)Ljavax/ws/rs/core/UriBuilder;. I have also searched for the class Response$Status$Family somewhere on the classpath other than what I am supplying. I used the command grep Response$Status$Family.class /opt/mapr/spark/spark-1.4.1/lib/*.jar and found that Spark also has this class. Maybe this is the issue, but I don't know how to forcefully tell the JVM to use the class supplied by me at runtime and not Spark's. Can you help?
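    One way to confirm which copy of the class actually wins at runtime (a small diagnostic sketch, not part of the original exchange) is to ask the loaded class for its code source from inside the application:

    import javax.ws.rs.core.Response;

    public class WhichJar {
        public static void main(String[] args) {
            // Prints the jar each class was actually loaded from, e.g. the
            // Spark assembly jar instead of the javax.ws.rs-api jar you supplied
            System.out.println(Response.Status.Family.class
                    .getProtectionDomain().getCodeSource().getLocation());
            System.out.println(Response.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }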

    Since you provided the external jar on the classpath:

    You can use the options below to tell the framework to use the external jar provided by you. This can be done in two ways:

    1. through spark-submit
    2. conf.set(...)

    Since you are using Spark 1.4.1, see these configuration options (a sketch of both ways follows below):

    spark.executor.userClassPathFirst (default: false): (Experimental) Same functionality as spark.driver.userClassPathFirst, but applied to executor instances.

    spark.driver.userClassPathFirst (default: false): (Experimental) Whether to give user-added jars precedence over Spark's own jars when loading classes in the driver. This can be used to mitigate conflicts between Spark's dependencies and user dependencies. It is currently an experimental feature and is used in cluster mode only.
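    A short sketch of both approaches (the flag values and app name are illustrative; per the description above, the driver setting only takes effect in cluster mode on 1.4.1). Via spark-submit:

    spark-submit --master yarn-client \
      --conf spark.executor.userClassPathFirst=true \
      --conf spark.driver.userClassPathFirst=true \
      myWork.jar myProperties 0

    Or via SparkConf before the context is created:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MyWork {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("myWork")  // illustrative app name
                    .set("spark.executor.userClassPathFirst", "true")
                    // the driver setting only takes effect in cluster mode on 1.4.1
                    .set("spark.driver.userClassPathFirst", "true");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... existing Spark/HBase processing ...
            sc.stop();
        }
    }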