
NoSuchMethodError while reading from Google Cloud Storage from Dataproc using Java


This is my method for reading a file from Cloud Storage:

public static String getStringObject(String bucketName, String fileName) throws Exception {
    BlobId blobId = BlobId.of(bucketName, fileName);
    byte[] content = storage.readAllBytes(blobId);
    String contentString = new String(content, UTF_8);
    return contentString;
}
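
For context, `storage` is assumed to be a com.google.cloud.storage.Storage client initialized elsewhere, along these lines:

import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

// Uses application default credentials; on Dataproc these come
// from the cluster VM's service account.
private static final Storage storage =
        StorageOptions.getDefaultInstance().getService();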

When I call this method from my dev environment to read a file from the bucket, it works fine. But when I call it from a Dataproc cluster while running a Spark job, it throws the following error:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
    at com.google.api.gax.retrying.BasicRetryingFuture.<init>(BasicRetryingFuture.java:77)
    at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:75)
    at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:53)
    at com.google.cloud.storage.StorageImpl.readAllBytes(StorageImpl.java:460)

Here are the relevant parts of my Maven pom.xml:

<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>19.0</version>
</dependency>
<dependency>
    <groupId>com.google.api-client</groupId>
    <artifactId>google-api-client</artifactId>
    <version>1.22.0</version>
</dependency>

What am I doing wrong here?


Solution

  • When running on Dataproc, Hadoop libraries are typically on the client application's classpath. This is what lets your job interact with HDFS and GCS, but it also means the classpath carries Hadoop's own, older copy of Guava, which predates MoreExecutors.directExecutor() (added in Guava 18.0). At runtime that copy wins over the 19.0 version you declared in your pom, hence the NoSuchMethodError.

    You can work around this by relocating (shading) Guava inside your job jar with the Maven Shade plugin, as described in this SO answer.
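
    As a minimal sketch, a shade configuration along these lines relocates Guava in the fat jar; the repackaged.com.google.common prefix is an arbitrary choice, anything not otherwise on the classpath works:

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.1.0</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <relocations>
                                <!-- Rewrite Guava's package inside the shaded jar so the job
                                     uses its bundled 19.0 classes instead of Hadoop's copy. -->
                                <relocation>
                                    <pattern>com.google.common</pattern>
                                    <shadedPattern>repackaged.com.google.common</shadedPattern>
                                </relocation>
                            </relocations>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    After mvn package, submit the shaded jar to Dataproc: bytecode references to com.google.common in your classes and in google-cloud-storage are rewritten to the relocated prefix, so Hadoop's Guava no longer shadows them.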