I have a Maven Java project that iterates over the files in a directory and uploads them to a Google Cloud Storage bucket. The code is similar to the following:
public void uploadFiles(String projectId, String bucketName, Path sourceDirectory) throws IOException {
    var storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    try (var files = java.nio.file.Files.walk(sourceDirectory)) {
        files.forEach(file -> {
            var blobId = BlobId.of(bucketName, getBlobPath(file)); // getBlobPath gets the relative path for the file in the bucket
            var blobInfo = BlobInfo.newBuilder(blobId).build();
            try {
                storage.createFrom(blobInfo, file, BlobWriteOption.detectContentType());
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
    }
}
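For context, getBlobPath is just a small helper that turns a file's path into the object name used in the bucket. A minimal sketch of what it might look like (the two-argument form and names here are my illustration; the real helper captures the source directory some other way):

```java
import java.nio.file.Path;

public class BlobPaths {
    // Hypothetical helper: derive the object name from the file's path
    // relative to the source directory, using forward slashes as object names expect.
    static String getBlobPath(Path sourceDirectory, Path file) {
        return sourceDirectory.relativize(file).toString().replace('\\', '/');
    }

    public static void main(String[] args) {
        Path root = Path.of("/data/uploads");
        Path file = Path.of("/data/uploads/images/cat.png");
        System.out.println(getBlobPath(root, file)); // images/cat.png
    }
}
```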
Here is my pom:
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>libraries-bom</artifactId>
            <version>26.19.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<!-- All Google libraries I am using in various parts of the project -->
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage</artifactId>
</dependency>
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
</dependency>
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-nio</artifactId>
</dependency>
This works fine, but if I update the libraries to the latest version (26.22.0), or to any higher version at all (26.20.0), the upload fails with the following error (I have removed the URL from the error message):
com.google.cloud.storage.StorageException: Client side data loss detected. Attempt to append to a resumable session with an offset higher than the backend has
|> PUT https://storage.googleapis.com/upload/storage/v1/b/......
|> content-range: bytes 0--1/0
|
|< HTTP/1.1 400 Bad Request
|< content-length: 37
|< content-type: text/plain; charset=utf-8
|< x-guploader-uploadid:
|<
|< Failed to parse Content-Range header.
|
at com.google.cloud.storage.JsonResumableSessionFailureScenario.toStorageException(JsonResumableSessionFailureScenario.java:185)
at com.google.cloud.storage.JsonResumableSessionFailureScenario.toStorageException(JsonResumableSessionFailureScenario.java:117)
at com.google.cloud.storage.JsonResumableSessionPutTask.call(JsonResumableSessionPutTask.java:204)
at com.google.cloud.storage.JsonResumableSession.lambda$put$0(JsonResumableSession.java:81)
at com.google.cloud.storage.Retrying.lambda$run$0(Retrying.java:102)
at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
at com.google.cloud.RetryHelper.run(RetryHelper.java:76)
at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
at com.google.cloud.storage.Retrying.run(Retrying.java:99)
at com.google.cloud.storage.JsonResumableSession.put(JsonResumableSession.java:68)
at com.google.cloud.storage.StorageImpl.createFrom(StorageImpl.java:258)
at com.google.cloud.storage.StorageImpl.createFrom(StorageImpl.java:224)
I could not find any explanation, or anyone having a similar issue, after searching. Any help is appreciated. Thanks.
Update: I was able to get it working by passing an InputStream to createFrom instead of the Path to the source file.
public void uploadFiles(String projectId, String bucketName, Path sourceDirectory) throws IOException {
    var storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    try (var files = java.nio.file.Files.walk(sourceDirectory)) {
        files.forEach(file -> {
            var blobId = BlobId.of(bucketName, getBlobPath(file)); // getBlobPath gets the relative path for the file in the bucket
            var blobInfo = BlobInfo.newBuilder(blobId).build();
            try {
                storage.createFrom(blobInfo, new BufferedInputStream(Files.newInputStream(file)), BlobWriteOption.detectContentType());
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
    }
}
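One side note unrelated to the version issue: Files.walk also emits the source directory itself and any subdirectories, not just regular files, so it is worth filtering the stream before uploading. A small self-contained sketch of that filtering (the temp-directory setup here is only for illustration):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class WalkFilter {
    public static void main(String[] args) throws Exception {
        // Build a tiny directory tree: a.txt and sub/b.txt under a temp root.
        Path tmp = Files.createTempDirectory("walk-demo");
        Files.createDirectories(tmp.resolve("sub"));
        Files.writeString(tmp.resolve("a.txt"), "a");
        Files.writeString(tmp.resolve("sub").resolve("b.txt"), "b");

        try (Stream<Path> files = Files.walk(tmp)) {
            // Keep only regular files; directories have no bytes to upload.
            List<String> names = files.filter(Files::isRegularFile)
                    .map(p -> tmp.relativize(p).toString().replace('\\', '/'))
                    .sorted()
                    .toList();
            System.out.println(names); // [a.txt, sub/b.txt]
        }
    }
}
```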