Tags: google-cloud-platform, data-processing, data-ingestion, google-cloud-data-fusion, data-pipeline

Google Data Fusion execution error: "INVALID_ARGUMENT: Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 2048.0."


I am trying to load a simple CSV file from GCS to BigQuery using the free version of Google Data Fusion. The pipeline fails with an error that reads:

com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 2048.0.
    at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:49) ~[na:na]
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72) ~[na:na]
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60) ~[na:na]
    at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97) ~[na:na]
    at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68) ~[na:na]

The same error occurs for both the MapReduce and Spark execution engines. I would appreciate any help fixing this issue. Thanks.

Regards, KA


Solution

  • This error means that the total persistent disk capacity requested for the Dataproc cluster that Data Fusion provisions would put the project over its Compute Engine (GCE) quota. There are both project-wide and regional quotas; the documentation is here: https://cloud.google.com/compute/quotas. You can check the current limit and usage with the command shown below.

    To resolve this, request an increase for the DISKS_TOTAL_GB quota in the affected region of your GCP project (Console: IAM & Admin > Quotas).
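
    To verify the quota, you can describe the region from the command line. This is a minimal sketch assuming the gcloud CLI is installed and the cluster runs in us-central1; substitute the region your Data Fusion instance actually uses:

        # DISKS_TOTAL_GB is a regional quota; describing the region lists each
        # quota as limit/metric/usage in the default YAML output, so one line of
        # context on each side of the metric name shows the relevant numbers.
        gcloud compute regions describe us-central1 | grep -B1 -A1 'DISKS_TOTAL_GB'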
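
    Alternatively, you may be able to make the pipeline request less disk instead of raising the quota. The requested 3000 GB is consistent with a three-node cluster at 1000 GB per node, which matches the default Dataproc compute profile. Assuming your Data Fusion version's Dataproc provisioner exposes the masterDiskGB/workerDiskGB properties, a sketch of runtime arguments that would shrink the cluster's disk footprint (the 500 GB values are illustrative, not a recommendation):

        # Set these under the pipeline's Runtime Arguments before running.
        # The system.profile.properties. prefix overrides the corresponding
        # settings of the compute profile's Dataproc provisioner.
        system.profile.properties.masterDiskGB=500
        system.profile.properties.workerDiskGB=500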