Tags: python, gcloud, gsutil, google-python-api

Does the gcloud storage Python client API support parallel composite upload?


The gsutil command has options to optimize upload/download speed for large files. For example:

GSUtil:parallel_composite_upload_threshold=150M
GSUtil:sliced_object_download_max_components=8

See this page for reference.

What is the equivalent in the google.cloud.storage Python API? I couldn't find the relevant parameters in this document.

In general, do the client API and gsutil have a one-to-one correspondence in terms of functionality?


Solution

  • I think it's not natively supported.

    However (!), if you're willing to split the file yourself and upload the pieces using threading or multiprocessing, there is a compose method that you can use to assemble the parts into a single GCS object (a rough sketch follows below).

    Ironically, gsutil is itself written in Python, and it uses a library, gslib, to implement parallel uploads. You may be able to use gslib as a template.
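
    For illustration, here is a minimal sketch of that approach, not gsutil's actual implementation: split the file into fixed-size chunks, upload the chunks concurrently with a thread pool, then stitch them together with Blob.compose(). The bucket name, part-object naming scheme, chunk size, and helper function names (upload_chunk, parallel_composite_upload) are all placeholders I chose for the example.

        import os
        from concurrent.futures import ThreadPoolExecutor

        from google.cloud import storage

        CHUNK_SIZE = 150 * 1024 * 1024  # 150 MiB, mirroring the gsutil threshold above


        def upload_chunk(bucket, source_path, index, offset, length):
            """Upload one byte range of the local file as a temporary part object."""
            part = bucket.blob(f"parts/upload.part{index:04d}")
            with open(source_path, "rb") as f:
                f.seek(offset)
                part.upload_from_string(f.read(length))
            return part


        def parallel_composite_upload(bucket_name, source_path, dest_name, workers=8):
            client = storage.Client()
            bucket = client.bucket(bucket_name)
            size = os.path.getsize(source_path)

            # Build (index, offset, length) tuples describing each chunk.
            ranges = [
                (i, offset, min(CHUNK_SIZE, size - offset))
                for i, offset in enumerate(range(0, size, CHUNK_SIZE))
            ]

            # Upload the chunks concurrently.
            with ThreadPoolExecutor(max_workers=workers) as pool:
                parts = list(
                    pool.map(lambda r: upload_chunk(bucket, source_path, *r), ranges)
                )

            # compose() accepts at most 32 source objects per call; larger files
            # would need to be composed in stages.
            destination = bucket.blob(dest_name)
            destination.compose(parts)

            # Clean up the temporary part objects.
            for part in parts:
                part.delete()


        # Example usage:
        # parallel_composite_upload("my-bucket", "bigfile.bin", "bigfile.bin")

    Note that, as with gsutil's parallel composite uploads, the composed object will have a CRC32C checksum rather than an MD5 hash, so downstream tools that rely on MD5 validation may complain.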