Trying to use Python to get and iterate through all of the files inside of a Cloud Storage bucket I own. I'm using the official library, google-cloud-storage.

Using gsutil, I can run commands like gsutil ls gs://my-composer-bucket/dags/composer_utils/. Does the google-cloud-storage library offer an equivalent method to gsutil ls? I'd like to use the Python client rather than shell out to gsutil (I don't want to install and authenticate the Cloud SDK inside of a Docker image).
I've tried a few different things which have left me confused on how blobs work:
>>> dag_folder_blob = cloud_composer_bucket.blob('dags/')
>>> dag_folder_blob.exists()
True
>>> util_folder_blob = cloud_composer_bucket.blob('dags/composer_utils/')  # directory exists
>>> util_folder_blob.exists()
False
>>> util_file_blob = cloud_composer_bucket.blob('dags/composer_utils/__init__.py')
>>> util_file_blob.exists()
True
You will want to use the list_blobs method of a Bucket object. Cloud Storage has no real directories: "dags/composer_utils/" is just a prefix shared by object names, so exists() on a "folder" blob only returns True if a zero-byte placeholder object with that exact name happens to exist. Read more about listing objects in Cloud Storage.