I'm trying to download a file from Cloud Storage in my trainer application, which runs on Cloud ML Engine. However, I'm getting the following error when I try to download the file. I do have access to the Cloud Storage path.
Error:
    blob.download_to_filename(destination_file_name)
  File "/root/.local/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 482, in download_to_filename
    self.download_to_file(file_obj, client=client)
  File "/root/.local/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 464, in download_to_file
    self._do_download(transport, file_obj, download_url, headers)
  File "/root/.local/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 418, in _do_download
    download.consume(transport)
  File "/root/.local/lib/python2.7/site-packages/google/resumable_media/requests/download.py", line 101, in consume
    self._write_to_stream(result)
  File "/root/.local/lib/python2.7/site-packages/google/resumable_media/requests/download.py", line 62, in _write_to_stream
    with response:
AttributeError: __exit__
Here is the code for downloading the GCS file:
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob(key)
blob.download_to_filename(destination_file_name)
I'm not providing any GCP credentials to Client, since the trainer application can already read other files using tf.train.string_input_producer. Any help would be appreciated.
The trainer application can read those other files because tf.train.string_input_producer goes through TensorFlow's file_io module, which understands gs:// paths. This post has a few tips, but if you want to open a file directly:
from tensorflow.python.lib.io import file_io

with file_io.FileIO("gs://my_bucket/myfile", mode="r") as f:
    f.read()
There are also copy(...) and read_file_to_string(...) functions, if those fit your needs better.
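For example, here's a minimal sketch of both helpers (the gs:// object and the local /tmp/myfile destination are placeholder paths):

from tensorflow.python.lib.io import file_io

# Read the whole object into memory as a string.
contents = file_io.read_file_to_string("gs://my_bucket/myfile")

# Copy the object out of GCS to a local file; overwrite=True
# replaces the destination if it already exists.
file_io.copy("gs://my_bucket/myfile", "/tmp/myfile", overwrite=True)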