I'm using AWS Lambda to unzip files in S3 using boto3, zipfile (imported as zf), and BytesIO. When the function is triggered, the following code runs (I have already initialized the s3 client at this point; I used this example as the basis for my code). The bucketname and zip_key values are found without issue, and the s3_uri is also generated correctly:
bucketname = event["Records"][0]['s3']['bucket']['name']
zip_key = event["Records"][0]['s3']['object']['key']
s3_uri = 's3://' + bucketname + '/'

# get zip archive from S3
zip_obj = s3_client.Object(bucket_name=bucketname, key=zip_key)
# create IO buffer
buffer = BytesIO(zip_obj.get()["Body"].read())
z = zf.ZipFile(buffer)
for filename in z.namelist():
    file_info = z.getinfo(filename)
    # ignore folders generated by the Mac zip operation
    if "__MACOSX" in str(file_info):
        print("")
    else:
        s3_client.meta.client.upload_fileobj(z.open(filename), Bucket=bucketname, Key=f'output/{filename}')
When running the code, I get the following error when creating the IO buffer:
NoSuchKey: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.
Traceback (most recent call last):
File "/var/task/app.py", line 22, in lambda_handler
buffer = BytesIO(zip_obj.get()["Body"].read())
The zip file has a main folder with 4 subfolders, each containing Excel files that need to be read and processed. I noticed this works with a zip file containing a single Excel file, but it doesn't work with multiple folders with files inside them. Am I missing something when creating the IO buffer? Any help is appreciated.
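For what it's worth, a minimal local sketch (hypothetical file names, no S3 involved) suggests that BytesIO plus ZipFile handles nested folders fine on its own, so the folder structure itself shouldn't be the cause:

```python
import io
import zipfile

# Build a zip in memory with nested folders plus a __MACOSX entry,
# mimicking the structure described in the question.
buffer_out = io.BytesIO()
with zipfile.ZipFile(buffer_out, "w") as zout:
    zout.writestr("main/sub1/a.xlsx", b"data-a")
    zout.writestr("main/sub2/b.xlsx", b"data-b")
    zout.writestr("__MACOSX/main/._a.xlsx", b"junk")

# Re-open it the same way the Lambda does: from raw bytes via BytesIO.
z = zipfile.ZipFile(io.BytesIO(buffer_out.getvalue()))
kept = [n for n in z.namelist() if "__MACOSX" not in n]
print(kept)  # both nested entries are listed without issue
```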
The problem is not in the zip file; the problem is in the s3_client.Object lookup, which isn't returning an object with a Body. Possibly the URI (or key) is incorrect?
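One common cause of NoSuchKey in S3-triggered Lambdas is that the key in the event record is URL-encoded. A sketch of decoding it before the GetObject call, using a hypothetical key:

```python
from urllib.parse import unquote_plus

# Keys in S3 event notifications are URL-encoded: spaces arrive as "+"
# and special characters as %XX escapes, so passing the raw key straight
# to GetObject fails with NoSuchKey for any object name containing them.
raw_key = "main+folder/report%202021.zip"  # hypothetical event key
zip_key = unquote_plus(raw_key)
print(zip_key)  # main folder/report 2021.zip
```

Also note that `.Object(...)` is a method of `boto3.resource('s3')`, not of the low-level client returned by `boto3.client('s3')`; mixing the two APIs (as the `s3_client.meta.client` workaround in the question hints at) is another frequent source of this kind of confusion.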