I have a set of boto3-powered commands in Python that have been working great for downloading files from S3:
import boto3

s3 = boto3.resource("s3")  # Bucket() lives on the service resource, not the low-level client
bucket = s3.Bucket(bucket_name)
for obj in bucket.objects.filter(Prefix=key_dir):
    bucket.download_file(obj.key, obj.key.split("/")[-1])  # iterate over and download each object under the prefix
The trouble is, I've now tested uploading a file manually via the S3 console (GUI) to this same bucket and prefix. In the console I can see the file listed with a non-zero size, so the upload itself went fine. But when I repeat the API calls above, all of the older files come back just fine, while my brand-new file is missing from the list.
Do I need to bust a cache or something? It seems as if boto3 is only seeing long-lived files by default.
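For what it's worth, paging through the prefix with the low-level client should return every object, so something like this rough sketch (reusing bucket_name and key_dir from above, with a freshly created client) is what I'd use to double-check that nothing is being truncated or cached on my side:

import boto3

s3_client = boto3.client("s3")
paginator = s3_client.get_paginator("list_objects_v2")
# Walk every page of results under the prefix and print what the API actually returns
for page in paginator.paginate(Bucket=bucket_name, Prefix=key_dir):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])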
The behavior seems isolated to a bucket created in 2018, so I decided to just create a new bucket; once I switched to the new bucket the behavior stopped altogether. Strange, but I hope this helps others who run into the same thing.