cursor.execute("""
copy dimDate from 's3://BUCKETURI/output/dimDate.csv'
credentials 'aws_iam_role=arn:aws:iam::ACCOUNTID:role/role-s3-to-redshift-and-vice-versa'
delimiter ','
region 'us-east-1'
IGNOREHEADER 1
""")
Error output:
ProgrammingError: {'S': 'ERROR', 'C': 'XX000', 'M': "Load into table 'dimdate' failed. Check 'stl_load_errors' system table for details.", 'F': '../src/pg/src/backend/commands/commands_copy.c', 'L': '737', 'R': 'CheckMaxRowError'}
I'm pretty sure it has something to do with the credentials. I tried both the IAM role and the access keys, but it won't budge. I also checked stl_load_errors and it says the failure has something to do with the date, which has me confused. I've been trying to copy the S3 contents into my Redshift table for over an hour with no luck, so any help would be highly appreciated!
The error means the COPY statement ran, but hit data errors in the rows it tried to load.
Run this command for more information:
select * from stl_load_errors order by starttime desc
It will show the most recent load errors.
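If you are running everything through a Python cursor as in the question, here is a minimal sketch of pulling back the most useful columns. It assumes the same cursor object as in your snippet; the column names are standard STL_LOAD_ERRORS columns, and the LIMIT is arbitrary:

# Fetch the most recent load errors; err_reason and raw_field_value
# usually show exactly which value Redshift could not parse.
cursor.execute("""
    select starttime, filename, line_number, colname, type,
           raw_field_value, err_code, err_reason
    from stl_load_errors
    order by starttime desc
    limit 10
""")
for row in cursor.fetchall():
    print(row)

err_reason and raw_field_value together tell you which value in which column failed to parse, which should explain the date complaint you saw.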
From STL_LOAD_ERRORS - Amazon Redshift:
STL_LOAD_ERRORS contains a history of all Amazon Redshift load errors.
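Since the COPY in the question loads dimDate.csv, you can also narrow the results to that file; the filename filter and column list below are just illustrative:

select line_number, colname, type, raw_field_value, err_reason
from stl_load_errors
where filename like '%dimDate.csv%'
order by starttime desc;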