I am following the AWS documentation on how to transfer a DynamoDB table from one account to another. There are two steps:

1. Export the table data to an S3 bucket.
2. Import the data from S3 into a table in the destination account.
I was able to do the first step. Unfortunately, the instructions don't say how to do the second step. I have worked with Glue a couple of times, but the console UI is very user-unfriendly and I have no idea how to achieve this.
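For reference, here is roughly what the first step looked like (a boto3 sketch; the table ARN, bucket, and prefix are placeholders, and point-in-time recovery must be enabled on the source table for this call to work):

import boto3

dynamodb = boto3.client("dynamodb")

# Start a native DynamoDB export of the table to S3. The export lands
# under s3://<bucket>/<prefix>/AWSDynamoDB/<export-id>/data/.
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:<region>:<account-id>:table/<source-table>",
    S3Bucket="<export-bucket>",
    S3Prefix="<prefix>",
    ExportFormat="DYNAMODB_JSON",
)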
Can somebody please explain how to import the data from S3 into DynamoDB?
I used AWS Glue for exactly this purpose. You will need to create a new IAM role for the Glue job; it needs access to both S3 and DynamoDB.
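A rough sketch of creating that role with boto3 (the role name is made up, and the managed policies here are broader than strictly necessary; scope them down for real use):

import json

import boto3

iam = boto3.client("iam")

# Trust policy so the Glue service can assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "glue.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

iam.create_role(
    RoleName="GlueDynamoImportRole",  # hypothetical name, pick your own
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach managed policies covering Glue itself, S3 reads, and DynamoDB writes.
for policy_arn in [
    "arn:aws:iam::aws:policy/service-role/AWSGlueServiceRole",
    "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
    "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
]:
    iam.attach_role_policy(RoleName="GlueDynamoImportRole", PolicyArn=policy_arn)

With the role in place, add a custom transform node to your Glue Studio job with a script like this: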
# These two imports are already present in the script Glue Studio generates;
# they are repeated here so the snippet is self-contained.
from awsglue.transforms import ApplyMapping
from awsglue.dynamicframe import DynamicFrameCollection

def MyTransform(glueContext, dfc) -> DynamicFrameCollection:
    # Take the DynamicFrame produced by the S3 source node.
    S3bucket_node = dfc["AmazonS3_node123456789"]

    # A DynamoDB export stores every attribute as Item.<name>.<type>, so map
    # those nested fields back to plain top-level attributes.
    ApplyMapping_node2 = ApplyMapping.apply(
        frame=S3bucket_node,
        mappings=[
            ("Item.digest.S", "string", "digest", "string"),
            ("Item.locale.S", "string", "locale", "string"),
            ("Item.value.S", "string", "value", "string"),
            ("Item.translation.S", "string", "translation", "string"),
            ("Item.created_at.S", "string", "created_at", "string"),
        ],
        transformation_ctx="ApplyMapping_node2",
    )

    # Write the flattened records into the destination table.
    S3bucket_node3 = glueContext.write_dynamic_frame.from_options(
        frame=ApplyMapping_node2,
        connection_type="dynamodb",
        connection_options={
            "dynamodb.output.tableName": "<destination-dynamo-tablename>"
        },
    )

    return DynamicFrameCollection({"S3bucket_node3": S3bucket_node3}, glueContext)
Replace the "AmazonS3_node123456789" with your S3 node ID (you can find your ID in the script) and the mappings list with your table fields. Also, don't forget to replace "destination-dynamo-tablename" with your dynamo table name.