What is the easiest approach to copy data from a DDB table in one account to another? By "easiest" I mean the lowest service maintenance overhead, preferably serverless (so no scheduled jobs using Data Pipeline).
I was exploring the possibility of using DynamoDB Streams; however, this old answer mentions that it is not possible across accounts. I could not find recent documentation confirming or disproving this. Is that still the case?
Another option I was considering: update the Firehose transform Lambda (which currently manipulates the data and then inserts it into the DynamoDB table) to also publish the data to a Kinesis stream with cross-account delivery enabled, which would trigger a Lambda in the other account to process the data further as required. A rough sketch of that idea is below.
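Roughly, the transform Lambda would do something like this in addition to its current work (the stream name, record shape and partition key are just placeholders):

import json
import boto3

kinesis = boto3.client('kinesis')

def forward_to_destination(item):
    # Publish the transformed item to a stream the destination account is allowed to consume
    kinesis.put_record(StreamName='cross-account-replication-stream',  # placeholder name
                       Data=json.dumps(item).encode('utf-8'),
                       PartitionKey=str(item.get('id', 'default')))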
This should be possible with DynamoDB Streams and a Lambda trigger. You need to:

1. Create a role, e.g. DynamoDBCrossAccountRole, in the destination account with permissions to do the necessary operations on the destination DDB table (this role and the destination DDB table are in the same account).
2. Add sts:AssumeRole permission to your Lambda function's execution role, in addition to its usual CloudWatch Logs permissions, so that it can assume the cross-account role.
3. Call sts:AssumeRole from within your Lambda function and configure the DynamoDB client with the temporary credentials it returns, for example:

import boto3

# Assume the cross-account role in the destination account
client = boto3.client('sts')
sts_response = client.assume_role(RoleArn='arn:aws:iam::<999999999999>:role/DynamoDBCrossAccountRole',
                                  RoleSessionName='AssumePocRole', DurationSeconds=900)

# Create a DynamoDB resource for the destination account using the temporary credentials
dynamodb = boto3.resource(service_name='dynamodb', region_name='<region>',
                          aws_access_key_id=sts_response['Credentials']['AccessKeyId'],
                          aws_secret_access_key=sts_response['Credentials']['SecretAccessKey'],
                          aws_session_token=sts_response['Credentials']['SessionToken'])
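Putting it together, a minimal sketch of the stream-triggered handler could look like the following. It assumes the source table's stream is configured with at least NEW_IMAGE; the table name, role ARN, region and session name are placeholders:

import boto3

ROLE_ARN = 'arn:aws:iam::<999999999999>:role/DynamoDBCrossAccountRole'
DEST_TABLE = 'destination-table'  # placeholder

def get_destination_client():
    # Exchange the Lambda execution role for temporary cross-account credentials
    creds = boto3.client('sts').assume_role(RoleArn=ROLE_ARN,
                                            RoleSessionName='DdbReplication',
                                            DurationSeconds=900)['Credentials']
    return boto3.client('dynamodb', region_name='<region>',
                        aws_access_key_id=creds['AccessKeyId'],
                        aws_secret_access_key=creds['SecretAccessKey'],
                        aws_session_token=creds['SessionToken'])

def handler(event, context):
    dest = get_destination_client()
    for record in event['Records']:
        if record['eventName'] in ('INSERT', 'MODIFY'):
            # NewImage is already DynamoDB JSON, so the low-level client can write it as-is
            dest.put_item(TableName=DEST_TABLE, Item=record['dynamodb']['NewImage'])
        elif record['eventName'] == 'REMOVE':
            dest.delete_item(TableName=DEST_TABLE, Key=record['dynamodb']['Keys'])

In practice you would cache the client across invocations and refresh the credentials before DurationSeconds expires, but this shows the shape of the flow.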