I've got a simple script that uses boto3 to pull a DynamoDB table. I then need to use aws-encryption-sdk to decrypt an entry (I can't just use KMS directly through boto3, since the value was encrypted with aws-encryption-sdk).
My problem is that I'm using a named profile for boto3 to pull credentials, but I can't seem to pass that session on to the AWS Encryption SDK. If I paste the raw credential environment variables into my CLI session, then everything works.
Is there a way to set the environment variables from my active boto3 session so that the Encryption SDK will use them?
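Something like this is what I had in mind: exporting the profile's resolved credentials into the process environment so the SDK's default credential chain finds them. This is only a rough sketch ('admin' is just my profile name), and the cleaner fix turned out to be below.

import os
import boto3

# Sketch of the environment-variable workaround: copy the resolved credentials
# from the named profile into os.environ so any default botocore session
# (including the one the Encryption SDK would create) picks them up.
session = boto3.Session(profile_name='admin', region_name='us-west-2')
creds = session.get_credentials().get_frozen_credentials()

os.environ['AWS_ACCESS_KEY_ID'] = creds.access_key
os.environ['AWS_SECRET_ACCESS_KEY'] = creds.secret_key
if creds.token:
    os.environ['AWS_SESSION_TOKEN'] = creds.token
os.environ['AWS_DEFAULT_REGION'] = 'us-west-2'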
I found the issue: it turns out aws_encryption_sdk's StrictAwsKmsMasterKeyProvider accepts a botocore_session= parameter, which isn't really clear in the docs, in my opinion. Passing it the same botocore session that boto3 uses (boto3 sets the named profile on that session) makes the Encryption SDK pick up the profile's credentials.
import base64

import boto3
import botocore.session
import aws_encryption_sdk
from aws_encryption_sdk.identifiers import CommitmentPolicy

REGION = "us-west-2"

# Create one botocore session and share it between boto3 and the Encryption SDK.
# boto3.Session sets the named profile on this botocore session, so the key
# provider below ends up using the same 'admin' profile credentials.
botocore_session = botocore.session.get_session()
session = boto3.Session(profile_name='admin', region_name=REGION, botocore_session=botocore_session)

ciphertext = 'very encrypted text'  # base64-encoded Encryption SDK message

client = aws_encryption_sdk.EncryptionSDKClient(
    commitment_policy=CommitmentPolicy.FORBID_ENCRYPT_ALLOW_DECRYPT
)

# The fix: pass botocore_session= to the master key provider.
kms_prime_key = aws_encryption_sdk.StrictAwsKmsMasterKeyProvider(
    key_ids=['arn:aws:kms:us-west-2:123456789012:key/1234-acb-ffff-22222-4444'],
    botocore_session=botocore_session,
)

plaintext_bytes, encryptor_header = client.decrypt(
    source=base64.b64decode(ciphertext),
    key_provider=kms_prime_key,
)

plaintext_pass = plaintext_bytes.decode("utf-8")
print(plaintext_pass)
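For completeness, the DynamoDB pull uses the same boto3 session. A minimal sketch only: the table name, key, and attribute names below are placeholders, not from the real script.

# Sketch: fetch the encrypted attribute with the same profile-backed session.
# 'my-secrets-table', 'id', and 'encrypted_password' are hypothetical names.
dynamodb = session.resource('dynamodb')
table = dynamodb.Table('my-secrets-table')

item = table.get_item(Key={'id': 'some-id'})['Item']
ciphertext = item['encrypted_password']  # base64-encoded Encryption SDK output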