I'm currently using boto3 to get and put items into DynamoDB. I can get and put like this:
import boto3

def get_item(key):
    # key is a dict like {"partitionkey": "abc", "sortkey": "123"}
    table = boto3.resource("dynamodb").Table(DYNAMO_TABLE)
    response = table.get_item(Key=key)
    return response.get("Item")

def put_item(key, item):
    # item is a dict of data to store, e.g. {"name": "uhoh", "size": 100}
    table = boto3.resource("dynamodb").Table(DYNAMO_TABLE)
    return table.put_item(Item={**key, **item})
This works great! I pass a Python dictionary to the put_item function, and I get a Python dictionary back when I call get_item. I don't need to do any of the DynamoDB encoding stuff¹; it just works.
Now I'm looking at introducing update_item in addition to put_item. It seems I cannot pass these Python types to the update_item method in the same way; when I try, I get:
table.update_item(
    Key=my_key,
    UpdateExpression="SET #X = :x",
    ExpressionAttributeNames={"#X": "my_property"},
    ExpressionAttributeValues={":x": somedict},
)
botocore.exceptions.ParamValidationError: Parameter validation failed: Unknown parameter in ExpressionAttributeValues.:x: "my_key", must be one of: S, N, B, SS, NS, BS, M, L, NULL, BOOL
My question: is there a way to have update_item handle this transformation for me the same way put_item does? Or is there a utility that can transform back and forth between these two formats? At this point I'm thinking update_item isn't worth the extra effort and lines of code.
¹ The transformation from [{"my_key": "my_value"}] to {"L": [{"M": {"my_key": {"S": "my_value"}}}]} is handled auto-magically!
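To make the footnote concrete: if I dropped down to the low-level client, I would have to write the attribute-value wrapping myself. A rough illustration of what I'm avoiding (the "tags" attribute and its contents are made up just to show the shape):

import boto3

# Hypothetical example of the wire format I'd rather not write by hand.
boto3.client("dynamodb").put_item(
    TableName=DYNAMO_TABLE,
    Item={
        "partitionkey": {"S": "abc"},
        "sortkey": {"S": "123"},
        "name": {"S": "uhoh"},
        "size": {"N": "100"},  # numbers are sent as strings in this format
        "tags": {"L": [{"M": {"my_key": {"S": "my_value"}}}]},
    },
)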
Yes. You can use the higher-level resource client (boto3.resource("dynamodb")) to work with native Python dicts. Have a read of this blog post, which outlines the differences between the low-level client and the resource and shows how to achieve it:
https://aws.amazon.com/blogs/database/exploring-amazon-dynamodb-sdk-clients/
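For example, the Table resource you already use for get_item and put_item will serialize ExpressionAttributeValues for you, so update_item can take plain Python values. A minimal sketch, reusing the table name, key shape, and attribute names from your question (those are assumptions on my part):

import boto3

table = boto3.resource("dynamodb").Table(DYNAMO_TABLE)

# Native Python types are accepted here; no {"S": ...} / {"L": ...} wrapping needed.
table.update_item(
    Key={"partitionkey": "abc", "sortkey": "123"},
    UpdateExpression="SET #X = :x",
    ExpressionAttributeNames={"#X": "my_property"},
    ExpressionAttributeValues={":x": [{"my_key": "my_value"}]},
)

And if you ever need to convert between the two representations yourself (for example, when mixing the resource with the low-level client), boto3 ships a serializer pair in boto3.dynamodb.types:

from boto3.dynamodb.types import TypeDeserializer, TypeSerializer

serializer = TypeSerializer()
deserializer = TypeDeserializer()

wire = serializer.serialize([{"my_key": "my_value"}])
# -> {"L": [{"M": {"my_key": {"S": "my_value"}}}]}
native = deserializer.deserialize(wire)
# -> [{"my_key": "my_value"}]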