It's my first time working with AWS Lambda and DynamoDB in Python.
I have a table with only 474 records and a total size of 890.4 KB. The filtered selection is only around 380 records.
Quite small, but it still takes several seconds to query. Really too slow.
I use a global secondary index, since that's supposed to be the way to filter the collection.
import boto3
from boto3.dynamodb.conditions import Key

# Credentials hardcoded as in the original; region and keys are placeholders
dynamodb = boto3.resource('dynamodb', region_name='us-east-1',
                          aws_access_key_id='my_key',
                          aws_secret_access_key='my_key')
table = dynamodb.Table('my_table_name')

# Query the GSI for every item where venta_arge_ok == 'True'
response = table.query(
    IndexName='venta_arge_ok-index',
    KeyConditionExpression=Key('venta_arge_ok').eq('True'))
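Side note: a single query call returns at most 1 MB of data per page, so with a bigger result set the loop below (following LastEvaluatedKey) would be needed; with this table under 1 MB everything should come back in a single page. A minimal sketch, reusing the same table, index name and Key import as above:

# Collect all matching items, following pagination when the result exceeds 1 MB
items = []
kwargs = {
    'IndexName': 'venta_arge_ok-index',
    'KeyConditionExpression': Key('venta_arge_ok').eq('True'),
}
while True:
    response = table.query(**kwargs)
    items.extend(response['Items'])
    # LastEvaluatedKey is only present when there are more pages to fetch
    if 'LastEvaluatedKey' not in response:
        break
    kwargs['ExclusiveStartKey'] = response['LastEvaluatedKey']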
When I test the Lambda in the console I get these figures:
I ran the test several times and got consistent times of around 3 seconds.
What am I missing here? Thanks!
As Noel Llevares pointed out, the issue was related to the memory size (and the corresponding increase in CPU power that comes with it).
The same Lambda gave the following results for each memory size tested:
128 MB - 3 sec
256 MB - 1.5 sec
512 MB - 0.75 sec
1024 MB - 0.4 sec
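In case it helps anyone else, the memory setting can also be changed programmatically instead of through the console. A small sketch using boto3's Lambda client; the function name here is a hypothetical placeholder:

import boto3

# Raise the Lambda's memory allocation (CPU is allocated proportionally to memory)
lambda_client = boto3.client('lambda', region_name='us-east-1')
lambda_client.update_function_configuration(
    FunctionName='my_lambda_name',  # hypothetical name, replace with the real function
    MemorySize=512)                 # in MB; 128 MB is the minimum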