I'm working on a Kaggle project. Using the Python client library for BigQuery on my laptop, I can successfully download the dataset after passing my authentication credentials via the GOOGLE_APPLICATION_CREDENTIALS environment variable. As the documentation explains, this environment variable points to the location of a JSON file containing the credentials.
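Roughly what I run locally (simplified; the query here is just a placeholder, not my actual Kaggle query):

```python
from google.cloud import bigquery

# GOOGLE_APPLICATION_CREDENTIALS points at the downloaded key file,
# so the client picks up the credentials automatically.
client = bigquery.Client()
rows = client.query("SELECT 1 AS x").result()  # placeholder query
print(list(rows))
```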
Now I want to run this code on AWS Lambda using Chalice. I know Chalice supports environment variables, but I don't know how to include a JSON file inside a Chalice app and pass its location as an environment variable. Moreover, I'm not sure whether it's safe to ship the credentials as a JSON file in a Chalice app.
Does anyone have experience passing Google credentials as an environment variable to a Chalice app?
You could embed the contents of the JSON file as an environment variable in Chalice and then use the GCP Client.from_service_account_info() method to load the credentials from memory instead of from a file. However, this is not advisable: Chalice environment variables live in .chalice/config.json, so your private GCP credentials would likely end up committed to source control.
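If you do go that route anyway, a minimal sketch might look like the following (the variable name GCP_CREDENTIALS_JSON is just an example; it would be set under environment_variables in .chalice/config.json and hold the full contents of the key file):

```python
import json
import os

from google.cloud import bigquery

# GCP_CREDENTIALS_JSON (name chosen here for illustration) contains the
# raw JSON of the service account key file, set in .chalice/config.json.
info = json.loads(os.environ["GCP_CREDENTIALS_JSON"])

# Build the client directly from the in-memory credentials dict.
client = bigquery.Client.from_service_account_info(info)
```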
Might I suggest that you consider approaches other than environment variables for passing your GCP credentials. You could store the JSON object in AWS Systems Manager Parameter Store as a SecureString parameter; your Lambda function could then fetch it with the boto3 ssm.get_parameter() method when needed, as sketched below.
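A rough sketch of that approach (the parameter name is hypothetical; create the SecureString parameter ahead of time with the key file's contents as its value):

```python
import json

import boto3
from google.cloud import bigquery

# Hypothetical parameter name holding the service account key JSON.
PARAMETER_NAME = "/myapp/gcp-service-account"

def make_bigquery_client():
    ssm = boto3.client("ssm")
    # WithDecryption=True is required to read a SecureString parameter.
    response = ssm.get_parameter(Name=PARAMETER_NAME, WithDecryption=True)
    info = json.loads(response["Parameter"]["Value"])
    return bigquery.Client.from_service_account_info(info)
```

Keep in mind that the Lambda function's execution role will need ssm:GetParameter permission on that parameter (plus kms:Decrypt if you encrypt it with a customer-managed KMS key).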
You could also consider AWS Secrets Manager as a similar alternative.
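The Secrets Manager version looks much the same (the secret name is again just an example; store the key file JSON as the secret's value):

```python
import json

import boto3
from google.cloud import bigquery

# Hypothetical secret whose value is the service account key JSON.
secrets = boto3.client("secretsmanager")
secret = secrets.get_secret_value(SecretId="myapp/gcp-service-account")

info = json.loads(secret["SecretString"])
client = bigquery.Client.from_service_account_info(info)
```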