google-bigquery, google-cloud-functions, service-accounts, google-cloud-iam

Providing keyFilename of Google Client Service Account from Google Cloud Storage


To connect from a Google Cloud Function to BigQuery in a different GCP project, I am creating the BigQuery client as follows:

const {BigQuery} = require('@google-cloud/bigquery');
const options = {
    keyFilename: 'path/to/service_account.json',
    projectId: 'my_project',
  };
const bigquery = new BigQuery(options);

But instead of bundling service_account.json with my Cloud Function, I want to store the service account key in Google Cloud Storage and pass the Cloud Storage path as the keyFilename above. I couldn't find any documentation on whether a Google Cloud Storage path can be provided instead of a local path.


Solution

  • You cannot provide a Google Cloud Storage path as the keyFilename. However, assuming your function was deployed with the right permissions to read the object (the key.json file) from your bucket, you can download the file from Google Cloud Storage to the /tmp directory of your Cloud Function.

    Downloading objects

    const {Storage} = require('@google-cloud/storage');
    const {BigQuery} = require('@google-cloud/bigquery');
    
    // TODO(developer): set these before running the sample
    // const bucketName = 'my-bucket';
    // const srcFilename = 'service_account.json';
    const destFilename = '/tmp/key.json';
    
    // Creates a Cloud Storage client
    const storage = new Storage();
    
    async function downloadFile() {
      const options = {
        // The path to which the file should be downloaded, e.g. "/tmp/key.json"
        destination: destFilename,
      };
    
      // Downloads the key file to the Cloud Function's writable /tmp directory
      await storage
        .bucket(bucketName)
        .file(srcFilename)
        .download(options);
    
      console.log(
        `gs://${bucketName}/${srcFilename} downloaded to ${destFilename}.`
      );
    }
    
    async function createBigQueryClient() {
      // Make sure the key file exists on disk before handing it to BigQuery
      await downloadFile();
    
      const options = {
        keyFilename: destFilename,
        projectId: 'my_project',
      };
    
      return new BigQuery(options);
    }
    
    createBigQueryClient().catch(console.error);
    
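    Note that the BigQuery client is created only after await downloadFile() resolves; instantiating it before the download finishes would point keyFilename at a file that does not yet exist. Also keep in mind that /tmp is the writable, in-memory directory of a Cloud Function instance, so the downloaded key only lives as long as the instance does.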

    A better solution would be to store the key.json file in Google Secret Manager. Grant your Cloud Function's service account the secretmanager.secretAccessor role and access the secret from your Cloud Function.

    Accessing secret versions

    /**
     * TODO(developer): Uncomment these variables before running the sample.
     */
    // const name = 'projects/my-project/secrets/my-secret/versions/5';
    // const name = 'projects/my-project/secrets/my-secret/versions/latest';
    
    // Imports the Secret Manager library
    const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');
    
    // Instantiates a client
    const client = new SecretManagerServiceClient();
    
    async function accessSecretVersion() {
      const [version] = await client.accessSecretVersion({
        name: name,
      });
    
      // Extract the payload as a string.
      const payload = version.payload.data.toString('utf8');
    
      // WARNING: Do not print the secret in a production environment - this
      // snippet is showing how to access the secret material.
      console.info(`Payload: ${payload}`);
    }
    
    accessSecretVersion();
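
    To actually hand the secret to BigQuery, you can parse the payload and pass it as in-memory credentials rather than writing a key file to disk. Below is a minimal sketch, assuming the secret holds the full contents of service_account.json; the secret name and the createBigQueryClientFromSecret helper are illustrative placeholders.

    const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');
    const {BigQuery} = require('@google-cloud/bigquery');
    
    // Hypothetical secret name; replace with your own project, secret, and version.
    const name = 'projects/my-project/secrets/bigquery-key/versions/latest';
    
    // Instantiates a Secret Manager client
    const client = new SecretManagerServiceClient();
    
    async function createBigQueryClientFromSecret() {
      // Read the service account key JSON stored as the secret payload
      const [version] = await client.accessSecretVersion({name});
      const key = JSON.parse(version.payload.data.toString('utf8'));
    
      // Pass the parsed key as in-memory credentials, so nothing is written to disk
      return new BigQuery({
        projectId: key.project_id,
        credentials: {
          client_email: key.client_email,
          private_key: key.private_key,
        },
      });
    }
    
    createBigQueryClientFromSecret().catch(console.error);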