Tags: python, azure, azure-devops, azure-storage, azure-batch

Moving output files from Azure Batch to Data Lake


I'm following this tutorial (https://learn.microsoft.com/en-us/azure/batch/quick-run-python) to run a series of Python scripts in Azure Batch. These tasks generate a few files that I want to move to a Data Lake container in a storage account that is already linked to the Batch account where I am running the job. My question is: how can I do this without uploading the keys as part of the resource files? That is, I want the keys to never leave my local computer. Can I access them as environment variables or something like that?

Thank you for your help!


Solution

  • Use a Batch pool managed identity that has access to your ADLS account, and use that identity inside the task to perform the output/upload action, as in the sketch below. No storage keys need to be shipped with the resource files.
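
Here is a minimal sketch of what the task script could do, assuming the pool has a user-assigned managed identity attached and that identity has a data-plane role (e.g. Storage Blob Data Contributor) on the linked storage account. The client ID, storage account name, filesystem name, and file paths are placeholders for illustration:

```python
from azure.identity import ManagedIdentityCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Authenticate as the user-assigned managed identity attached to the Batch pool.
# "<managed-identity-client-id>" is a placeholder for your identity's client ID.
credential = ManagedIdentityCredential(client_id="<managed-identity-client-id>")

# ADLS Gen2 endpoint of the linked storage account ("<storageaccount>" is a placeholder).
service_client = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",
    credential=credential,
)

# Filesystem (container) and target path are assumptions for this example.
file_system_client = service_client.get_file_system_client("output")
file_client = file_system_client.get_file_client("results/output.txt")

# Upload a file produced by the task; overwrite if it already exists.
with open("output.txt", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```

Because the token is obtained from the node's managed identity at runtime, no account key or SAS ever has to leave your machine or appear in the task's resource files.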