I have created an Azure storage account. Inside this storage account, I've created two file shares and uploaded files into each of them. I would like to access these files from within an Azure DevOps pipeline.
I've searched online but have not found a resource detailing how to do this. Has anyone done this before? If so, what are the steps to read file share files from an Azure DevOps pipeline?
Thanks.
I found a solution using the Microsoft Azure File Share Storage Client Library for Python. I ran the following steps inside my Azure pipeline to connect to my file share. Below is an example that connects to the file share and lists all of its contents:
- task: UsePythonVersion@0
  displayName: 'Set Python 3.8.3'
  inputs:
    versionSpec: 3.8.3
    addToPath: true
  name: pyTools

- script: $(pyTools.pythonLocation)/bin/pip3 install azure-storage-file-share
  displayName: Install azure-storage-file-share module

- task: PythonScript@0
  displayName: Show files and directories inside of File Share
  inputs:
    scriptSource: 'inline'
    script: |
      import platform
      from azure.storage.fileshare import ShareDirectoryClient

      connection_string = "DefaultEndpointsProtocol=https;AccountName=<storage-name>;AccountKey=<access-key>==;EndpointSuffix=core.windows.net"

      # Connect to the root directory of the file share
      parent_dir = ShareDirectoryClient.from_connection_string(
          conn_str=connection_string,
          share_name="<file-share-name>",
          directory_path="")

      # List all files and directories at the root of the share
      my_list = list(parent_dir.list_directories_and_files())
      print(my_list)
      print(platform.python_version())
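
If you also need to read the contents of a specific file rather than just list the share, the same azure-storage-file-share library provides ShareFileClient. Below is a minimal sketch of how that could look, assuming the same <storage-name>, <access-key>, and <file-share-name> placeholders as above and a hypothetical file path of reports/data.csv inside the share:

import platform
from azure.storage.fileshare import ShareFileClient

connection_string = "DefaultEndpointsProtocol=https;AccountName=<storage-name>;AccountKey=<access-key>==;EndpointSuffix=core.windows.net"

# Point the client at a single file in the share (the path here is a hypothetical example)
file_client = ShareFileClient.from_connection_string(
    conn_str=connection_string,
    share_name="<file-share-name>",
    file_path="reports/data.csv")

# Download the file into the pipeline's working directory
with open("data.csv", "wb") as local_file:
    stream = file_client.download_file()
    local_file.write(stream.readall())

print(platform.python_version())

You could run this as another inline PythonScript@0 step, or save the downloaded file to $(Build.ArtifactStagingDirectory) if later pipeline steps need it.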