I am attempting to automate the activation of SSIS Integration Runtimes by running a pipeline containing a Custom Activity in Azure Data Factory.
I have set the Batch Service up with a linked storage account and have successfully started running a .ps1 file from the linked storage account. I know it finds the file OK because I can see a node running and I get an adfjob set of logs in my storage account.
The PowerShell script is a simple one-liner:
Start-AzDataFactoryV2IntegrationRuntime -Name SSIS -ResourceGroupName <RG Name> -DataFactoryName <ADF Name> -Force
However, the output log file says that it cannot find the cmdlet:
The term 'Start-AzDataFactoryV2IntegrationRuntime' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
So I take it from the log that PowerShell is available on the node but the Az module is not. I find this extremely surprising given it's an Azure Batch Service node. I've tried adding an Install-Module Az ... to the start of the script, but it appears to hang and I have no way of telling whether it is actually doing anything; in any case I cancelled it after 8 minutes because I'm pretty sure it would have installed by then.
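(For illustration, this is roughly the kind of thing I mean; I haven't shown the exact switches I used above, so treat the flags below as placeholders for a non-interactive install rather than what I actually ran.)

# Illustrative sketch only: a non-interactive module install at the top of the .ps1,
# before the Start-AzDataFactoryV2IntegrationRuntime call. Flags are assumptions.
Install-PackageProvider -Name NuGet -Force -Scope CurrentUser
Install-Module -Name Az.DataFactory -Force -AllowClobber -Scope CurrentUser
Import-Module Az.DataFactory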
I am therefore wondering: where should the Az module be installed, and how do I go about doing so?
You could install the Az module with your Batch start task in order for your task to use it.
By associating a start task with a pool, you can prepare the operating environment of its nodes. For example, you can perform actions such as installing the applications that your tasks run, or starting background processes.
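For example, a start task command line along these lines should get the module onto each node before any tasks run. This is just a sketch: installing only Az.DataFactory (which contains Start-AzDataFactoryV2IntegrationRuntime) is usually much quicker than the full Az rollup, and -Scope AllUsers assumes the start task runs with an elevated user identity, so adjust to suit your pool configuration.

# Sketch of a possible start task command line (runs when each node joins the pool).
# Elevation and module/scope choices here are assumptions, not requirements.
cmd /c powershell -NoProfile -ExecutionPolicy Bypass -Command "Install-PackageProvider -Name NuGet -Force; Install-Module -Name Az.DataFactory -Force -AllowClobber -Scope AllUsers"

Once the start task has completed on the nodes, the script run by your Custom Activity should be able to resolve Start-AzDataFactoryV2IntegrationRuntime.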