I'm following the documentation for CI/CD in Data Factory: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-improvements
I have already done this process in other accounts and it works, but when I tried to replicate it, I got this error:
```
Task : npm
Description : Install and publish npm packages, or run an npm command. Supports npmjs.com and authenticated registries like Azure Artifacts.
Version : 1.238.3
Author : Microsoft Corporation
Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/package/npm
==============================================================================
/opt/hostedtoolcache/node/18.20.4/x64/bin/npm --version
10.7.0
/opt/hostedtoolcache/node/18.20.4/x64/bin/npm config list
; "env" config from environment
userconfig = "/home/vsts/work/1/npm/741.npmrc"
; node bin location = /opt/hostedtoolcache/node/18.20.4/x64/bin/node
; node version = v18.20.4
; npm local prefix = /home/vsts/work/1/s/build
; npm version = 10.7.0
; cwd = /home/vsts/work/1/s/build
; HOME = /home/vsts
; Run `npm config ls -l` to show all defaults.
/opt/hostedtoolcache/node/18.20.4/x64/bin/npm run build export /home/vsts/work/1/s /subscriptions/SUBSCRIPTION/resourceGroups/RG/providers/Microsoft.DataFactory/factories/DATAFACTORY ArmTemplate
> build
> node node_modules/@microsoft/azure-data-factory-utilities/lib/index export /home/vsts/work/1/s /subscriptions/SUBSCRIPTION/resourceGroups/RG/providers/Microsoft.DataFactory/factories/DATAFACTORY ArmTemplate
Downloading bundle from: https://adf.azure.com/assets/cmd-api/main.js
Process cwd: /home/vsts/work/1/s/build
Bundle downloaded successfully, saved in: /home/vsts/work/1/s/build/downloads/main.js
Executing bundle...
Inserting the following arguments: export /home/vsts/work/1/s /subscriptions/$(SUBSCRIPTION)/resourceGroups/$(RG)/providers/Microsoft.DataFactory/factories/$(DATAFACTORY) ArmTemplate
Executing bundle file, full command:
node /home/vsts/work/1/s/build/downloads/main.js export /home/vsts/work/1/s /subscriptions/$(SUBSCRIPTION)/resourceGroups/$(RG)/providers/Microsoft.DataFactory/factories/$(DATAFACTORY) ArmTemplate
CmdApiApp: Initializing application.
Resource: /subscriptions/c86828e7-97bf-4d44-8693-11edaef80c32/resourceGroups/tutorialtati/providers/Microsoft.DataFactory/factories/tatidatatest
RootFolder: /home/vsts/work/1/s/
ModelService: synchronize - start
ModelService: Dynamic connector - Start registering dynamic connectors
DynamicConnectorService: Dynamic connector - Start registering connector: Dataworld
DynamicConnectorService: Dynamic connector - Finished registering connector: Dataworld
DynamicConnectorService: Dynamic connector - Start registering connector: Asana
DynamicConnectorService: Dynamic connector - Finished registering connector: Asana
DynamicConnectorService: Dynamic connector - Start registering connector: Twilio
DynamicConnectorService: Dynamic connector - Finished registering connector: Twilio
DynamicConnectorService: Dynamic connector - Start registering connector: AppFigures
DynamicConnectorService: Dynamic connector - Finished registering connector: AppFigures
DynamicConnectorService: Dynamic connector - Start registering connector: GoogleSheets
DynamicConnectorService: Dynamic connector - Finished registering connector: GoogleSheets
CmdApiApp: Initializing resource registries...
CmdApiApp: Initializing model service...
ModelService: Dynamic connector - Finished registering dynamic connectors
ModelService: _createAndFetchEntities synchronizeInternal - start
ModelService: _createAndFetchEntities synchronizeInternal - end
BaseFileResourceProviderService: populateAllResources - start
CmdApiApp: Initializing publish config service...
PublishConfigService: _getLatestPublishConfig - retrieving config file.
LocalFileClientService: Unable to list files for: integrationRuntime, error: Error: ENOENT: no such file or directory, scandir '/home/vsts/work/1/s/integrationRuntime'
LocalFileClientService: Unable to list files for: pipeline, error: Error: ENOENT: no such file or directory, scandir '/home/vsts/work/1/s/pipeline'
LocalFileClientService: Unable to list files for: dataset, error: Error: ENOENT: no such file or directory, scandir '/home/vsts/work/1/s/dataset'
LocalFileClientService: Unable to list files for: linkedService, error: Error: ENOENT: no such file or directory, scandir '/home/vsts/work/1/s/linkedService'
LocalFileClientService: Unable to list files for: trigger, error: Error: ENOENT: no such file or directory, scandir '/home/vsts/work/1/s/trigger'
Execution finished....
##[warning]Couldn't find a debug log in the cache or working directory
##[error]Error: Npm failed with return code: 1
```
My pipeline looks like this:

```yaml
# Sample YAML file to validate and export an ARM template into a build artifact
# Requires a package.json file located in the target repository

trigger:
- main #collaboration branch

pool:
  vmImage: 'ubuntu-22.04'

variables:
- group: DataFactory

steps:
# Installs Node and the npm packages saved in your package.json file in the build
- task: UseNode@1
  inputs:
    version: '18.x'
  displayName: 'Install Node.js'

- task: Npm@1
  inputs:
    command: 'install'
    workingDir: '$(Build.Repository.LocalPath)/build' #replace with the package.json folder
    verbose: true
  displayName: 'Install npm package'

# Validates all of the Data Factory resources in the repository. You'll get the same validation errors as when "Validate All" is selected.
# Enter the appropriate subscription and name for the source factory. Either the "Validate" or "Validate and Generate ARM template" option is required to perform validation. Running both is unnecessary.
- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.Repository.LocalPath)/build' #replace with the package.json folder
    customCommand: 'run build validate $(Build.Repository.LocalPath) /subscriptions/$(Subscription)/resourceGroups/$(ResourceGroup)/providers/Microsoft.DataFactory/factories/$(DataFactory)'
  displayName: 'Validate'

# Validate and then generate the ARM template into the destination folder, which is the same as selecting "Publish" from the UX.
# The ARM template generated isn't published to the live version of the factory. Deployment should be done by using a CI/CD pipeline.
- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.Repository.LocalPath)/build' #replace with the package.json folder
    customCommand: 'run build export $(Build.Repository.LocalPath) /subscriptions/$(Subscription)/resourceGroups/$(ResourceGroup)/providers/Microsoft.DataFactory/factories/$(DataFactory) "ArmTemplate"'
  displayName: 'Validate and Generate ARM template'

# Publish the artifact to be used as a source for a release pipeline.
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.Repository.LocalPath)/build/ArmTemplate' #replace with the generated ARM template folder
    artifact: 'ArmTemplates'
    publishLocation: 'pipeline'
```
Here is the structure of the folders: Files
And the raw file: Raw File
After comparing your full debug log with my debug log, I found the difference.
Your log:

```
2024-08-29T14:31:16.1491161Z CmdApiApp: Publishable resource count: 0
2024-08-29T14:31:16.1491388Z CmdApiApp: Publishable parameters count: 0
2024-08-29T14:31:16.1492466Z ERROR === CmdApiApp: Failed to export ARM template. Error: {"stack":"Error: No resource found in specified input path: /home/vsts/work/1/s. Please set correct path and try again.\n at Iw.<anonymous> (/home/vsts/work/1/s/build/downloads/main.js:2:16164635)\n at Generator.next (<anonymous>)\n at s (/home/vsts/work/1/s/build/downloads/main.js:2:13876326)","message":"No resource found in specified input path: /home/vsts/work/1/s. Please set correct path and try again."}
```
My log:

```
2024-08-29T05:14:02.9684555Z CmdApiApp: Publishable resource count: 2
2024-08-29T05:14:02.9685098Z Validator: Start validation for: pipeline - pipeline1
2024-08-29T05:14:02.9685401Z Validator: Start validation for: pipeline - pipeline2
2024-08-29T05:14:02.9685609Z CmdApiApp:
2024-08-29T05:14:02.9685778Z Validation finished. No errors found.
```
The difference in the folder structure:
As you can see from the log and the screenshot, I added two pipelines to my Data Factory for testing, but your Data Factory appears to be empty. There is nothing in the Data Factory that can be exported, so the pipeline failed.
When I created a new, empty Data Factory, I could reproduce the same error in my test pipeline. Please add something to your Data Factory and try again.
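If you want the build to fail with a clearer message than the npm error, you could add a small pre-flight check before the npm tasks. This is just a sketch, not part of the official sample; the folder names are the same ones the utilities package scans in your log (`pipeline`, `dataset`, `linkedService`, `trigger`, `integrationRuntime`), and `check_adf_resources` is a hypothetical helper name:

```shell
#!/bin/sh
# Hypothetical pre-flight check: succeeds (exit 0) only if the repo root
# contains at least one file in a resource folder the export step scans.
check_adf_resources() {
    root="$1"
    for d in pipeline dataset linkedService trigger integrationRuntime; do
        # A missing or empty folder contributes nothing; any file counts.
        if [ -d "$root/$d" ] && [ -n "$(ls -A "$root/$d" 2>/dev/null)" ]; then
            return 0
        fi
    done
    echo "No Data Factory resources found under $root; publish at least one resource first." >&2
    return 1
}
```

In the pipeline this could run as a `script:` step with `$(Build.Repository.LocalPath)` as the argument, before the "Install npm package" task.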