I am using the SQL Deploy DACPAC community contributed step to deploy my dacpac to the server within Octopus.
It has been set up correctly and had been working fine until the situation below occurred.
I have a situation where I am dropping columns, but the deployment keeps failing because rows are detected in the affected tables. I am attempting to pass /p:BlockOnPossibleDataLoss=false
in the "Additional deployment contributor arguments" field, but it seems to be ignored.
Can anyone guide me to what is wrong?
After multiple updates by the original author, this issue was still not resolved, and the parameter has since been removed entirely as of version 11.
Initially, I added a pre-deployment script that copied the data out of each table that was expected to fail and then deleted it, allowing the table schema to update as normal, together with a post-deployment script that re-inserted the data into the new structure. The problem with this approach was that every table where data could be lost required a pre-deployment and post-deployment script, even when the data loss was intentional and the scripts weren't really needed.
Finally, I got around this by duplicating the community step "SQL - Deploy DACPAC" (https://library.octopus.com/step-templates/58399364-4367-41d5-ad35-c2c6a8258536/actiontemplate-sql-deploy-dacpac), saving it as a copy from within Octopus. I then went into the code, into the function Invoke-DacPacUtility, and added the following:

- `[bool]$BlockOnPossibleDataLoss` to the parameter list;
- `Write-Debug (" Block on possible data loss: {0}" -f $BlockOnPossibleDataLoss)` to the list of debug output;
- `if (!$BlockOnPossibleDataLoss) { $dacProfile.DeployOptions.BlockOnPossibleDataLoss = $BlockOnPossibleDataLoss; }` to the list of deployment options.

Then I went into the step template's parameter list and added a matching BlockOnPossibleDataLoss checkbox parameter.
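To show how the three additions fit together, here is a condensed sketch of the modified function; apart from the three lines taken from the changes above, the structure is simplified and the elided parts stand in for the step template's existing code:

```powershell
function Invoke-DacPacUtility {
    param (
        # ... existing step-template parameters ...
        [bool]$BlockOnPossibleDataLoss    # addition 1: new parameter
    )

    # ... existing code that loads Microsoft.SqlServer.Dac and builds $dacProfile ...

    # Addition 2: log the incoming value with the other debug output.
    Write-Debug (" Block on possible data loss: {0}" -f $BlockOnPossibleDataLoss)

    # Addition 3: only override the profile when the checkbox is unticked,
    # since BlockOnPossibleDataLoss defaults to true in DacDeployOptions.
    if (!$BlockOnPossibleDataLoss) {
        $dacProfile.DeployOptions.BlockOnPossibleDataLoss = $BlockOnPossibleDataLoss;
    }

    # ... existing publish call that uses $dacProfile ...
}
```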
With this, I am able to change the value of this parameter with the checkbox when using the step in the project's deployment process.