Tags: c#, azure, powershell, azure-devops, azure-cli

I cannot extract .sql CREATE TABLE scripts for each table in my Azure SQL Database into my Azure Repo


As the question says, how can I do that? Below is the script I am currently using, which works via the generation of a .dacpac file, but I've seen that this could instead be done with a PowerShell script rather than AzureCLI@2?
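Something like this is what I had in mind for the PowerShell route (an untested sketch; the AzurePowerShell@5 inputs and the Get-AzAccessToken call are my assumptions, and the sqlpackage arguments are the same as in the AzureCLI version below):

- task: AzurePowerShell@5
  inputs:
    azureSubscription: '$(azureSubscription)'
    azurePowerShellVersion: 'LatestVersion'
    ScriptType: 'InlineScript'
    Inline: |
      # Get an access token for Azure SQL (assumes the Az module is available on the agent)
      $accessToken = (Get-AzAccessToken -ResourceUrl "https://database.windows.net/").Token
      # Same sqlpackage extract call as in the bash version
      ./sqlpackage/sqlpackage /Action:Extract /SourceServerName:$(sqlServerName) /SourceDatabaseName:$(sqlDatabaseName) /TargetFile:$(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac /AccessToken:$accessToken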

I tried this AzureCLI@2 script:

- task: AzureCLI@2
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      set -e
      echo "Starting DACPAC extraction..."     
      # Obtain an access token
      echo "Obtaining access token..."
      accessToken=$(az account get-access-token --resource https://database.windows.net/ --query accessToken --output tsv)
      echo "Access token obtained."
      # Check if the access token was retrieved
      if [ -z "$accessToken" ]; then
        echo "Failed to obtain access token. Exiting."
        exit 1
      fi
      # Create directories if they don't exist
      mkdir -p $(Build.SourcesDirectory)/temp/DACPAC
      mkdir -p $(Build.SourcesDirectory)/some/ran/dir/
      # Extract DACPAC file to a temporary directory
      echo "Extracting DACPAC file..."
      ./sqlpackage/sqlpackage /Action:Extract /SourceServerName:$(sqlServerName) /SourceDatabaseName:$(sqlDatabaseName) /TargetFile:$(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac /AccessToken:$accessToken
      # Unzip the extracted DACPAC file
      echo "Unzipping DACPAC file..."
      unzip -o $(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac -d $(Build.SourcesDirectory)/temp/DACPAC     
      # Move extracted schema files to the target directory, replacing existing files
      echo "Replacing existing .sql files with updated schema..."
      find $(Build.SourcesDirectory)/temp/DACPAC -name '*.sql' -exec cp -v {} $(Build.SourcesDirectory)/some/ran/dir/ \; 2>error.log      
      echo "Schema extraction and replacement completed."
      echo "Listing files in target directory..."
      ls -l $(Build.SourcesDirectory)/some/ran/dir/  
      ls -l $(Build.SourcesDirectory)/temp/DACPAC
      # Commit and push the changes
      echo "Committing and pushing changes..."
      cd $(Build.SourcesDirectory)
      git config --global user.email "$(sqlAdminLogin)"
      git config --global user.name "Name"
      git add --all
      git status
      git commit -m "Update schema files from DACPAC extraction" || echo "No changes to commit"
      git push https://$(azureDevOpsPat)@dev.azure.com/my/git/repo/here HEAD:$(Build.SourceBranch) --force
      ls -l $(Build.SourcesDirectory)/some/ran/dir/
      echo "Changes committed and pushed."

I was expecting it to commit the files to the designated location, but it just reports a detached HEAD. Trying to switch branches with git checkout, or creating a temporary branch, doesn't work either.
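For illustration, this is roughly the temporary-branch variant I tried in place of the plain commit/push (the branch name is just a placeholder); it behaves the same way for me:

    cd $(Build.SourcesDirectory)
    # Create a temporary branch from the detached HEAD and commit onto it
    git checkout -b schema-update-temp
    git add --all
    git commit -m "Update schema files from DACPAC extraction" || echo "No changes to commit"
    # Push the temporary branch back onto the source branch
    git push https://$(azureDevOpsPat)@dev.azure.com/my/git/repo/here schema-update-temp:$(Build.SourceBranch)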

Update #1

This is what my pipeline shows for the above-mentioned task:

HEAD detached at fc510bf
nothing to commit, working tree clean
HEAD detached at fc510bf
nothing to commit, working tree clean
No changes to commit
Everything up-to-date

It does, however, show each script in the directory listing:

-rw-r--r-- 1 vsts docker  995 Jul 24 06:42 Surgeons.sql

Update #2

The following path works, but I added it to my .gitignore, as I only want the .sql scripts for each table in the repo:

$(Build.SourcesDirectory)/temp/DACPAC
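For reference, the entry I added to .gitignore is just the temp folder, relative to the repo root (the repo is checked out at $(Build.SourcesDirectory) by default):

    # ignore the temporary DACPAC extraction output
    temp/DACPAC/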

Update #3

The script currently extracts this definition:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Contact Report](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [Report Date] [datetime] NOT NULL,
    [Report Description] [text] NOT NULL,
    [Customer ID] [varchar](50) NOT NULL,
    [Location] [varchar](100) NOT NULL,
    [Follow Up] [varchar](100) NOT NULL,
    [Feedback] [text] NOT NULL,
    [Notes] [text] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
ALTER TABLE [dbo].[Contact Report] ADD PRIMARY KEY CLUSTERED 
(
    [Id] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
GO

But it should pull this script (as used in Azure Data Factory):

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Contact Report](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [Report Date] [datetime] NOT NULL,
    [Report Description] [text] NOT NULL,
    [Customer ID] [varchar](50) NOT NULL,
    [Location] [varchar](100) NOT NULL,
    [Follow Up] [varchar](100) NOT NULL,
    [Feedback] [text] NOT NULL,
    [Notes] [text] NOT NULL,
    [Staff ID] [varchar](10) NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
ALTER TABLE [dbo].[Contact Report] ADD PRIMARY KEY CLUSTERED 
(
    [Id] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
GO
ALTER TABLE [dbo].[Contact Report]  WITH CHECK ADD  CONSTRAINT [FK_ContactReport_Staff] FOREIGN KEY([Staff ID])
REFERENCES [dbo].[Staff] ([Staff ID])
GO
ALTER TABLE [dbo].[Contact Report] CHECK CONSTRAINT [FK_ContactReport_Staff]
GO

Update #4

   echo "Extracting DACPAC file..."
   ./sqlpackage/sqlpackage /Action:Extract /SourceServerName:$(sqlServerName) /SourceDatabaseName:$(sqlDatabaseName) /TargetFile:$(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac /AccessToken:$accessToken
   # Unzip the extracted DACPAC file
   echo "Unzipping DACPAC file..."
   unzip -o $(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac -d $(Build.SourcesDirectory)/temp/DACPAC     
   # Move extracted schema files to the target directory, replacing existing files
   echo "Replacing existing .sql files with updated schema..."
   find $(Build.SourcesDirectory)/temp/DACPAC -name '*.sql' -exec cp -v {} $(Build.SourcesDirectory)/Database/dbo/Tables/ \; 2>error.log 

Solution

  • From the Bash script you shared, the cause of the issue could be that the following command only extracts a single .dacpac file; there are no .sql files in its output.

    ./sqlpackage/sqlpackage /Action:Extract /SourceServerName:$(sqlServerName) /SourceDatabaseName:$(sqlDatabaseName) /TargetFile:$(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac /AccessToken:$accessToken
    

    To solve this issue, we need to add the property /p:ExtractTarget=File to the sqlpackage command. If you don't define the ExtractTarget property, it is set to DacPac by default.

    For example:

    ./sqlpackage/sqlpackage /Action:Extract /SourceServerName:$(sqlServerName) /SourceDatabaseName:$(sqlDatabaseName) /TargetFile:$(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).dacpac /AccessToken:$accessToken /p:ExtractTarget=File
    

    In this case, the output will be the schema as one or more .sql files instead of only a .dacpac file (ExtractTarget=File writes a single script; the folder-based targets described in the doc below write one file per object).

    Then you can copy the .sql files to the target folder and push them to your Azure Repo.
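    As a rough sketch, reusing the paths from your Update #4 and assuming that with ExtractTarget=File the /TargetFile path is where the single .sql script ends up (worth double-checking in the doc below), the extract-and-copy part could look like this; the unzip step should no longer be needed:

    # Extract the schema as a T-SQL script (note the .sql target path)
    ./sqlpackage/sqlpackage /Action:Extract /SourceServerName:$(sqlServerName) /SourceDatabaseName:$(sqlDatabaseName) /TargetFile:$(Build.SourcesDirectory)/temp/DACPAC/$(sqlDatabaseName).sql /AccessToken:$accessToken /p:ExtractTarget=File
    # Copy the generated .sql file(s) into the folder that is committed to the repo
    find $(Build.SourcesDirectory)/temp/DACPAC -name '*.sql' -exec cp -v {} $(Build.SourcesDirectory)/Database/dbo/Tables/ \;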

    For more detailed info, you can refer to this doc: SqlPackage Extract parameters and properties