Tags: node.js, sql-server, typescript, azure-devops, sql-server-2017

Azure DevOps Build Task executing SQL script aborts after a number of sp_rename calls


I want to build an Azure DevOps build task that executes a list of SQL scripts against a SQL Server 2017 database. I followed this tutorial to build the task: https://learn.microsoft.com/en-us/azure/devops/extend/develop/add-build-task?view=azure-devops

The task generally runs successfully, and I have already executed various scripts against my local database (SQL Server 2017 Express). I'm using the npm package "mssql" with the "msnodesqlv8" driver (the native SQL Server driver for mssql) to connect and execute the scripts directly from Node.js.

// Split the script into batches at GO separators (assumes CRLF line endings)
// and run each batch on the pool.
async function executeBatches(script: string, pool: sql.ConnectionPool) {
    const batches = script.split("\r\nGO");
    for (const batch of batches) {
        console.log("Executing script:", batch);
        await pool.batch(batch);
    }
}
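
For context, this is roughly how the pool is created and the helper above is called; a minimal sketch, assuming a local SQL Server Express instance and SQL authentication (server name, database and credentials are placeholders):

import fs = require("fs");
import sql = require("mssql/msnodesqlv8");

async function runScriptFile(scriptPath: string) {
    // Placeholder connection settings for a local SQL Server Express instance.
    const pool = new sql.ConnectionPool({
        server: "localhost\\SQLEXPRESS",
        database: "MyDatabase",
        user: "sa",
        password: "start"
    });
    await pool.connect();
    try {
        // Strip a possible BOM, then run the script batch by batch.
        const script = fs.readFileSync(scriptPath, "utf-8").replace(/^\uFEFF/, "");
        await executeBatches(script, pool);
    } finally {
        await pool.close();
    }
}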

Now I found a script that fails in a very strange way. The script in question executes a series of renames across four tables inside a transaction, like this:

BEGIN TRANSACTION
EXEC sp_rename 'table' , 'newTable'
EXEC sp_rename 'newTable.column', 'newColumn', 'COLUMN'
(repeat for several columns)
EXEC sp_rename 'dbo.PK_table', 'PK_newTable'

(repeat for 3 more tables)
COMMIT

In SQL Server Management Studio the script executes correctly. In the DevOps task, however, it aborts after about 18 sp_rename calls. No error is thrown and the transaction is left open. The client keeps running (since it received no error), and after a few more queries SQL Server initiates a rollback, which of course undoes all changes made since the script started.

I reordered the statements in the script and tried commenting out some lines, but it always aborts after about 18 sp_rename calls. When I remove enough lines so that there are 18 or fewer sp_rename calls, the task runs the script completely and commits the changes (it doesn't matter which lines I remove).

When I remove the transaction, it still executes all renames up to that magic number and then aborts the script, leaving the implicit transaction from the last statement open, so all changes are still rolled back after a few more queries.

I ran SQL Server Profiler and it shows StmtStarting for a rename and then BatchCompleted with error "2 - Abort", but there is no other error or reason shown for why the batch was aborted.

The system_health session shows 2 errors when the script is executed:

Error #1

A connectivity error with tds_flags "DisconnectDueToReadError, NetworkErrorFoundInInputStream, NormalDisconnect" and tds_input_buffer_error 109.

Error #2

A security error with error_code 5023 (which means "The group or resource is not in the correct state to perform the requested operation.")

Searching for these errors online gave no usable results: the hits either have no solution or are related to login problems, which I don't believe applies here since I can execute other scripts just fine.

I also checked the encoding and verified that the script is read correctly via Node's "fs" library.

Any help or pointers to where I could find a cause for this issue would be greatly appreciated.

EDIT: I got around to building a smaller example with just msnodesqlv8.

import tl = require("azure-pipelines-task-lib/task");
import fs = require("fs");
import path = require("path");
import util = require("util");

import { SqlClient } from "msnodesqlv8";
// tslint:disable-next-line: no-var-requires
const sqlClient: SqlClient = require("msnodesqlv8");
const open = util.promisify(sqlClient.open);
const query = util.promisify(sqlClient.query);

async function run() {
    try {
        const scriptDirectory = tl.getInput("ScriptDirectory", true) ?? "";
        const connectionString = "Driver={ODBC Driver 13 for SQL Server};Server={.\\SQLEXPRESS};Uid={sa};Pwd={start};Database={MyDatabase};Encrypt={yes};TrustServerCertificate={yes}";
        const scriptsDir = fs.readdirSync(scriptDirectory);
        const con = await open(connectionString);
        const close = util.promisify(con.close);
        try {
            const conQuery = util.promisify(con.query);
            for (const file of scriptsDir) {
                console.log("Executing:", file);
                const script = readFileWithoutBom(path.join(scriptDirectory, file));
                console.log("Executing script:", script);
                // await query(connectionString, { query_str: script, query_timeout: 120 });
                await conQuery({ query_str: script, query_timeout: 120 });

                const insert = `INSERT INTO AppliedDatabaseScript (ScriptFile, DateApplied) VALUES ('${file}', GETDATE())`;
                console.log("Executing script:", insert);
                // await query(connectionString, { query_str: insert });
                await conQuery({ query_str: insert });
            }
        } finally {
            await close();
        }
    }
    catch (err) {
        console.error(err);
        tl.setResult(tl.TaskResult.Failed, err.message);
    }
}

// Strip BOM. SQL Server won't execute certain scripts with BOM.
function readFileWithoutBom(filePath: string) {
    return fs.readFileSync(filePath, "utf-8").replace(/^\uFEFF/, "");
}

run();

The behavior is still the same. I tried both a shared connection and a separate connection for each query. It rolls back everything in the same connection and continues as if no error occurred. I also played around with the query timeout, but it has no bearing on the error at all.
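
For clarity, the two variants look roughly like this; a sketch that reuses the promisified query/conQuery and the connection string from the snippet above (both end in the same abort):

// Sketch only: query, conQuery and connectionString come from the snippet above.
async function runStatementBothWays(statement: string) {
    // Variant 1: separate connection per statement - the module-level query
    // (the commented-out calls above) lets msnodesqlv8 open its own connection.
    await query(connectionString, { query_str: statement, query_timeout: 120 });

    // Variant 2: shared connection - query promisified from the connection
    // returned by open().
    await conQuery({ query_str: statement, query_timeout: 120 });
}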


Solution

  • I managed to get it working by switching to the non-native driver tedious (also supported by mssql). It seems the native SQL Server driver for Node.js is not working correctly.

    In case anyone is having problems connecting with tedious (since I couldn't find proper documentation online), these are the basic configuration options:

    {
        user: "username",
        password: "password",
        server: "hostname\\instancename",
        database: "database",
        port: 1433
    }
    

    You need to make sure the SQL Server Browser is running and the instance is configured for remote access (even when connecting via localhost) by enabling TCP/IP in the SQL Server Configuration Manager. A minimal connection sketch using these options is shown below.
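
    For reference, this is roughly how the connection could look with mssql on top of tedious, reusing the configuration above and the executeBatches helper from the question (all values are placeholders):

        import sql = require("mssql");

        async function runScriptWithTedious(script: string) {
            // Configuration from above; mssql splits the instance name out of
            // the server value before handing the config to tedious.
            const config: sql.config = {
                user: "username",
                password: "password",
                server: "hostname\\instancename",
                database: "database",
                port: 1433
            };
            const pool = await new sql.ConnectionPool(config).connect();
            try {
                await executeBatches(script, pool); // helper from the question
            } finally {
                await pool.close();
            }
        }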