azure-sql-database

Execute SQL Insert statement into Azure SQL from a file


I have a file with more than 300,000 INSERT command lines. I can use SSMS to copy some of those lines and insert my data, but I'm wasting my time loading it in batches of 10,000 rows.

Can I execute all the INSERT commands in my text file against my Azure SQL database?

Thanks


Solution

  • You can use a script like the one below:

    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    IF OBJECT_ID('Sample_SalesTable', 'U') IS NULL
    BEGIN
        CREATE TABLE Sample_SalesTable (
            Column1 NVARCHAR(50),
            Column2 INT,
            Column3 DECIMAL(10, 2)
        );
    END

    INSERT INTO Sample_SalesTable (Column1, Column2, Column3)
    VALUES
        ('Value 1', 1, 1.23),
        ('Value 2', 2, 2.46),
        ('Value 3', 3, 3.69),
        ('Value 4', 4, 4.92),
        ('Value 5', 5, 6.15),
        ('Value 6', 6, 7.38),
        ('Value 7', 7, 8.61),
        ('Value 8', 8, 9.84),
        ('Value 9', 9, 11.07),
        ('Value 10', 10, 12.30);

    COMMIT;
    SET NOCOUNT OFF;
    
    • The SET NOCOUNT ON statement is used to improve performance. When it is set to ON, the server stops returning the number of rows affected by each statement as part of the result set.

    • The BEGIN TRANSACTION statement marks the start of a transaction. Transactions are used to group a set of SQL statements together so that they either all succeed or all fail as a unit (see the TRY...CATCH sketch after this list).

    • The COMMIT statement is used to commit a transaction. When a transaction is committed, all the changes made by the transaction are made permanent and become visible to other users or processes accessing the data.

    • After the transaction is committed, the script sets NOCOUNT back to 'OFF' using the SET NOCOUNT OFF statement.
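
    If any statement in the batch fails, you will usually want the whole transaction rolled back rather than left as a partial load. Below is a minimal sketch of the same pattern with TRY...CATCH added; it reuses Sample_SalesTable from above, and the extra rows are made up for illustration.

    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        -- All rows in this batch succeed or fail together
        INSERT INTO Sample_SalesTable (Column1, Column2, Column3)
        VALUES
            ('Value 11', 11, 13.53),
            ('Value 12', 12, 14.76);

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- Undo the partial load and re-raise the original error
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        THROW;
    END CATCH
    SET NOCOUNT OFF;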

    However, inserting 300,000 rows with a script like this can be time-consuming and may not be the most efficient approach.
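
    Note also that a single INSERT ... VALUES table value constructor accepts at most 1,000 rows, so a 300,000-row load needs at least 300 INSERT statements. A common pattern, sketched below with made-up rows, is to split the file into batches separated by GO (a batch separator understood by SSMS and sqlcmd, not a T-SQL statement):

    -- Batch 1: up to 1,000 rows per INSERT ... VALUES
    INSERT INTO Sample_SalesTable (Column1, Column2, Column3)
    VALUES
        ('Value 1', 1, 1.23),
        ('Value 2', 2, 2.46);
    GO
    -- Batch 2: the next chunk of the file
    INSERT INTO Sample_SalesTable (Column1, Column2, Column3)
    VALUES
        ('Value 1001', 1001, 1232.23);
    GO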

    As you are using Azure SQL Database, one option is to use the BULK INSERT statement.

    If you can prepare a CSV file that contains the table data, you can easily insert it using the code below.

    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'test1223$';

    CREATE DATABASE SCOPED CREDENTIAL blobcred
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';

    CREATE EXTERNAL DATA SOURCE externaldatasource
    WITH (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://gen2june27.blob.core.windows.net',
        CREDENTIAL = blobcred
    );

    CREATE TABLE [dbo].[SalesTable02]
    (
        ID INT,
        Column1 VARCHAR(50),
        Column2 INT,
        Column3 DECIMAL(10, 2)
    );

    BULK INSERT SalesTable02
    FROM 'folder1/salesallrowsupdated.csv'
    WITH (DATA_SOURCE = 'externaldatasource', FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
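
    For reference, FIRSTROW = 2 tells the loader to skip one header line, and FIELDTERMINATOR = ',' means comma-separated values matching the four columns of SalesTable02. Assuming the file follows that layout, salesallrowsupdated.csv would start something like this (the sample rows are made up):

    ID,Column1,Column2,Column3
    1,Value 1,1,1.23
    2,Value 2,2,2.46
    3,Value 3,3,3.69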
    

    To execute this, you will first need to upload the CSV file containing the data to ADLS Gen2.

    For example:

    1. I have uploaded a CSV file to ADLS Gen2. This CSV file has 50,000 records.

    2. To run CREATE EXTERNAL DATA SOURCE you will need a SAS token, which you can generate from the storage account itself. On the Shared access signature page, the Allowed resource types checkboxes are unchecked by default; check them all, scroll down, and click the Generate SAS button.

    Once you click the button, the portal shows the generated token. The SAS token looks something like ?sv=2022-11-02..........; remove the leading '?' and save the rest of the token in a notepad, then paste it as the SECRET:

    CREATE DATABASE SCOPED CREDENTIAL blobcred
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = 'xxxxxxxYOURSASTOKENxxxxxxxxxxxxxxx';

    3. When you run CREATE EXTERNAL DATA SOURCE, replace the LOCATION value with your own storage account name: 'https://yourstorageaccount.blob.core.windows.net'.

    The BULK INSERT statement loads the data from the file directly into the table, which is generally faster and more efficient for large data sets. I used the same code shown above to load 50,000 records into the Azure SQL database.
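
    Once the load finishes, a quick row count confirms that all records arrived; this query simply assumes the SalesTable02 table created above:

    -- Expect 50,000 for the example file used here
    SELECT COUNT(*) AS LoadedRows FROM [dbo].[SalesTable02];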


    This approach is more efficient for larger datasets and stays fast even for more than 300,000 records.