Tags: sql, sql-server, bcp

SQL Server BCP Utility incorrect data import


I am using the BCP utility to import a .csv file. One of the columns has a value that contains a comma, but the value is enclosed in double quotes. BCP does not recognize the quotes and imports the data into my table incorrectly.

Here is a repro:

CREATE TABLE [dbo].[test_temptable]
(
    --[ID] [int] IDENTITY(1,1) NOT NULL,
    [Column1] [varchar](100) NULL,
    [Column2] [varchar](100) NULL,
    [Column3] [varchar](100) NULL,

    /*PRIMARY KEY CLUSTERED ([ID] ASC)
*/
) 
ON [PRIMARY]
GO

Test data in inputData.csv file:

value1,"test,test",value3

When I run the bcp command, it completes without errors:

bcp "testDB.dbo.test_temptable" in  c:\temp\test\inputData.csv -c -t,  -T -S "testDBServer" -e c:\temp\test\error.csv

1 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total     : 1      Average : (1000.00 rows per sec.)

However, the data in the table is incorrect:

(screenshot of the table: the quoted value is split at the embedded comma, so Column2 and Column3 hold the wrong values)

Column2 should contain test,test and Column3 should contain value3.

What am I doing wrong? Any pointers? Thanks all.


Solution

  • Long story short: it won't work when you have commas both in your data and as the delimiter. BCP treats every comma as a field terminator and the quotes as ordinary data. Even though Excel handles quoted fields, BCP isn't as friendly:

    sql server Bulk insert csv with data having comma

    I've had to do this in the past and you have some options:

    • Open the .csv file in Excel and save it as a tab-delimited file (which makes it a .txt), then run this to indicate that you are now using a tab as the delimiter:

      bcp "testDB.dbo.test_temptable" in c:\temp\test\inputData.csv -c -t"\t" -T -S "testDBServer" -e c:\temp\test\error.csv

      Of course, if you have a tab in your data, you hit the same problem. But for the small example you've provided, this does work.

    • You could try using BULK INSERT from inside SQL, which on newer versions can handle the quoted field directly (see the sketch after this list).

    • You could look into using SSIS or DTS packages, but honestly I try to stay away from that extra overhead/complexity.
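
If you are on SQL Server 2017 or later, the BULK INSERT route can parse the quoted field directly via FORMAT = 'CSV' and FIELDQUOTE. A minimal sketch, assuming the same file path as in the question and that the path is readable by the SQL Server service account (not just your local machine):

-- Requires SQL Server 2017+; FORMAT = 'CSV' and FIELDQUOTE
-- are not available in earlier versions.
BULK INSERT dbo.test_temptable
FROM 'c:\temp\test\inputData.csv'
WITH
(
    FORMAT          = 'CSV',   -- CSV parser that understands quoted fields
    FIELDQUOTE      = '"',     -- double quotes enclose fields containing commas
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'     -- try '0x0a' if the line endings give trouble
);

With this, value1,"test,test",value3 lands as value1 / test,test / value3. On older versions you are back to the tab-delimiter or SSIS options above.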