sql, powershell, bcp

BCP update database table based on output from PowerShell


I have 4 files with the same CSV header, as follows:

Column1,Column2,Column3,Column4 


But I only require the data from Column2, Column3 and Column4 to import into the SQL database using BCP. I am using PowerShell to select the columns I want and then import the required data using BCP, but my PowerShell script runs with no errors and no data is inserted into my database table. How do I set up BCP so that it imports the output from PowerShell into the database table? Here is my PowerShell script:

$filePath = Get-ChildItem -Path 'D:\test\*' -Include $filename
$desiredColumn = 'Column2','Column3','Column4'

foreach($file in $filePath)
{
  write-host $file
  $test = import-csv $file | select $desiredColumn
  write-host $test
  $action = bcp <myDatabaseTableName> in $test -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}



This is the output from the PowerShell script:

D:\test\sample1.csv
@{column2=111;column3=222;column4=333} @{column2=444;column3=555;column4=666}
D:\test\sample2.csv
@{column2=777;column3=888;column4=999} @{column2=aaa;column3=bbb;column4=ccc}

Solution

  • First off, you can't update a table with bcp. It is used to bulk load data: it will either insert new rows or export existing data into a flat file. Changing existing rows, usually called updating, is out of scope for bcp. If that's what you need, you need another tool. Sqlcmd works fine, and PowerShell has Invoke-Sqlcmd for running arbitrary TSQL statements.
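
    If updating existing rows really is the goal, a rough Invoke-Sqlcmd sketch could look like the one below. It reuses $test from the question; the SqlServer module, the placeholder table/server names and the choice of Column2 as the key column are assumptions, not something from the original post.

    # Requires the SqlServer module: Install-Module SqlServer
    Import-Module SqlServer

    foreach($row in $test){
      # Build one UPDATE statement per filtered row; adjust the key and column names to your schema
      $query = "UPDATE <myDatabaseTableName> SET Column3 = '$($row.Column3)', Column4 = '$($row.Column4)' WHERE Column2 = '$($row.Column2)'"
      Invoke-Sqlcmd -ServerInstance '<MyDatabase>' -Query $query
    }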

    Anyway, the BCP utility has notoriously tricky syntax. As far as I know, you cannot bulk load data by passing it as a parameter to bcp; a source file must be used. So you need to save the filtered data to a file and pass that file's name to bcp.

    Exporting a filtered CSV is easy enough; just remember to use the -NoTypeInformation switch, or you'll get #TYPE Selected.System.Management.Automation.PSCustomObject as your first row of data. This assumes the bcp arguments themselves are well and good (why -F2, though? And Unix newlines?).

    Stripping the double quotes requires another edit to the file. The Scripting Guy has a solution.

    foreach($file in $filePath){
      write-host $file
      $test = import-csv $file | select $desiredColumn
      # Overwrite filtereddata.csv, should one exist, with filtered data
      $test | export-csv -path .\filtereddata.csv -NoTypeInformation
      # Remove double quotes
      (gc filtereddata.csv) | % {$_ -replace '"', ''} | out-file filtereddata.csv -Fo -En ascii
      $action = bcp <myDatabaseTableName> in filtereddata.csv -T -c -t";" -r"\n" -F2 -S <MyDatabase>
    }
    

    Depending on your locale, the column separator might be a semicolon, a comma or something else. Use Export-Csv's -Delimiter '<character>' switch to pass whatever you need, or change bcp's argument.
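
    For example, to make the exported file match the -t";" argument already passed to bcp, something like this should do (a sketch reusing the file name from above):

    $test | export-csv -path .\filtereddata.csv -NoTypeInformation -Delimiter ';'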

    Erland's got a helpful page about bulk operations. Also, see Redgate's advice.