I have a load of data to import into an Azure SQL database. The table I'm copying to has a timestamp field of type smalldatetime followed by another 195 float fields. I'm trying to BCP a CSV file into the table. The CSV file has exactly the same layout, and the timestamps are identical in format, but I keep getting an invalid date format error.
Example of TimeStamp from database:
2017-11-05 12:00:00
Example of TimeStamp from CSV file:
2017-11-27 00:16:37
I have tried using bcp with -n on its own, with a format file generated with -n, and with a format file generated with -x. All give the invalid date format error.
bcp dbo.Meters format nul -n -S <URL> -d <databasename> -U <username> -P <password> -f format.fmt
bcp dbo.Meters IN "output.csv" -S <URL> -d <databasename> -U <username> -P <password> -f format.fmt -t ,
I have also tried using the -c option and this gives a different error:
Fractional second precision exceeds the scale specified in the parameter binding.
I couldn't find much on Google relating to this, but it seems to mean that the timestamps in the CSV file contain fractional seconds that exceed the precision of the smalldatetime column in the database table. None of the timestamps in the CSV file have any fractional seconds, though.
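For what it's worth, the string itself converts cleanly to smalldatetime, which can be checked by running the CSV example above through TRY_CONVERT in SSMS (smalldatetime simply rounds the seconds to the nearest minute):
SELECT TRY_CONVERT(smalldatetime, '2017-11-27 00:16:37') AS Parsed;
-- Returns 2017-11-27 00:17:00 rather than NULL, so the string format itself is valid.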
Does anyone know why it thinks the dates are in different formats? Thanks in advance!
A sample of the CSV file can be downloaded here.
The CREATE TABLE statement (with different field names) can be downloaded here.
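For context, the table has roughly this shape; the column names below are made up, but the real table is a smalldatetime column followed by 195 float columns:
CREATE TABLE dbo.Meters (
    [TimeStamp] smalldatetime,  -- first field in the CSV
    Meter001    float,
    Meter002    float,
    Meter003    float
    -- ... and so on up to Meter195
);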
Solution thanks to Nick.McDermaid:
I created a staging table in the database with the same column names as the destination table, but with every column typed as varchar (a sketch follows below).
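A minimal sketch of that staging table, reusing the hypothetical column names from above; everything is varchar so bcp -c can load it without any date or numeric binding:
CREATE TABLE dbo.Staging (
    [TimeStamp] varchar(50) NULL,
    Meter001    varchar(50) NULL,
    Meter002    varchar(50) NULL,
    Meter003    varchar(50) NULL
    -- ... one varchar column per CSV field, in the same order as dbo.Meters
);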
bcp then worked fine using the character option (-c) to load into the staging table:
bcp dbo.Staging IN "output.csv" -S <URL> -d <databasename> -U <username> -P <password> -c -t ,
After that, all that's needed is the following SQL to copy the data into the actual table and clear out the staging table.
INSERT INTO dbo.Meters
SELECT * FROM dbo.Staging
GO
DELETE FROM dbo.Staging
GO
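If any value ever fails that implicit conversion, spelling out the columns with TRY_CONVERT (again using the hypothetical column names) converts everything that parses and returns NULL for anything that doesn't, which makes the offending rows easy to find:
INSERT INTO dbo.Meters ([TimeStamp], Meter001, Meter002, Meter003 /* ... */)
SELECT
    TRY_CONVERT(smalldatetime, [TimeStamp]),
    TRY_CONVERT(float, Meter001),
    TRY_CONVERT(float, Meter002),
    TRY_CONVERT(float, Meter003)
    -- ... repeat for the remaining columns
FROM dbo.Staging;
GO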