We have a large CSV file with a couple of columns. One of the columns often contains large values spanning more than 4,000 characters, and the data gets truncated when importing with a data flow.
What approaches would allow the data to be imported successfully without truncation?
You can use a regular Copy activity to load the data from the CSV file into SQL. We reproduced the scenario with a file column whose values were longer than 5,000 characters and a destination column of NVARCHAR(MAX), and the copy succeeded without any data being truncated. A sketch of such a destination table is shown below.
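For reference, here is a minimal sketch of what the destination table could look like; the table and column names are placeholders for illustration, not taken from the original pipeline:

```sql
-- Illustrative destination table (names are assumptions, not from the original scenario).
-- The key point is declaring the wide column as NVARCHAR(MAX) instead of NVARCHAR(4000).
CREATE TABLE dbo.StagingWideText
(
    Id       INT IDENTITY(1,1) PRIMARY KEY,
    ShortCol NVARCHAR(200) NULL,
    LargeCol NVARCHAR(MAX) NULL  -- holds values well beyond 4,000 characters
);
```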
And @LeonYue, NVARCHAR(MAX) can store well beyond 4,000 characters, which is a whole separate discussion, as @Dhruv mentioned.
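As a quick illustrative check (not part of the original pipeline), you can confirm that an NVARCHAR(MAX) variable holds a string longer than 4,000 characters:

```sql
-- Casting the seed to NVARCHAR(MAX) keeps REPLICATE from capping the result at 4,000 characters.
DECLARE @big NVARCHAR(MAX) = REPLICATE(CAST(N'x' AS NVARCHAR(MAX)), 5000);
SELECT LEN(@big) AS CharacterCount;  -- returns 5000
```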