Planning to import thousands of text files into SQL Server tables. The files all have different structures, and each one should go into its own new table in SQL Server. The methods that come to mind are Biml, creating SSIS packages with a number of data flows, or the Import Wizard.
What is the SSIS design pattern to achieve this in the most time-efficient way? This is a one-time load, though.
How do I handle failure? I am not considering checkpoints, because when Control Flow tasks run in parallel, checkpoints act a little erratically.
This might give you a starting point if you want to see what data each file contains, without opening every one of them. The DefaultDir path must exist on the SQL Server instance you are running the query from.
SELECT *
FROM OPENROWSET('MSDASQL',
    'Driver={Microsoft Access Text Driver (*.txt, *.csv)};DefaultDir=C:\PathtoFiles',
    'select * from FileName.csv');
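If that works in your environment, the same pattern can be taken one step further for a one-time load: SELECT ... INTO creates a new table shaped like the file, which matches the "each file goes to its own new table" requirement. This is only a minimal sketch, assuming the Microsoft Access Text Driver is installed on the server and 'Ad Hoc Distributed Queries' is enabled; dbo.FileName_Staging is a placeholder name, not something from your setup.

-- One-time prerequisite for ad hoc OPENROWSET calls (requires appropriate permissions)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

-- Land one file into its own new table; SELECT ... INTO creates the table
-- with columns inferred from the file by the text driver.
SELECT *
INTO dbo.FileName_Staging   -- hypothetical destination table, one per file
FROM OPENROWSET('MSDASQL',
    'Driver={Microsoft Access Text Driver (*.txt, *.csv)};DefaultDir=C:\PathtoFiles',
    'select * from FileName.csv') AS src;

You would still need something (SSIS, Biml-generated packages, or a script that emits one such statement per file) to loop over the file names, and the inferred column types are only as good as what the text driver guesses, but for a one-time load it keeps the per-file failure isolated to a single statement.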