c#-4.0 | console-application | batch-processing | flat-file

Best way to handle Flat File Import to SQL Server using C#.Net


  • I wrote a console application that reads a list of flat files,
  • parses the data types on a row-by-row basis,
  • and inserts the records one after another into the respective tables.

There are a few flat files that contain about 63k records (rows). For such a file of 63k records, my program takes about 6 hours to complete.

This is a test data file. In production I have to deal with 100 times more load.

I am wondering if I can do this any better to speed it up.

Can anyone suggest a better way to handle this job?

The workflow is as follows:

  1. Read the flat file from the local machine using File.ReadAllLines("location").
  2. Create a record entity object after parsing each field of the row.
  3. Insert the current record into its table, one row at a time (see the sketch below).
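Roughly, the per-row workflow described above looks like the following sketch. The connection string, file path, delimiter, and table and column names are assumptions for illustration; the key point is the one INSERT round trip per record.

```csharp
// Minimal sketch of the per-row import described above.
// Connection string, file path, delimiter, table and column names are assumptions.
using System.Data.SqlClient;
using System.IO;

class PerRowImport
{
    static void Main()
    {
        string[] lines = File.ReadAllLines(@"C:\data\records.txt");

        using (var conn = new SqlConnection("Server=.;Database=ImportDb;Integrated Security=true"))
        {
            conn.Open();
            foreach (string line in lines)
            {
                string[] fields = line.Split('|');   // assumed delimiter

                // One round trip to the server per record: this is the main bottleneck.
                using (var cmd = new SqlCommand(
                    "INSERT INTO dbo.Records (Id, Name, Amount) VALUES (@id, @name, @amount)", conn))
                {
                    cmd.Parameters.AddWithValue("@id", int.Parse(fields[0]));
                    cmd.Parameters.AddWithValue("@name", fields[1]);
                    cmd.Parameters.AddWithValue("@amount", decimal.Parse(fields[2]));
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
```

With 63k rows this means 63k separate commands and network round trips, which is consistent with the long runtime described above.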

The purpose of making this a console application is that it should run as a scheduled job on a weekly basis. There is also conditional logic in it: based on some variable, the job will do one of the following (see the sketch after this list):

  • a full table replace, or
  • an update of an existing table, or
  • a deletion of records in a table.
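A rough sketch of that branching, assuming the mode is passed in as a command-line argument; the mode names and the actions taken are hypothetical placeholders, not part of the original program.

```csharp
// Hypothetical sketch of the weekly job's conditional branching.
// Mode values and the actions performed are assumptions for illustration.
using System;

class WeeklyJob
{
    static void Main(string[] args)
    {
        string mode = args.Length > 0 ? args[0] : "replace";

        switch (mode)
        {
            case "replace":
                Console.WriteLine("Full table replace");     // truncate and reload the table
                break;
            case "update":
                Console.WriteLine("Update existing table");  // upsert changed rows
                break;
            case "delete":
                Console.WriteLine("Delete records");         // remove listed rows
                break;
            default:
                throw new ArgumentException("Unknown mode: " + mode);
        }
    }
}
```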

Solution

  • You can try using a 'bulk insert' operation for inserting large amounts of data into the database instead of issuing one INSERT per row.
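One way to do this from C# is ADO.NET's SqlBulkCopy, which streams a whole batch of rows to SQL Server in a single bulk operation. A minimal sketch, assuming the rows are parsed into a DataTable whose columns match the destination table; the connection string, file path, delimiter, and table and column names are assumptions.

```csharp
// Minimal SqlBulkCopy sketch; connection string, table and column names are assumptions.
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkImport
{
    static void Main()
    {
        // Build an in-memory DataTable matching the destination schema.
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Amount", typeof(decimal));

        foreach (string line in File.ReadLines(@"C:\data\records.txt"))
        {
            string[] f = line.Split('|');   // assumed delimiter
            table.Rows.Add(int.Parse(f[0]), f[1], decimal.Parse(f[2]));
        }

        using (var conn = new SqlConnection("Server=.;Database=ImportDb;Integrated Security=true"))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.Records";
                bulk.BatchSize = 5000;         // send rows in batches
                bulk.WriteToServer(table);     // one bulk operation instead of 63k INSERTs
            }
        }
    }
}
```

For very large files you can also feed SqlBulkCopy from an IDataReader (or load in chunks) to avoid holding the whole file in memory, but even the DataTable approach above typically cuts an import like this from hours to minutes.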