Tags: python, python-2.7, pyodbc, dbf

Inserting All Data from Multiple DBF Files into Multiple Tables


I am working on pushing data from DBF files on a UNC path into a SQL Server database. There are about 50 DBF files, all with different schemas. I know I could write a program that lists all 50 tables and all 50 DBF files, but that would take forever. Is there a way to derive the field names from each DBF to do the insert, rather than going through every DBF and typing out every field name? Here's the code I have right now; it inserts records from two fields in one DBF file.

import pyodbc
from dbfread import DBF

# SQL Server Connection Test
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=**********;DATABASE=TEST_DBFIMPORT;UID=test;PWD=test')
cursor = cnxn.cursor()
folder = 'E:\\Backups\\'  # backup folder (not used below yet)
table = DBF('E:\\Backups\\test.dbf', lowernames=True)
for record in table:
    rec1 = record['field1']
    rec2 = record['field2']
    cursor.execute("insert into tblTest (column1,column2) values(?,?)", rec1, rec2)
cnxn.commit()  # commit once after the loop; committing per row is much slower

Solution

  • Some helpful hints using my dbf package:

    import dbf
    import os
    
    for filename in os.listdir('e:/backups'):
        with dbf.Table('e:/backups/'+filename) as table:
            fields = dbf.field_names(table)
            for record in table:
                values = list(record)
                # insert fields, values using odbc
    
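    The same information is available from dbfread, which the question already uses: DBF.field_names holds the column names read from the file header, and each record behaves like an ordered dict. A minimal equivalent sketch:

    from dbfread import DBF

    table = DBF('e:/backups/test.dbf', lowernames=True)
    fields = table.field_names          # column names derived from the file header
    for record in table:
        values = list(record.values())  # values in field order
        # insert fields, values using odbc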

    If you want to transfer all fields, then you'll need to calculate the table name, the field names, and the values; some examples:

    sql_table = os.path.splitext(filename)[0]
    
place_holders = ','.join(['?'] * len(fields))  # count the fields before joining them into one string
fields = ','.join(fields)
    values = tuple(record)
    
    sql = "insert into %s (%s) values(%s)" % (sql_table, fields, place_holders)
    
cursor.execute(sql, *values)
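
    Putting those pieces together, one complete loop might look like the sketch below; it assumes each SQL Server table already exists, is named after its .dbf file, and has columns matching the DBF field names:

    import os
    import dbf
    import pyodbc

    cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=**********;'
                          'DATABASE=TEST_DBFIMPORT;UID=test;PWD=test')
    cursor = cnxn.cursor()

    folder = 'e:/backups'
    for filename in os.listdir(folder):
        if not filename.lower().endswith('.dbf'):
            continue                                   # skip non-DBF files
        with dbf.Table(os.path.join(folder, filename)) as table:
            sql_table = os.path.splitext(filename)[0]  # SQL table assumed to match the file name
            field_names = dbf.field_names(table)
            columns = ','.join(field_names)
            place_holders = ','.join(['?'] * len(field_names))
            sql = "insert into %s (%s) values (%s)" % (sql_table, columns, place_holders)
            for record in table:
                cursor.execute(sql, list(record))
        cnxn.commit()                                  # commit each file as a unit

    Committing once per file keeps each DBF an all-or-nothing import: if a table name or column set doesn't line up, the error surfaces before anything from that file is committed.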