I am trying to export data from a Python pandas DataFrame to an existing MS Access table, replacing the table's contents with data that has been updated in Python. I have tried to use pandas.to_sql, but I get an error message. I find this strange, since pandas.read_sql works seamlessly.
Here is my code:
import pyodbc
import pandas as pd
conn_str = (
r'DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};'
r'DBQ=H:\Work\IndexTrader\Data\database\IndexData.accdb;'
)
cnxn = pyodbc.connect(conn_str)
SQL = 'SELECT * FROM Index_data;'
Reading data from MS Access works fine:
dfins = pd.read_sql(SQL, cnxn)
However, when I try to write back and replace the table in MS Access, it doesn't work:
dfins.to_sql('Index_data', cnxn, if_exists='replace')
cnxn.close()
The error I get is:
DatabaseError: Execution failed on sql 'SELECT name FROM sqlite_master WHERE type='table' AND name=?;': ('42S02', "[42S02] [Microsoft][ODBC Microsoft Access Driver] The Microsoft Access database engine cannot find the input table or query 'sqlite_master'. Make sure it exists and that its name is spelled correctly. (-1305) (SQLExecDirectW)")
If there is an alternative to pandas.to_sql, that would help as well; I just need to know how to export my data.
As said in the comments, to_sql only supports SQLite when it is handed a raw DBAPI connection such as a pyodbc connection, which is why pandas tried to query the sqlite_master table. For any other database it needs a SQLAlchemy connectable, and no Access dialect ships with SQLAlchemy itself.
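If installing an extra package is an option, the third-party sqlalchemy-access dialect can give to_sql a SQLAlchemy engine that talks to Access through pyodbc. This is a sketch, not something from the original question; the driver name and file path are the asker's, and you should verify the package works in your environment:

```python
import urllib.parse
# import sqlalchemy  # plus: pip install sqlalchemy-access

# Same ODBC connection string as in the question; path and driver are assumptions.
conn_str = (
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=H:\Work\IndexTrader\Data\database\IndexData.accdb;"
)
# sqlalchemy-access accepts a percent-encoded ODBC string in the URL.
url = "access+pyodbc:///?odbc_connect=" + urllib.parse.quote_plus(conn_str)

# With the engine, to_sql can then replace the table:
# engine = sqlalchemy.create_engine(url)
# dfins.to_sql("Index_data", engine, if_exists="replace", index=False)
```
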
An example approach is to iterate over the DataFrame row by row and insert each row into a table:
import pyodbc
import pandas as pd

conn_str = (
    r'DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};'
    r'DBQ=C:\Users\Erik\Desktop\TestDb.accdb;'
)
cnxn = pyodbc.connect(conn_str)
SQL = 'SELECT * FROM Index_data;'
dfins = pd.read_sql(SQL, cnxn)

# pyodbc does not autocommit by default, so commit after the inserts
# to make them permanent.
with cnxn.cursor() as crsr:
    for index, row in dfins.iterrows():
        crsr.execute("INSERT INTO Output(Field1, Field2) VALUES (?, ?)",
                     row["F1"], row["F2"])
cnxn.commit()
This inserts columns F1 and F2 of the DataFrame into fields Field1 and Field2 of a table named Output.
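Writing the INSERT statement by hand gets tedious as the column list grows. A small helper (hypothetical, not part of the answer above) can build the parameterized statement from the column names, and pyodbc's executemany can then insert all rows in one call:

```python
def build_insert_sql(table, columns):
    """Build a parameterized Access INSERT for the given table and columns."""
    col_list = ", ".join(f"[{c}]" for c in columns)  # bracket-quote names for Access
    placeholders = ", ".join("?" for _ in columns)   # one ? per column
    return f"INSERT INTO [{table}] ({col_list}) VALUES ({placeholders})"

# The same column-to-field mapping as the row-by-row example:
sql = build_insert_sql("Output", ["Field1", "Field2"])
# sql == "INSERT INTO [Output] ([Field1], [Field2]) VALUES (?, ?)"

# With a live connection (assuming dfins and cnxn from above):
# with cnxn.cursor() as crsr:
#     crsr.executemany(sql, list(dfins[["F1", "F2"]].itertuples(index=False, name=None)))
# cnxn.commit()
```
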
This has two main conditions to work properly: the Output table must already exist in the Access database, and its fields must have data types compatible with the DataFrame's values. You can use a pre-existing table, like the index table you're getting the data from, though I don't recommend it (risk of losing data). If you do so, you need to clear the table first. To clear out the index table:
with cnxn.cursor() as crsr:
    crsr.execute("DELETE * FROM Index_data;")
cnxn.commit()
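Because pyodbc does not autocommit, it's worth wrapping the clear-and-refill in a single transaction so a failure leaves the original rows intact. Here is that pattern, sketched with sqlite3 standing in for pyodbc (both follow the same DBAPI interface; against Access the delete would be "DELETE * FROM Index_data;"):

```python
import sqlite3

# In-memory stand-in for the Access database, with some "old" data.
cnxn = sqlite3.connect(":memory:")
cnxn.execute("CREATE TABLE Index_data (F1 INTEGER, F2 TEXT)")
cnxn.execute("INSERT INTO Index_data VALUES (1, 'old')")
cnxn.commit()

try:
    crsr = cnxn.cursor()
    crsr.execute("DELETE FROM Index_data;")            # clear the old rows
    crsr.executemany("INSERT INTO Index_data VALUES (?, ?)",
                     [(2, "new"), (3, "newer")])       # refill with updated data
    cnxn.commit()    # make the replacement permanent
except Exception:
    cnxn.rollback()  # on failure, the original rows survive
    raise

rows = cnxn.execute("SELECT F1, F2 FROM Index_data ORDER BY F1").fetchall()
# rows == [(2, 'new'), (3, 'newer')]
```
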