Tags: python, database, sqlalchemy, pyodbc

How to stop sqlalchemy or pyodbc casting bigint to numeric when reflecting tables?


I am trying to reflect tables from a SQL Server database. One of the tables has an id column of type bigint whose values start at the type's lower limit, -9223372036854775808. During reflection I get a

DataError: (pyodbc.DataError) ('22003', '[22003] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Arithmetic overflow error converting bigint to data type numeric. (8115) (SQLExecDirectW)')

error, which suggests that something is converting the bigint to numeric along the way. Is there a way to prevent this, or to specify the precision to use when converting to numeric?

I tried different ways:

  1. auto-reflecting all tables:

from sqlalchemy import create_engine
from sqlalchemy.ext.automap import automap_base

engine = create_engine(connection_url)  # mssql+pyodbc connection URL
Base = automap_base()
Base.prepare(autoload_with=engine)

  2. reflecting a single table:

from sqlalchemy import MetaData, Table

metadata = MetaData()

# reflect the table (reusing the engine from above)
table = Table('OEData', metadata, autoload_with=engine)

In both cases I get the same error shown above.
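For what it's worth, the same reflection pattern works against other backends. A minimal sketch against an in-memory SQLite database (standing in for SQL Server purely so the snippet is runnable; the table and column names are made up) reflects a bigint column without error, which points at the mssql dialect rather than the reflection API itself:

```python
from sqlalchemy import create_engine, MetaData, Table, text

# In-memory SQLite stands in for SQL Server here, purely so the
# sketch is runnable anywhere; 'OEData' and 'id' are assumed names.
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE OEData (id BIGINT PRIMARY KEY)"))

metadata = MetaData()

# Reflect the table exactly as in the failing code above
table = Table("OEData", metadata, autoload_with=engine)
print(table.c.id.type)
```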

I am using:

  • SQLAlchemy 2.0.21

  • MS SQL Server 2014

Thanks.


Solution

  • You have encountered a bug in SQLAlchemy's reflection code for the mssql dialect. It has been reported here:

    https://github.com/sqlalchemy/sqlalchemy/issues/10504
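Until the bug is fixed in a release you are running, one way to sidestep the failing reflection query entirely is to declare the table by hand with an explicit BigInteger column instead of autoloading it. A minimal sketch, assuming the column is named `id` (an in-memory SQLite database replaces the real mssql+pyodbc URL so the example runs anywhere):

```python
from sqlalchemy import (create_engine, insert, select,
                        BigInteger, Column, MetaData, Table)

# SQLite stands in for the real mssql+pyodbc connection URL
# purely to keep this sketch runnable.
engine = create_engine("sqlite://")
metadata = MetaData()

# Declaring the table explicitly avoids the reflection code path
# that triggers the bigint-to-numeric overflow.
oedata = Table(
    "OEData", metadata,
    Column("id", BigInteger, primary_key=True, autoincrement=False),
)
metadata.create_all(engine)

# Round-trip the problematic lower-limit value to show it survives
with engine.begin() as conn:
    conn.execute(insert(oedata).values(id=-9223372036854775808))
    value = conn.execute(select(oedata.c.id)).scalar_one()

print(value)  # -9223372036854775808
```

The trade-off is that you lose automatic discovery of the other columns, so this is most practical when only a handful of tables are affected.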