Software versions: alembic 1.0.5, SQLAlchemy 1.2.14, MySQL 5.7, Python 3.6.7
I am trying to use Alembic to keep a MySQL database schema in step with its Python ORM representation.
The issue I am seeing is that the autogenerated migrations always contain redundant drop-and-create commands for foreign keys: autogenerate sees the database FKs and the model FKs as different, when they are actually the same.
Repeated invocations of:

    alembic revision --autogenerate
    alembic upgrade head

produce the same drop and create commands every time.
The logging to stdout shows something like:

    INFO  [alembic.autogenerate.compare] Detected removed foreign key (t1_id)(id) on table table_two
    INFO  [alembic.autogenerate.compare] Detected added foreign key (t1_id)(id) on table test_fktdb.table_two
and the migration script has:
    def upgrade():
        # ### commands auto generated by Alembic - please adjust! ###
        op.drop_constraint('fk_table1', 'table_two', type_='foreignkey')
        op.create_foreign_key('fk_table1', 'table_two', 'table_one', ['t1_id'], ['id'],
                              source_schema='test_fktdb', referent_schema='test_fktdb')
        # ### end Alembic commands ###


    def downgrade():
        # ### commands auto generated by Alembic - please adjust! ###
        op.drop_constraint('fk_table1', 'table_two', schema='test_fktdb', type_='foreignkey')
        op.create_foreign_key('fk_table1', 'table_two', 'table_one', ['t1_id'], ['id'])
        # ### end Alembic commands ###
The issue is reproducible, and I've made a minimal example (a tar.gz at https://github.com/sqlalchemy/alembic/files/2625781/FK_test.tar.gz). The ORM in the example goes something like:
    # [... imports and bobs ...]
    # (UNSIGNED_INTEGER is defined in the example, e.g. sqlalchemy.dialects.mysql.INTEGER(unsigned=True))

    class TableOne(Base):
        """Class representing a table with an id."""

        __tablename__ = "table_one"

        id = Column(UNSIGNED_INTEGER, nullable=False, autoincrement=True, primary_key=True)

        __table_args__ = (
            dict(mysql_engine='InnoDB'),
        )


    class TableTwo(Base):
        """A table representing records with a foreign key link to table one."""

        __tablename__ = "table_two"

        id = Column(UNSIGNED_INTEGER, nullable=False, autoincrement=True, primary_key=True)
        t1_id = Column(UNSIGNED_INTEGER, nullable=False)

        __table_args__ = (
            ForeignKeyConstraint(["t1_id"], ["test_fktdb.table_one.id"], name="fk_table1"),
            dict(mysql_engine='InnoDB'),
        )
Is there anything that can be done to make Alembic 'see' the FKs in the database as being the same as those in the ORM? Some configuration applied via env.py, for example?
I've had a look around for this problem and found some old issues in the Alembic GitHub repo (see [1],[2],[3]). The issues that do have solutions seem to deal with Postgres databases and the `public` schema. I'm not sure that applies here, as I am using MySQL; the related documentation for Postgres schemas is here: https://docs.sqlalchemy.org/en/latest/dialects/postgresql.html#remote-schema-table-introspection-and-postgresql-search-path
I've now added my own issue to the alembic GitHub repo: https://github.com/sqlalchemy/alembic/issues/519
Closed issues in the Alembic issue tracker which show similar symptoms, but whose solutions don't apply (as far as I can see):
[1] https://github.com/sqlalchemy/alembic/issues/444
So, although this SO question is old and got me the Tumbleweed badge, I think it would be nice to answer it and close it off. I got a good answer from the package maintainer, Mike Bayer, on GitHub:
OK, so here is the thing. you are connecting with "test_fktdb" in your database URL as the default schema. which means, alembic is going to find your tables in that schema, and when it finds the foreign key, it will see the "schema_name" field in that FK as empty, because this is the default schema. So it doesn't match what you have in your metadata. Also you aren't adding "include_schemas=True" to the environment, so you will definitely not get reasonable results when your ORM models have "schema='test_fktdb'" in them.
there's two general worlds you can go into to fix this.
easy one. take out "schema" from your tables/metadata/foreign keys entirely. then everything works in test_fktdb as the default and everything matches.
hard one. you need to connect to a different database on your URL, then set up include_schemas=True in your environment, you probably also need a reasonable include_object() scheme so that it doesn't read in all the other databases, set up version_table_schema='test_fktdb', then that works too:
env.py:

    SCHEMA_NAME = "NOT_test_fktdb"

    def include_object(object, name, type_, reflected, compare_to):
        if type_ == "table":
            return object.schema == "test_fktdb"
        else:
            return True

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            compare_type=True,
            compare_server_default=True,
            include_schemas=True,
            version_table_schema="test_fktdb",
            include_object=include_object,
        )
        # ...
the "schema" logic necessarily has to rely heavily on this concept of "default" schema being a blank string, so when you mix up the default schema also being present it confuses things.
There is more over on GitHub https://github.com/sqlalchemy/alembic/issues/519.
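For completeness, the hard option also implies changing the connection URL so the default database is *not* the schema being managed; that way every reflected object in test_fktdb carries an explicit schema name. A hedged sketch of the relevant alembic.ini line (driver, credentials, and the NOT_test_fktdb database name are placeholders, not from the original setup):

```ini
# alembic.ini -- connect with a default database that is NOT test_fktdb,
# so objects reflected from test_fktdb are reported with their schema name
sqlalchemy.url = mysql+pymysql://user:password@localhost/NOT_test_fktdb
```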
I found that the easy option worked, and I made the following changes:

    # instead of [...]:
    # declarative_base(metadata=sqlalchemy.MetaData(schema=test_fktdb.SCHEMA_NAME))
    Base = sqlalchemy.ext.declarative.declarative_base()

    # instead of [...]:
    # ForeignKeyConstraint(["t1_id"], ["test_fktdb.table_one.id"], name="fk_table1"),
    ForeignKeyConstraint(["t1_id"], ["table_one.id"], name="fk_table1"),
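You can sanity-check the easy option without a live database by compiling the table DDL and confirming the FK now targets table_one with no schema qualifier, which is exactly how MySQL reports the FK back when test_fktdb is the default schema. A minimal standalone sketch (Core tables with plain Integer columns stand in for the declarative models above):

```python
from sqlalchemy import Column, ForeignKeyConstraint, Integer, MetaData, Table
from sqlalchemy.schema import CreateTable

# No MetaData(schema=...) and no "test_fktdb." prefix on the FK target
metadata = MetaData()

table_one = Table(
    "table_one", metadata,
    Column("id", Integer, primary_key=True),
)

table_two = Table(
    "table_two", metadata,
    Column("id", Integer, primary_key=True),
    Column("t1_id", Integer, nullable=False),
    ForeignKeyConstraint(["t1_id"], ["table_one.id"], name="fk_table1"),
)

# Compile the CREATE TABLE statement; the FK clause should reference
# table_one without any schema qualification, matching what reflection sees.
ddl = str(CreateTable(table_two))
print(ddl)
```

With the schema stripped out, the rendered constraint matches the reflected one, so autogenerate stops emitting the spurious drop/create pair.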