Tags: python, sqlalchemy, alembic

Hard to upgrade my staging db through Flask-Migrate


I want to apply the changes made on my local db to the cloud db.

My local db has three tables: User, email_history, and event_monitor. I deleted my local migrations folder and then ran the python manage.py db init and python manage.py db migrate commands.

This creates a revision file like the one below.

"""empty message

Revision ID: 9bd307a576ce
Revises: 
Create Date: 2017-03-01 00:10:32.344698

"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql

# revision identifiers, used by Alembic.
revision = '9bd307a576ce'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('user')
    op.drop_table('email_history')
    op.drop_table('event_monitor')
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('event_monitor',
    sa.Column('id', mysql.INTEGER(display_width=11), nullable=False),
    sa.Column('event_type', mysql.VARCHAR(length=80), nullable=True),
    sa.Column('event_description', mysql.TEXT(), nullable=True),
    sa.PrimaryKeyConstraint('id'),
    mysql_default_charset=u'latin1',
    mysql_engine=u'InnoDB'
    )
    op.create_table('email_history',
    sa.Column('id', mysql.INTEGER(display_width=11), nullable=False),
    sa.Column('user_id', mysql.INTEGER(display_width=11), autoincrement=False, nullable=False),
    sa.Column('email_type', mysql.ENUM(u'SAMPLE'), nullable=True),
    sa.Column('datetime_created', mysql.DATETIME(), nullable=True),
    sa.Column('datetime_sent', mysql.DATETIME(), nullable=True),
    sa.Column('status', mysql.TINYINT(display_width=1), autoincrement=False, nullable=False),
    sa.ForeignKeyConstraint(['user_id'], [u'user.id'], name=u'email_history_ibfk_1', ondelete=u'CASCADE'),
    sa.PrimaryKeyConstraint('id'),
    mysql_default_charset=u'latin1',
    mysql_engine=u'InnoDB'
    )
    op.create_table('user',
    sa.Column('id', mysql.INTEGER(display_width=11), nullable=False),
    sa.Column('username', mysql.VARCHAR(length=80), nullable=True),
    sa.Column('email', mysql.VARCHAR(length=120), nullable=False),
    sa.Column('password_hash', mysql.VARCHAR(length=256), nullable=True),
    sa.PrimaryKeyConstraint('id'),
    mysql_default_charset=u'latin1',
    mysql_engine=u'InnoDB'
    )
    # ### end Alembic commands ###

If I do

export config=prod && python manage.py db upgrade

I get this error:

sqlalchemy.exc.OperationalError: (_mysql_exceptions.OperationalError) (1051, "Unknown table 's2sdevdb.user'") [SQL: u'\nDROP TABLE user']

Yes, I had already deleted my tables on the cloud db.

My question is: why does the migrate command fail to generate code for creating the tables, and how can I achieve this?


Solution

  • Locally, you deleted your migrations, but your data still exists. When you create a migration, Alembic looks at your code and finds some models, looks at the database and finds those tables still present, so it does not generate code to create the tables.

    Presumably, there are no longer models representing the three tables it generated drop commands for.

    To create a migration representing all your models, your database must be empty. Either drop the tables or point at an empty database. Alembic also creates an alembic_version table to track the current revision, which you may need to drop as well; there is a short reset sketch at the end of this answer.


    Remotely, you dropped the tables, then tried to run a migration that drops the tables. It fails for the given reason: the tables don't exist to be dropped.

    Since you messed up and performed the migration manually, use manage.py db stamp head to tell Alembic that your database already represents the current migration.

    Assuming you really did reset the remote database, the command you showed will work fine to run your new migration; a rough sketch of what that regenerated migration should contain is included at the end of this answer.
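
    A minimal sketch of the "drop the tables" option, run once against the local database. The create_app factory and the Flask-SQLAlchemy db object are assumptions about your project layout rather than names from your question; adjust the imports to match your app.

from sqlalchemy import text

from app import create_app, db  # hypothetical names; use your own app factory and db object

app = create_app()
with app.app_context():
    with db.engine.begin() as conn:
        # email_history has a foreign key to user, so drop it first.
        for table in ('email_history', 'event_monitor', 'user', 'alembic_version'):
            conn.execute(text('DROP TABLE IF EXISTS `{}`'.format(table)))

    With the local database empty, python manage.py db migrate will generate a revision whose upgrade() creates the tables instead of dropping them.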
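
    For reference, the regenerated revision should come out as roughly the inverse of the downgrade() in your current one, with generic column types taken from your models rather than the reflected MySQL-specific types. This is a rough sketch, not your actual autogenerated output:

# In the newly generated revision file (docstring and revision ids omitted).
from alembic import op
import sqlalchemy as sa


def upgrade():
    op.create_table(
        'user',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('username', sa.String(length=80), nullable=True),
        sa.Column('email', sa.String(length=120), nullable=False),
        sa.Column('password_hash', sa.String(length=256), nullable=True),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_table(
        'event_monitor',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('event_type', sa.String(length=80), nullable=True),
        sa.Column('event_description', sa.Text(), nullable=True),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_table(
        'email_history',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('user_id', sa.Integer(), nullable=False),
        sa.Column('email_type', sa.Enum('SAMPLE'), nullable=True),
        sa.Column('datetime_created', sa.DateTime(), nullable=True),
        sa.Column('datetime_sent', sa.DateTime(), nullable=True),
        # May be Boolean or SmallInteger depending on how the model defines it.
        sa.Column('status', sa.SmallInteger(), nullable=False),
        sa.ForeignKeyConstraint(['user_id'], ['user.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id'),
    )

    Running export config=prod && python manage.py db upgrade against the reset remote database will then create the three tables and record the revision in alembic_version.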