I'm writing some custom code around Alembic to keep my database always up-to-date in my dev environment for a project. The project's database contains the following schemas:

- public: a schema for shared data
- prototype: the template for all of the client schemas (orgs)

At the moment, I'm not worried about the multiple client schemas, only about keeping the public and prototype schemas up-to-date. My env.py script works great for the public schema, but not for prototype, because Alembic tries to use the version table from public when working with prototype.
So, I thought I could use the version_table_schema option to maintain one version table in the public schema and one in the prototype schema. However, as soon as I start using that option, I get a 'relation "public.alembic_version" does not exist' error when I attempt the upgrade.

The only difference I see is that, when version_table_schema is set to the appropriate schema, the generated revision scripts actually contain a line calling op.drop_table('alembic_version'). That line ONLY appears when version_table_schema is in use.
I'm hoping that I'm just missing something minor.
The following are all of the relevant source files for reference. The only external dependencies should be appcontext and getDeclarativeBase. appcontext is just for configuration and is definitely working (the database connection works properly). getDeclarativeBase is a way to dynamically get a declarative base (and its related metadata) for a schema. Based on the debug output, that seems to be working correctly as well; the metadata objects are associated with the correct schemas when they are constructed.
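Since getDeclarativeBase is our own helper and its source isn't shown here, a minimal sketch of what such a per-schema base factory might look like (the caching dict and import path are assumptions for illustration, not the real code):

```python
from sqlalchemy import MetaData
from sqlalchemy.orm import declarative_base

_bases = {}  # hypothetical cache: one declarative base per schema name


def getDeclarativeBase(schema):
    """Return a declarative base whose MetaData is bound to `schema`."""
    if schema not in _bases:
        # MetaData(schema=...) makes every table declared on this base
        # belong to that schema unless a table overrides it explicitly.
        _bases[schema] = declarative_base(metadata=MetaData(schema=schema))
    return _bases[schema]
```

With a factory like this, getDeclarativeBase("orgproto").metadata.schema would be "orgproto", which matches the debug output described above.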
Wrapper functions for running migrations. autoMigratePublic() and autoMigrateOrgProto() are the problematic methods in this case:
""" A wrapper around Alembic that will assist with database migrations.
Our database system is somewhat complex to upgrade. There is a public
schema which we use for shared things (such as authentication and job
management). Then there is the org prototype.
There are also individual schemas for each client org. These are trickier.
http://stackoverflow.com/a/35211383/703040
Also useful:
https://github.com/zzzeek/alembic/blob/0e3319bb36f1612e41a8d7da5a48ce1ca33a0b2b/alembic/config.py#L476
"""
import logging
import os
import time
import alembic.config
import alembic.util
CONFIG_ORG_PROTO = os.path.join("migrations", "alembic_orgproto.ini")
CONFIG_PUBLIC = os.path.join("migrations", "alembic_public.ini")
log = logging.getLogger("sys.migrations")
def autoMigratePublic():
    try:
        freezePublic()
    except alembic.util.CommandError:
        log.warning("[public] Problem creating migration revision. Skipping as this sometimes just means that we are working with a new DB.")
    upgradePublic()
    return

def autoMigrateOrgProto():
    try:
        freezeOrgProto()
    except alembic.util.CommandError:
        log.warning("[orgproto] Problem creating migration revision. Skipping as this sometimes just means that we are working with a new DB.")
    upgradeOrgProto()
    return

def freezePublic():
    log.info("[public] Checking the database for revisions...")
    alembicArgs = [
        "--config", CONFIG_PUBLIC,
        "--raiseerr",
        "revision",
        "--autogenerate",
        "--message", "autogenerate {0}".format(makeRevisionName()),
    ]
    runAlembic(alembicArgs)
    return

def freezeOrgProto():
    log.info("[orgproto] Checking the database for revisions...")
    alembicArgs = [
        "--config", CONFIG_ORG_PROTO,
        "--raiseerr",
        "revision",
        "--autogenerate",
        "--message", "autogenerate {0}".format(makeRevisionName()),
    ]
    runAlembic(alembicArgs)
    return

def makeRevisionName():
    return time.strftime('%Y-%m-%d %H:%M:%S')

def upgradePublic():
    log.info("[public] Performing database upgrade...")
    alembicArgs = [
        "--config", CONFIG_PUBLIC,
        "--raiseerr",
        "upgrade",
        "head",
    ]
    runAlembic(alembicArgs)
    return

def upgradeOrgProto():
    log.info("[orgproto] Performing database upgrade...")
    alembicArgs = [
        "--config", CONFIG_ORG_PROTO,
        "--raiseerr",
        "upgrade",
        "head",
    ]
    runAlembic(alembicArgs)
    return

def upgradeOrg(schemaName):
    log.info("[%s] Performing database upgrade...", schemaName)
    alembicArgs = [
        "--config", CONFIG_ORG_PROTO,
        "--raiseerr",
        "upgrade",
        "head",
        "-x", "schema={0}".format(schemaName),
    ]
    runAlembic(alembicArgs)
    return

def runAlembic(args):
    return alembic.config.main(args)
Class for performing migrations. This class is referenced in the env.py files:
import copy
import logging
import os
import re
import traceback
from logging.config import fileConfig
from sqlalchemy import create_engine
import core.context.appcontext
from core.database.declarative import getDeclarativeBase
logging.getLogger("alembic").setLevel(logging.DEBUG)
#==============================================================================
class Migrator(object):
    def __init__(self):
        from alembic import context
        self.context = context
        self.config = context.config
        self.log = logging.getLogger("sys.migrations")
        self.sys = core.context.appcontext.instance()
        self.schema = self.config.get_main_option("schema")
        # A "-x schema=<name>" command-line argument overrides the ini setting
        xArgs = self.context.get_x_argument(as_dictionary=True)
        if xArgs.get("schema"):
            self.schema = xArgs["schema"]
        return

    def start(self):
        import core.database.tables  # Make sure the metadata is defined
        if self.context.is_offline_mode():
            self.log.error("[%s] Can't run migrations offline", self.schema)
            return
        self.doMigration()
        return

    def doMigration(self):
        targetMetadata = getDeclarativeBase(self.schema).metadata
        engine = create_engine(self.sys.getConnectionUrl(), echo=False)
        self.log.info("[%s] Engine: %s", self.schema, engine)
        self.log.debug("[%s] Metadata: %s", self.schema, targetMetadata)
        for t in targetMetadata.sorted_tables:
            self.log.debug(" - %s", t)
        conn = engine.connect()
        try:
            self.context.configure(
                conn,
                version_table_schema=self.schema,
                target_metadata=targetMetadata,
                process_revision_directives=self.process_revision_directives,
            )
            with self.context.begin_transaction():
                self.context.run_migrations()
        finally:
            conn.close()
        return

    def process_revision_directives(self, context, revision, directives):
        """ Used to prevent creating empty migrations
        See: http://alembic.readthedocs.org/en/latest/cookbook.html#don-t-generate-empty-migrations-with-autogenerate
        """
        if self.config.cmd_opts.autogenerate:
            script = directives[0]
            if script.upgrade_ops.is_empty():
                self.log.debug("[%s] Auto-generated migration is empty. No migration file will be created.", self.schema)
                directives[:] = []
            else:
                self.log.info("[%s] Creating new auto-generated migration revision.", self.schema)
        return
Sample env.py. Both the public and prototype upgrades have the same contents:
from migrations.migrator import Migrator
Migrator().start()
Sample config ini. The public and prototype migrations use almost exactly the same file:
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = migrations/orgproto
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
# max length of characters to apply to the
# "slug" field
#truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; this defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat alembic/versions
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
sqlalchemy.url = <Not Loaded>
schema = orgproto
Mako file. I'm using the default:
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
I received a response on the sqlalchemy-alembic google group that resolved the issue. Posting it here in case it helps anyone else:
you're likely hitting the very confusing schema rules that apply to Postgresql. See http://docs.sqlalchemy.org/en/rel_1_0/dialects/postgresql.html#remote-schema-table-introspection-and-postgresql-search-path for details. Short answer is that schema of "blank" and schema of "public" are two different things on the Python side, leading to a lot of confusion.
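To illustrate the "blank" vs "public" distinction on the Python side, here is a standalone SQLAlchemy sketch (not part of the project's code): a table declared with no schema and one declared with schema="public" are two different objects as far as SQLAlchemy's metadata comparison (and therefore autogenerate) is concerned, even though Postgres resolves both names to public by default:

```python
from sqlalchemy import MetaData, Table, Column, Integer

# No schema: SQLAlchemy treats this table as schema-less; Postgres will
# resolve the name through search_path (usually to public) at runtime.
blank = Table("alembic_version", MetaData(),
              Column("version_num", Integer))

# Explicit schema: a distinct object on the Python side.
public = Table("alembic_version", MetaData(),
               Column("version_num", Integer), schema="public")

print(blank.key)   # alembic_version
print(public.key)  # public.alembic_version
```

Because the two keys differ, autogenerate can conclude that "public.alembic_version" is a table that exists in the database but not in the target metadata, which is exactly what produces the spurious op.drop_table('alembic_version') directive described in the question.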
In order to convince autogenerate to not affect alembic_version at all no matter where it pops up, you probably need to create an exclusion rule using include_object: http://alembic.readthedocs.org/en/latest/api/runtime.html?highlight=include_object#alembic.runtime.environment.EnvironmentContext.configure.params.include_object
def include_object(object, name, type_, reflected, compare_to):
    if type_ == "table" and name == "alembic_version":
        return False
    else:
        return True
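Wiring that exclusion rule into the Migrator above would mean passing it to context.configure() alongside the existing options. A sketch, repeating the answer's function for completeness (the configure call mirrors doMigration() from the question):

```python
def include_object(object, name, type_, reflected, compare_to):
    # Never let autogenerate consider any alembic_version table,
    # regardless of which schema it turns up in.
    if type_ == "table" and name == "alembic_version":
        return False
    return True

# Inside Migrator.doMigration(), alongside the existing options:
#
# self.context.configure(
#     conn,
#     version_table_schema=self.schema,
#     target_metadata=targetMetadata,
#     process_revision_directives=self.process_revision_directives,
#     include_object=include_object,
# )
```

With the rule in place, autogenerate skips the version table entirely, so the revision scripts no longer contain the op.drop_table('alembic_version') line.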