I have a PostgreSQL production database (which contains a lot of data). Now I need to modify the model of the tg-app to add a couple of new tables to the database.
How do I do this? I am using SQLAlchemy.
This always works and requires little thinking -- only patience.
1. Make a backup.
Actually make a backup. Everyone skips step 1 thinking that they have a backup, but they can never find it or work with it. Don't trust any backup that you can't recover from.
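If you want to script the backup, here is a minimal sketch; the database name and file names are placeholders, and it assumes pg_dump is on the PATH:

```python
# Minimal backup sketch: dump the production database to a dated custom-format file.
# Database name and file name are placeholders; pg_dump must be on the PATH.
import subprocess
from datetime import date

dump_file = f"production_backup_{date.today():%Y%m%d}.dump"
subprocess.run(
    ["pg_dump", "--format=custom", "--file", dump_file, "production"],
    check=True,
)

# Prove you can recover from it: restore into a scratch database and inspect it, e.g.
#   createdb backup_check
#   pg_restore --dbname=backup_check production_backup_YYYYMMDD.dump
```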
2. Create a new database schema.
Define your new structure from the ground up in the new schema. Ideally, you'll run a DDL script that builds the new schema. Don't have a script to build the schema? Create one and put it under version control.
With SA, you can define your tables and it can build your schema for you. This is ideal, since you have your schema under version control in Python.
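For example, a minimal sketch (the table names, columns, and connection URL are made up for illustration):

```python
# Define the full target structure once, in Python, and let SQLAlchemy emit the DDL.
from sqlalchemy import (MetaData, Table, Column, Integer, String,
                        ForeignKey, create_engine)

metadata = MetaData()

# An existing table, carried over unchanged.
widget = Table(
    "widget", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(100), nullable=False),
)

# One of the new tables being added.
widget_tag = Table(
    "widget_tag", metadata,
    Column("id", Integer, primary_key=True),
    Column("widget_id", Integer, ForeignKey("widget.id"), nullable=False),
    Column("tag", String(50), nullable=False),
)

# Point an engine at the *new* database and build the whole schema from the definitions.
new_engine = create_engine("postgresql://user:password@localhost/myapp_new")
metadata.create_all(new_engine)
```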
3. Move data.
a. For tables whose structure did not change, move data from the old schema to the new schema using simple INSERT/SELECT statements (see the sketch after this list).
b. For tables which did change structure, develop INSERT/SELECT scripts to move the data from old to new. Often, this can be a single SQL statement per new table. In some cases, it has to be a Python loop with two open connections.
c. For new tables, load the data.
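A rough sketch of the data move, assuming the old and new structures live as two schemas ("old_app" and "new_app") in the same PostgreSQL database; the table and column names are invented for illustration:

```python
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:password@localhost/production")

with engine.begin() as conn:
    # (a) Structure unchanged: one INSERT/SELECT per table.
    conn.execute(text(
        "INSERT INTO new_app.widget (id, name) "
        "SELECT id, name FROM old_app.widget"
    ))

    # (b) Structure changed, but still expressible as a single statement:
    # reshape the data inside the SELECT.
    conn.execute(text(
        "INSERT INTO new_app.customer (id, full_name) "
        "SELECT id, first_name || ' ' || last_name FROM old_app.customer"
    ))

# (b) When one statement can't express the transformation: a Python loop with two
# open connections, reading old rows and writing transformed rows. Here the old
# table stored tags as a CSV column; the new schema normalizes them into widget_tag.
with engine.connect() as src, engine.begin() as dst:
    for row in src.execute(text("SELECT id, tags_csv FROM old_app.widget")):
        for tag in (row.tags_csv or "").split(","):
            dst.execute(
                text("INSERT INTO new_app.widget_tag (widget_id, tag) "
                     "VALUES (:wid, :tag)"),
                {"wid": row.id, "tag": tag.strip()},
            )
```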
4. Stop using the old schema. Start using the new schema. Find every program that used the old schema and fix the configuration.
Don't have a list of applications? Make one. Seriously -- it's important.
Applications have hard-coded DB configurations? Fix that, too, while you're at it. Either create a common config file, or use some common environment variable or something to (a) assure consistency and (b) centralize the notion of "production".
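One way to do that with SQLAlchemy, assuming every application agrees to read a single DATABASE_URL environment variable (the variable name is just a convention, not anything SA requires):

```python
import os
from sqlalchemy import create_engine

def get_engine():
    """Every program gets its engine from here, so "production" is defined in one place."""
    # e.g. DATABASE_URL=postgresql://user:password@db-host/myapp_new
    return create_engine(os.environ["DATABASE_URL"])
```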
You can do this kind of procedure any time you do major surgery. It never touches the old database except to extract the data.