I need to convert my key/value store from gdbm to PostgreSQL.
It appears that I will have to replace
import dbm.gnu
import json

def get_value(db, key):
    return json.loads(db[key])

db = dbm.gnu.open(...)
v = get_value(db, "foo")
with
import sqlalchemy
from sqlalchemy import Column, Text
from sqlalchemy.dialects.postgresql import JSONB

engine = sqlalchemy.create_engine("...")
meta = sqlalchemy.MetaData(engine)

# one table: a text primary key and a JSONB payload
id_col = Column('id', Text, primary_key=True)
data_col = Column('data', JSONB)
my_table = sqlalchemy.Table("my_table", meta, id_col, data_col)
meta.create_all()

# populate the table with 40M "id"-->JSON records
engine.execute(
    my_table.insert(),
    id="foo",
    data={"a": 3, "b": 17, "c": [2, 6, 0]})

def get_value(db, key):
    res = engine.execute(db.select().where(db.c.id == key)).fetchall()
    assert len(res) == 1
    return res[0][1]  # the JSONB column comes back as a Python dict

v = get_value(my_table, "foo")
This looks somewhat scary (especially if I add echo and see all the SQL generated for these simple key-value operations).
Is there a better way?
PS. I could also use psycopg directly instead of sqlalchemy, but that would make me write SQL myself ;-(
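Something like this, I suppose (a rough sketch only, assuming psycopg2 and the same my_table layout as above; the DSN is elided):

import psycopg2

conn = psycopg2.connect("...")

def get_value(conn, key):
    with conn.cursor() as cur:
        cur.execute("SELECT data FROM my_table WHERE id = %s", (key,))
        row = cur.fetchone()
        assert row is not None
        return row[0]  # psycopg2 decodes jsonb to a Python dict by default

v = get_value(conn, "foo")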
You must not use PostgreSQL as a key-value store.
A key-value store is (except in rare cases) structured around a schema based on key composition, which lays the data out in a multidimensional space that may or may not map directly onto the concept of SQL tables. In other words, a database abstraction lives inside the key-value store itself.
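For example, the "schema" of a gdbm store often lives in how the keys are composed. This is only a hypothetical illustration (the user:<id>:orders:<n> layout and the file name are invented), but it shows the abstraction that hides inside the keys:

import dbm.gnu
import json

db = dbm.gnu.open("example.gdbm", "c")  # hypothetical file, opened read/write

# Hypothetical key layout: the "schema" is encoded in the key strings.
#   user:<id>            -> user profile
#   user:<id>:orders:<n> -> the n-th order of that user
db["user:42"] = json.dumps({"name": "Alice"})
db["user:42:orders:0"] = json.dumps({"total": 17.5})

def get_orders(db, user_id):
    # A prefix scan over composed keys stands in for what SQL would do
    # with a separate table, a foreign key and an index.
    prefix = b"user:%d:orders:" % user_id
    orders = []
    k = db.firstkey()
    while k is not None:
        if k.startswith(prefix):
            orders.append(json.loads(db[k]))
        k = db.nextkey(k)
    return orders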
There is not enough information here to say "simply replace the key-value store with a two-column table". If you do that, you will most likely end up with the worst of both worlds.
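If your keys do encode structure like that, the usual translation is separate tables and columns rather than a single id/data pair. A rough sketch of such a mapping, using the invented user/orders layout from above (all table and column names are hypothetical):

import sqlalchemy
from sqlalchemy import Column, ForeignKey, Integer, Numeric, Table, Text
from sqlalchemy.dialects.postgresql import JSONB

meta = sqlalchemy.MetaData()

# "user:<id>" becomes a row in a users table ...
users = Table("users", meta,
              Column("id", Integer, primary_key=True),
              Column("name", Text))

# ... and "user:<id>:orders:<n>" becomes a related orders table, so the
# structure that used to live inside the key string is visible to SQL
# (joins, indexes, constraints) instead of being opaque in a 2-column table.
orders = Table("orders", meta,
               Column("user_id", Integer, ForeignKey("users.id"), primary_key=True),
               Column("n", Integer, primary_key=True),
               Column("total", Numeric),
               Column("extra", JSONB))  # anything genuinely schemaless can stay JSONB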