I have tables A and B. B has an a_id foreign key to A.
I do this in Python:
a = create_A(some_json)
b = create_B(some_json)
b.a = a # (I have a sqlalchemy relationship)
existing_a = session.query(A).filter_by(name=a.name).first()
if existing_a:
    a.id = existing_a.id
session.merge(a)
I get a json from which I create A and B, then if A already exists in the db, I want to update it.
The problem is that b.a points to a, but b.a_id is None. I have the id of a, but I wanted to avoid setting b.a_id = a.id by hand, because my real structure is more complex, with lists of objects and more nested objects. If I manually set b.a_id to the correct value, I still get an error, because session.commit() tries to update with a null value instead of the value I set, and I don't know why.
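For comparison, with a plain add-and-flush (no merge involved), SQLAlchemy fills in b.a_id from the b.a relationship at flush time, so I expected not to need to set it by hand at all. A minimal sketch, using the models from the reproducible code below:
a = A(name='new name a')
b = B(name='new name b')
b.a = a            # only the relationship is set; b.a_id is still None here
session.add(b)     # a is cascaded into the session via the relationship
session.flush()    # INSERTs a, then b, populating b.a_id from a.a_id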
reproducible code:
from dataclasses import dataclass

from sqlalchemy import Column, ForeignKey, Identity, INTEGER, TEXT
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()
# engine_sync is an existing Engine created elsewhere with create_engine()

@dataclass
class A(Base):
    __tablename__ = "a"
    a_id = Column(INTEGER, primary_key=True,
                  server_default=Identity(always=True, start=1, increment=1, minvalue=1,
                                          maxvalue=2147483647, cycle=False, cache=1),
                  autoincrement=True)
    name: str = Column(TEXT, nullable=True)
    bs = relationship('B', back_populates='a')

@dataclass
class B(Base):
    __tablename__ = "b"
    b_id = Column(INTEGER, primary_key=True,
                  server_default=Identity(always=True, start=1, increment=1, minvalue=1,
                                          maxvalue=2147483647, cycle=False, cache=1),
                  autoincrement=True)
    a_id: int = Column(INTEGER, ForeignKey('a.a_id'), nullable=False)
    name: str = Column(TEXT, nullable=True)
    a = relationship('A', back_populates='bs')
Session = sessionmaker(bind=engine_sync)
session = Session()
a = A()
a.name = 'new name a'
b = B()
b.name = 'new name b'
b.a = a
existing_a = session.query(A).filter_by(name='name a').first()
a.a_id = existing_a.a_id
# a.bs[0].a_id = a.a_id - this doesn't help
session.merge(a)
session.commit()
In the db these rows already exist:
A: a_id=1, name='name a'
B: b_id=1, a_id=1, name='name b'
I get a json from which I create A and B, then if A already exists in the db, I want to update it.
A lot of SO questions involve blindly creating a new object and then trying to incorporate the new values into an existing object. IMO that is the wrong way to go about it. I would personally do something more like this:
import json
from dataclasses import dataclass
from sqlalchemy import Column, create_engine, ForeignKey, INTEGER, TEXT
from sqlalchemy.orm import declarative_base, relationship, Session
engine = create_engine("postgresql://scott:tiger@localhost/test")
Base = declarative_base()
@dataclass
class A(Base):
    __tablename__ = "a"
    a_id = Column(INTEGER, primary_key=True, autoincrement=False)
    name: str = Column(TEXT, nullable=True)
    status: str = Column(TEXT, nullable=True)
    bs = relationship("B", back_populates="a")

@dataclass
class B(Base):
    __tablename__ = "b"
    b_id = Column(INTEGER, primary_key=True, autoincrement=False)
    a_id: int = Column(INTEGER, ForeignKey("a.a_id"), nullable=False)
    name: str = Column(TEXT, nullable=True)
    version: str = Column(TEXT, nullable=True)
    a = relationship("A", back_populates="bs")
Base.metadata.drop_all(engine)
Base.metadata.create_all(engine)
# set initial values
with engine.begin() as conn:
    conn.exec_driver_sql(
        "INSERT INTO a (a_id, name, status) VALUES (1, 'name a', 'original')"
    )
    conn.exec_driver_sql(
        "INSERT INTO b (b_id, name, a_id, version) VALUES (1, 'name b', 1, 'v1')"
    )
# JSON with new values
new_values_for_a = json.loads('{"name": "name a", "status": "updated"}')
new_values_for_b = json.loads('{"version": "v2"}')
with Session(engine) as session:
    existing_a = (
        session.query(A).filter_by(name=new_values_for_a["name"]).first()
    )
    for key, value in new_values_for_a.items():
        setattr(existing_a, key, value)
    # *assumes* that `bs` is not empty, and we want to update the first one
    for key, value in new_values_for_b.items():
        setattr(existing_a.bs[0], key, value)
    session.commit()

    print(existing_a)  # A(name='name a', status='updated')
    print(existing_a.bs)  # [B(a_id=1, name='name b', version='v2')]
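If the payloads grow, the two copy loops can be factored into a small helper (hypothetical name, same idea as above; it assumes the JSON keys match the mapped attribute names):
def apply_updates(obj, values):
    # copy each JSON key onto the mapped attribute of the same name
    for key, value in values.items():
        setattr(obj, key, value)

apply_updates(existing_a, new_values_for_a)
apply_updates(existing_a.bs[0], new_values_for_b)  # still assumes bs[0] is the row to update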