I need to call UPDATE many times per minute. Is this the correct way to do it? I think not, because sometimes I get an exception: cursor already closed. I call this function from many threads; maybe I need to lock this part?
import psycopg2

conn = psycopg2.connect(dbname='MyDB', user='...', password='...', host='127.0.0.1')
cursor = conn.cursor()

def update_data(name, type, value):
    cursor.execute("UPDATE Datas SET Value = %s WHERE Name = %s AND Type = %s", (value, name, type))
    conn.commit()
Or do I need to open a connection and initialize a cursor every time? I'm calling update_data about 600 times per minute.
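To clarify what I mean by locking: something like the sketch below, where a single threading.Lock serializes all access to the shared connection. Here I've replaced the actual psycopg2 calls with a plain counter so the pattern runs standalone; the commented-out lines show where the real execute/commit would go.

```python
import threading

db_lock = threading.Lock()
updates_applied = 0  # stand-in for the shared connection/cursor state

def update_data(name, type_, value):
    global updates_applied
    with db_lock:  # only one thread touches the shared connection at a time
        # with conn.cursor() as cursor:
        #     cursor.execute(
        #         "UPDATE Datas SET Value = %s WHERE Name = %s AND Type = %s",
        #         (value, name, type_),
        #     )
        # conn.commit()
        updates_applied += 1

threads = [threading.Thread(target=update_data, args=('n', 't', i)) for i in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(updates_applied)  # 100
```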
I need to see live data updates on the site, so that context may matter.
Also, below I use with conn.cursor(); perhaps the two cursor objects interfere with each other and that causes the error. I didn't notice I was using the cursor in two different ways, I was in a hurry.
def get_all_data():
    with conn.cursor() as curs:
        curs.execute('SELECT * FROM Datas')
        datas = curs.fetchall()
        return datas
After some tests with psycopg2, I started creating a connection and a cursor for each request, and the errors went away. But I understand this is not the best implementation:
def get_all_datas():
    with psycopg2.connect(dbname='DatasDB', user='fr3sto', password='********', host='127.0.0.1') as conn:
        with conn.cursor() as curs:
            curs.execute('SELECT * FROM Datas')
            datas = curs.fetchall()
            return datas
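For the "better implementation" I mention above, I'm guessing a connection pool is the usual answer. A rough sketch using psycopg2's built-in ThreadedConnectionPool; the make_pool helper and the pool sizes are my own guesses, not tested code:

```python
def make_pool():
    # Assumption: psycopg2's thread-safe pool; minconn/maxconn values are guesses
    from psycopg2.pool import ThreadedConnectionPool
    return ThreadedConnectionPool(
        minconn=1, maxconn=10,
        dbname='DatasDB', user='fr3sto', password='********', host='127.0.0.1',
    )

def get_all_datas(db_pool):
    conn = db_pool.getconn()  # borrow a connection for this request
    try:
        with conn.cursor() as curs:
            curs.execute('SELECT * FROM Datas')
            return curs.fetchall()
    finally:
        db_pool.putconn(conn)  # return the connection to the pool instead of closing it
```

Each thread borrows a connection, uses it, and returns it, so no two threads ever share a cursor. Is this the direction I should go?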