When running the following code to insert data into a database table, a new DB connection appears in pg_stat_activity on each run and remains there in state idle:
import psycopg2
import psycopg2.extras

column_names = ", ".join(columns)
query = f"INSERT INTO {table_name} ({column_names}) VALUES %s"
values = [tuple(record.values()) for record in records]

with psycopg2.connect(dbname=dbname, user=user, password=password, host=host, port=port, application_name=app_name) as conn:
    with conn.cursor() as c:
        psycopg2.extras.execute_values(cur=c, sql=query, argslist=values, page_size=batch_size)
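Worth noting: in psycopg2, `with connection:` only wraps a transaction (commit on success, rollback on exception); it does not close the connection, so the snippet above leaves its session open until the process exits. A minimal sketch that guarantees closure, reusing the query, values and connection parameters from above:

import contextlib

import psycopg2
import psycopg2.extras

# contextlib.closing() calls conn.close() when the outer block exits;
# the inner "with conn:" still commits or rolls back the transaction.
with contextlib.closing(psycopg2.connect(dbname=dbname, user=user, password=password,
                                          host=host, port=port,
                                          application_name=app_name)) as conn:
    with conn:
        with conn.cursor() as c:
            psycopg2.extras.execute_values(cur=c, sql=query, argslist=values, page_size=batch_size)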
UPDATE: after input from Adrian, the code now looks like this:
conn = psycopg2.connect(dbname=dbname, user=user, password=password, host=host, port=port, application_name=app_name)
with conn:
    with conn.cursor() as c:
        psycopg2.extras.execute_values(cur=c, sql=query, argslist=values, page_size=self._batch_size)
conn.commit()
conn.close()
del conn
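(In this version the explicit conn.commit() is redundant, since `with conn:` already commits on success, and `del conn` has no effect on the server side once close() has been called.) If many inserts run in the same process, one way to cap the number of server sessions is psycopg2's built-in connection pool. A sketch, assuming the same connection parameters as above; the pool object itself is not part of the original code:

import psycopg2.extras
from psycopg2.pool import SimpleConnectionPool

# One pool per process; maxconn caps the number of open server sessions.
pool = SimpleConnectionPool(minconn=1, maxconn=5,
                            dbname=dbname, user=user, password=password,
                            host=host, port=port, application_name=app_name)

conn = pool.getconn()
try:
    with conn:
        with conn.cursor() as c:
            psycopg2.extras.execute_values(cur=c, sql=query, argslist=values, page_size=batch_size)
finally:
    pool.putconn(conn)  # return the connection to the pool instead of closing it

# on process shutdown:
pool.closeall()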
When running the code multiple times, the following error is eventually thrown, indicating that the server's connection slots are exhausted:
E psycopg2.OperationalError: connection to server at "localhost" (::1), port 5446 failed: Connection refused (0x0000274D/10061)
E Is the server running on that host and accepting TCP/IP connections?
E connection to server at "localhost" (127.0.0.1), port 5446 failed: FATAL: remaining connection slots are reserved for roles with the SUPERUSER attribute
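To see which sessions are piling up before the slots run out, the idle connections can be listed from pg_stat_activity, filtered on the application_name set above. A sketch, reusing the connection parameters and app_name from the snippets above:

import psycopg2

# List sessions opened by this application and their state ("idle", "active", ...).
conn = psycopg2.connect(dbname=dbname, user=user, password=password, host=host, port=port)
try:
    with conn.cursor() as c:
        c.execute(
            "SELECT pid, state, backend_start, client_addr "
            "FROM pg_stat_activity WHERE application_name = %s",
            (app_name,),
        )
        for pid, state, backend_start, client_addr in c.fetchall():
            print(pid, state, backend_start, client_addr)
finally:
    conn.close()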
Might it be that this happens when connecting to the DB through an SSH tunnel? I am using port forwarding via VSCode to a DB in the Azure cloud, and I see the same behaviour when using the DBeaver DB client.
The problem arose from the port forwarding via VSCode from my local machine to the remote Postgres database server. I switched to the ssh command-line client and now the connections are closed properly.
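For reference, an equivalent tunnel with the standard ssh client can be opened like this; the database host, user and jump host below are placeholders, not values from the setup above:

ssh -L 5446:<database-host>:5432 <user>@<jump-host>

psycopg2 then connects to localhost:5446 exactly as in the snippets above, and closing a connection on the client side also ends the corresponding server session.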