Tags: python, mysql, sqlalchemy, mariadb

SQLAlchemy won't execute large DB query


I was trying to query around 200k rows from a MariaDB database using the following code:

with engine.connect() as connection:
    print("test1")
    result = connection.execute(sqlalchemy.text(query))
    dates = [print("test2") for row in result]
    print(len(dates))
engine.dispose()

It works the first time, but the second time it freezes up after printing test1. After interrupting it with Ctrl+C, I can run it again, and so on.

What is the possible cause? Am I handling the connection incorrectly? Is there a possible workaround?

The database works fine with other systems, and this does not happen for smaller queries.

I found the following error in syslog (I hid some details within <>):

Jul  5 16:52:46 <machine_name> mysqld[1571843]: 2024-07-05 16:52:46 3094934790 [Warning] Aborted connection 3094934790 to db: '<db_name>' user: '<db_user>' host: '<db_host>' (Got an error reading communication packets)

Solution

  • Turns out the problem was that I did not fetch the result properly and did not close the result.

    Here is how to do it properly:

    with engine.connect() as connection:
        print("test1")
        result = connection.execute(sqlalchemy.text(query))
        while True:
            row = result.fetchone()  # get one row
            if row is None:  # stop when there are no rows left
                break
            print(row)
        result.close()  # close the result
    engine.dispose()
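For 200k rows, pulling the rows in batches keeps memory usage bounded while still fully consuming the result. A minimal sketch of the fetchmany() pattern, using an in-memory SQLite database in place of MariaDB (the table name `t` and column `d` are made up for illustration; the pattern is the same):

```python
# Batched fetching with fetchmany(): consume the whole result in
# fixed-size chunks, then close it so the connection is left clean.
import sqlalchemy

engine = sqlalchemy.create_engine("sqlite://")  # stand-in for MariaDB

with engine.connect() as connection:
    connection.execute(sqlalchemy.text("CREATE TABLE t (d TEXT)"))
    connection.execute(
        sqlalchemy.text("INSERT INTO t (d) VALUES (:d)"),
        [{"d": f"2024-07-{i:02d}"} for i in range(1, 11)],
    )

    result = connection.execute(sqlalchemy.text("SELECT d FROM t"))
    total = 0
    while True:
        rows = result.fetchmany(3)  # up to 3 rows per batch
        if not rows:                # empty list: result is exhausted
            break
        total += len(rows)
    result.close()  # result fully consumed and closed
print(total)  # → 10
```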
    
    

    Better yet, the fetchall() method returns all rows and closes the result afterwards, so all I needed to do was:

    with engine.connect() as connection:
        print("test1")
        result = connection.execute(sqlalchemy.text(query)).fetchall()
        for row in result:
            print(row[0])
    engine.dispose()
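When a result is too large even for fetchall(), SQLAlchemy 1.4+ can stream it instead of buffering every row client-side via the stream_results execution option. A sketch of that pattern, again with in-memory SQLite standing in for MariaDB (SQLite dialects without server-side cursor support simply ignore the option; MySQL/MariaDB drivers use an unbuffered cursor; table and column names are made up):

```python
# Streaming a large result: stream_results asks the driver for a
# server-side (unbuffered) cursor, and partitions() iterates the
# rows in fixed-size lists without loading everything at once.
import sqlalchemy

engine = sqlalchemy.create_engine("sqlite://")  # stand-in for MariaDB

with engine.connect() as connection:
    connection.execute(sqlalchemy.text("CREATE TABLE t (n INTEGER)"))
    connection.execute(
        sqlalchemy.text("INSERT INTO t (n) VALUES (:n)"),
        [{"n": i} for i in range(100)],
    )

    result = connection.execute(
        sqlalchemy.text("SELECT n FROM t"),
        execution_options={"stream_results": True},
    )
    count = 0
    for partition in result.partitions(25):  # lists of up to 25 rows
        count += len(partition)
    result.close()
print(count)  # → 100
```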