Tags: java, postgresql, sqlite, ormlite, java-stored-procedures

How to write several items into a table at the same time using ORMLite


I use ORMLite in a solution made up of a server and clients.

On the server side I use PostgreSQL; on the client side I use SQLite. In code, I call the same ORMLite methods without caring which database is underneath (PostgreSQL or SQLite).

Let's say that:

  • Table A corresponds to class A
  • I have an ArrayList of objects of class A
  • I want to insert all items of the ArrayList into the DB.

Today I use a for() loop and insert them one by one (inside a TransactionManager). When the items are few there is no problem, but now the items are becoming more numerous and this is probably not the best way, also because I lock the DB for a long time.

I'm looking for a way to insert all the items in one step, so that it goes quickly and does not lock the DB for a long time. I understood that it should be some sort of stored procedure (I'm not an expert...).

Note that some items could be new (that is, no item with the same primary key id exists yet), so an INSERT must be performed; other items could already exist, so an UPDATE should be performed.

Thank you


Solution

  • I'm looking for a way to insert all the items in one step, so that it goes quickly and does not lock the DB for a long time.

    So there are two ways to do this that I know of: transactions and disabling auto-commit. If the inserts need to all happen "at once" from a consistency standpoint, transactions are the only way to go. If you just want to insert and update a large number of records with higher performance, then you can disable auto-commit, do the operations, and then commit. Depending on the database implementation, this is what the TransactionManager is really doing under the covers.
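As a sketch of the two approaches (assuming a persisted class `A` with an integer id, a `Dao<A, Integer>`, and an ORMLite `ConnectionSource` already set up; the class `BulkWriter` and its method names are made up for illustration): ORMLite's `Dao.callBatchTasks()` runs a `Callable` with auto-commit disabled (or inside a transaction, depending on what the database supports), and `Dao.createOrUpdate()` covers the mixed INSERT/UPDATE case from the question.

```java
import java.sql.SQLException;
import java.util.List;
import java.util.concurrent.Callable;

import com.j256.ormlite.dao.Dao;
import com.j256.ormlite.misc.TransactionManager;
import com.j256.ormlite.support.ConnectionSource;

public class BulkWriter {

    // Approach 1: batch tasks. ORMLite turns off auto-commit (or falls back
    // to a transaction) around the Callable, so rows are not committed one
    // by one.
    public static void saveAllBatch(final Dao<A, Integer> dao, final List<A> items)
            throws Exception {
        dao.callBatchTasks(new Callable<Void>() {
            @Override
            public Void call() throws SQLException {
                for (A item : items) {
                    // createOrUpdate() does an INSERT if no row with this id
                    // exists yet, otherwise an UPDATE.
                    dao.createOrUpdate(item);
                }
                return null;
            }
        });
    }

    // Approach 2: an explicit transaction, for when the writes must be atomic.
    public static void saveAllTransactional(ConnectionSource connectionSource,
            final Dao<A, Integer> dao, final List<A> items) throws SQLException {
        TransactionManager.callInTransaction(connectionSource, new Callable<Void>() {
            @Override
            public Void call() throws SQLException {
                for (A item : items) {
                    dao.createOrUpdate(item);
                }
                return null;
            }
        });
    }
}
```

One caveat: `createOrUpdate()` issues an extra SELECT per item to decide between INSERT and UPDATE, so inside the batch it is still row-by-row work; the win comes from avoiding a commit per row, not from a single multi-row statement.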

    I understood that it should be some sort of stored procedure...

    I don't see how stored procedures help you at all here. They aren't magic.

    but now the items are becoming more numerous and this is probably not the best way, also because I lock the DB for a long time.

    I don't think there is a magic solution to this. If you are pushing a large number of objects to the database and you need the data to be transactional, then locks are going to have to be held during the updates. One thing to realize is that Postgres should handle this a ton better than SQLite. SQLite does not (I don't think) have row-level locking, meaning that the whole DB is paused during transactions. Postgres has a much more mature locking system and should be more performant in this situation. This is also part of why SQLite is so fast in many other operations: it isn't burdened with that locking complexity.

    One thing to consider is to rearchitect your schema. Try to figure out the minimal amount of data that needs to be inserted transactionally. For example, maybe just the object relationships need to be changed transactionally while all of the data can be stored later. You could have an AccountOwner object which just holds 2 ids, while all of the information about the Account is stored outside of the transaction. This makes your schema more complicated but maybe much faster.
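One way to read that suggestion, as a sketch in ORMLite annotations (the table and field names here are hypothetical, not from the question): keep the transactionally-inserted rows tiny, and write the bulky rows outside the transaction.

```java
import com.j256.ormlite.field.DatabaseField;
import com.j256.ormlite.table.DatabaseTable;

// Minimal relationship row: just two ids, so inserting many of these
// inside a transaction holds locks only briefly.
@DatabaseTable(tableName = "account_owner")
public class AccountOwner {
    @DatabaseField(generatedId = true)
    int id;
    @DatabaseField
    int accountId;
    @DatabaseField
    int ownerId;
}

// The heavy payload lives in its own table and can be written
// outside the transaction, before or after the relationship rows.
@DatabaseTable(tableName = "account")
class Account {
    @DatabaseField(id = true)
    int id;
    @DatabaseField
    String details;
}
```

The design trade-off is exactly what the answer says: two DAOs and two writes instead of one, but the lock-holding part of the work shrinks to inserting rows of a couple of integers each.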

    Hope something here helps.