
Will archiving lots of old data lock my Database?


I need to move data that is a month old from a logging table to a logging-archive table, and remove data older than a year from the latter.

There is a lot of data (600k inserts in 2 months).

I was considering simply calling a stored procedure (as a batch job) every day or week.

My first thought was a stored procedure doing the following:

  1. Deleting from the archive table whatever is older than 365 days
  2. Moving data older than 30 days from logging to archive (I suppose there's a way to do that with a single SQL query)
  3. Removing from logging whatever is older than 30 days.
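
The three steps could be sketched roughly like this; the table names (`Logging`, `LoggingArchive`), the `LogDate` column, and the column list are placeholders, since the real schema isn't shown:

```sql
-- 1. Purge archive rows older than a year
DELETE FROM LoggingArchive
WHERE LogDate < DATEADD(DAY, -365, GETDATE());

-- 2. Copy month-old rows from logging to archive in one statement
INSERT INTO LoggingArchive (Id, LogDate, Message)
SELECT Id, LogDate, Message
FROM Logging
WHERE LogDate < DATEADD(DAY, -30, GETDATE());

-- 3. Remove the rows just archived from the logging table
DELETE FROM Logging
WHERE LogDate < DATEADD(DAY, -30, GETDATE());
```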

However, this solution seems quite inefficient and will probably lock the DB for a few minutes, which I do not want.

So, what alternatives do I have?


Solution

  • None of this should lock the tables that you actually use. You are writing only to the logging table currently, and only to new records.

    You are selecting from the logging table only OLD records, and writing to a table that you don't write to except for the archive process.

    The steps you are taking sound fine. I would go one step further, and instead of deleting based on date, just do an INNER JOIN to your archive table on your id field - then you only delete the specific records you have archived.

    As a side note, 600k records is not very big at all. We have production DBs with tables over 2 billion rows, and I know some other folks here have DBs with millions of inserts a minute into transactional tables.

    Edit:

    One benefit of your planned method that I forgot to mention originally: each step is isolated. If you need to stop for any reason, none of your steps is destructive or depends on the next step executing immediately. You could potentially archive a lot of records, then run the deletes the next day or overnight without creating any issues.
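
    The join-based delete suggested above could look like the sketch below (again assuming placeholder names `Logging`, `LoggingArchive`, and an `Id` key column). The optional `TOP (...)` loop deletes in small batches, which keeps each transaction, and therefore its locks, short-lived:

    ```sql
    -- Delete only rows confirmed present in the archive, joined on the id field
    DELETE l
    FROM Logging AS l
    INNER JOIN LoggingArchive AS a
        ON a.Id = l.Id;

    -- Alternative: delete in batches of 5000 so no single transaction
    -- holds locks for long
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000) l
        FROM Logging AS l
        INNER JOIN LoggingArchive AS a
            ON a.Id = l.Id;

        IF @@ROWCOUNT = 0 BREAK;
    END;
    ```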