I have been trying to update a column in a table and I am getting the following error:
The transaction log for database 'STAGING' is full due to 'ACTIVE_TRANSACTION'.
This is the statement I am trying to run:
UPDATE Stg_ea
SET [valid_flag] = 1
FROM [STAGING].[dbo].[Stg_Encounter_Alias] Stg_ea
WHERE [ACTIVE_IND] = 1
  AND [END_EFFECTIVE_DT_TM] > CONVERT(date, GETDATE())
My table has approximately 18 million rows, and the update above will modify all of them. The table size is 2.5 GB, and the database is in simple recovery mode.
This is something that I'll be doing very frequently on different tables. How can I manage this?
In the database properties I have tried changing the log size to unlimited, but it goes back to the default.
Can anyone tell me an efficient way to handle this scenario?
If I run it in batches:
SET NOCOUNT ON;

DECLARE @Rows INT,
        @BatchSize INT; -- keep below 5000 to stay under the lock-escalation threshold

SET @BatchSize = 2000;
SET @Rows = @BatchSize; -- initialize just to enter the loop

WHILE (@Rows = @BatchSize)
BEGIN
    UPDATE TOP (@BatchSize) Stg_ea
    SET [valid_flag] = 1
    FROM [STAGING].[dbo].[Stg_Encounter_Alias] Stg_ea
    WHERE [ACTIVE_IND] = 1
      AND [END_EFFECTIVE_DT_TM] > CONVERT(date, GETDATE())
      -- exclude rows already updated; without this the same TOP (N) rows
      -- keep matching and the loop never terminates
      AND ([valid_flag] IS NULL OR [valid_flag] <> 1);

    SET @Rows = @@ROWCOUNT;
END;
You are performing your update in a single transaction, which causes the transaction log to grow very large.
Instead, perform your updates in batches, say 50K to 100K rows at a time.
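In simple recovery the log can be truncated once each batch's transaction commits and a checkpoint occurs, so the log stays small. A sketch against your table (the filter on valid_flag is an assumption so the loop skips rows it has already updated):

```sql
SET NOCOUNT ON;

DECLARE @BatchSize INT = 50000,
        @Rows INT = 1;

WHILE (@Rows > 0)
BEGIN
    BEGIN TRANSACTION;

    UPDATE TOP (@BatchSize) Stg_ea
    SET [valid_flag] = 1
    FROM [STAGING].[dbo].[Stg_Encounter_Alias] Stg_ea
    WHERE [ACTIVE_IND] = 1
      AND [END_EFFECTIVE_DT_TM] > CONVERT(date, GETDATE())
      AND ([valid_flag] IS NULL OR [valid_flag] <> 1);

    SET @Rows = @@ROWCOUNT;

    COMMIT TRANSACTION;

    -- in simple recovery, a checkpoint lets the log space
    -- used by the committed batch be reused
    CHECKPOINT;
END;
```

Each batch is its own short transaction, so the log never has to hold more than one batch's worth of changes at a time.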
Do you have an index on END_EFFECTIVE_DT_TM that includes ACTIVE_IND and valid_flag? That would help performance.
CREATE INDEX NC_Stg_Encounter_Alias_END_EFFECTIVE_DT_TM_I_
ON [dbo].[Stg_Encounter_Alias](END_EFFECTIVE_DT_TM)
INCLUDE (valid_flag)
WHERE ([ACTIVE_IND] = 1);
Another thing that can help performance drastically, if you are running Enterprise Edition or SQL Server 2016 SP1 or later (any edition), is turning on data_compression = page for the table and its indexes.
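A sketch of enabling page compression on your table and all of its indexes (standard ALTER TABLE / ALTER INDEX rebuild syntax; this rewrites the data, so run it in a maintenance window):

```sql
-- Rebuild the table (heap or clustered index) with page compression
ALTER TABLE [STAGING].[dbo].[Stg_Encounter_Alias]
REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Rebuild all nonclustered indexes on the table with page compression
ALTER INDEX ALL ON [STAGING].[dbo].[Stg_Encounter_Alias]
REBUILD WITH (DATA_COMPRESSION = PAGE);
```

Smaller pages mean fewer reads and less log generated per batch, which helps exactly the scenario you describe.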