I encountered a problem while resolving an issue in my database. I searched for an answer, but I found two ambiguous resolutions.
Issue: while deleting a huge amount of data I receive the error: 'The transaction log for database '[DatabaseName]' is full.' I was advised to delete the data in portions (chunks).
I found two different ways to manage the issue:
1)
DECLARE @Count INT = 0
DECLARE @for_delete INT
DECLARE @chunk_size INT = 100000

SELECT @for_delete = COUNT(*) FROM [Your big table] WHERE [Your Where Clause]

WHILE (@Count < @for_delete)
BEGIN
    SELECT @Count = @Count + @chunk_size
    BEGIN TRAN
    DELETE TOP (@chunk_size) FROM [Your big table] WHERE [Your Where Clause]
    COMMIT TRAN
END
2)
WHILE EXISTS (SELECT 1 FROM Table)
BEGIN
    DELETE TOP (10000) FROM Table
END
My question is: why does the first example wrap a single statement (DELETE TOP(x) ...) in BEGIN/COMMIT TRAN? I thought a single statement runs in autocommit mode, so BEGIN/COMMIT is not necessary. Could you please explain which option is more suitable for avoiding the error about the full transaction log? Is BEGIN/COMMIT TRAN necessary for a single statement? Many thanks in advance for your help.
Opening and closing a transaction around a single statement might not be needed (a lone DELETE runs in its own autocommit transaction), but it is good practice because it makes the chunk boundaries explicit. Since both approaches run within one big batch, an implicit transaction (if that mode is enabled) would not be closed before the batch is over.
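To illustrate the point about committing each chunk, here is a minimal sketch (hypothetical table and column names; assumes the SIMPLE recovery model). Each DELETE is committed on its own, and a CHECKPOINT after each commit allows the committed log space to be reused instead of accumulating; under the FULL recovery model you would need transaction log backups instead:

DECLARE @rows INT = 1

WHILE @rows > 0
BEGIN
    BEGIN TRAN
    -- hypothetical predicate; replace with your own WHERE clause
    DELETE TOP (10000) FROM dbo.BigTable WHERE CreatedAt < '20200101'
    SET @rows = @@ROWCOUNT  -- stop when no more rows match
    COMMIT TRAN
    CHECKPOINT  -- under SIMPLE recovery, lets the log truncate between chunks
END

The @@ROWCOUNT check also avoids the upfront COUNT(*) of the first example: the loop simply ends when a chunk deletes zero rows.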
You could also try to run each delete portion in its own batch. In SQL Server Management Studio you can follow a batch with "GO" and an integer count, which repeats the batch that many times. Read here: https://stackoverflow.com/a/3092835/5089204
With this very useful trick you can delete your chunks and run a DBCC SHRINKFILE(N'YourLogFile', 0, TRUNCATEONLY) on every turn.
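A minimal sketch of that trick (hypothetical table, predicate, and log file names; GO with a count works in SSMS and sqlcmd, not inside stored procedures):

DELETE TOP (100000) FROM dbo.BigTable WHERE CreatedAt < '20200101'
DBCC SHRINKFILE (N'YourLogFile', 0, TRUNCATEONLY)
GO 50  -- repeats the whole batch above 50 times

Because GO is a client-side batch separator, each repetition is a separate batch whose DELETE autocommits on its own, so the log never has to hold all 50 chunks at once.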