Update Query On Millions Of Rows Fills The Transaction Log

I need to update millions of rows as part of my next release, but doing so fills the transaction log and fails. I have a few ideas, but I'm not a SQL expert, so I'm sure there are approaches I haven't considered.

Solution 1:

In the end, the example I had already written worked best: a "transaction log full" error is caught in the CATCH block, and a 15-minute delay is long enough for the log to be recycled.

DECLARE @AffectedRows int = 0;
DECLARE @RowsToUpdate int = 1000000;  -- total number of rows expected to change

WHILE @AffectedRows < @RowsToUpdate
BEGIN
    BEGIN TRY
        BEGIN TRAN;
        -- Do some updates
        SET @AffectedRows = @AffectedRows + @@ROWCOUNT;  -- capture immediately after the UPDATE
        COMMIT TRAN;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRAN;  -- release the failed transaction
        PRINT ERROR_MESSAGE();
        WAITFOR DELAY '00:15:00';  -- wait for the log to be recycled, then retry
    END CATCH
END

PRINT @AffectedRows;

Solution 2:

A few points / ideas:

  1. You can expand the transaction log to whatever size you need so the update does not fill it.
  2. If the transaction log grows too large, you can back it up (a log backup frees log space for reuse).
  3. You can work through the data in batches (say, a million rows at a time).
  4. You can copy the data to a working table and then sp_rename it into place when the processing is done.
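Point 3 can be sketched roughly as follows. This is a minimal illustration, assuming a hypothetical table dbo.Orders with a Status column; the table, column, predicate, and batch size are all placeholders to adapt to your schema:

```sql
-- Process the update in fixed-size batches so each transaction stays small.
DECLARE @BatchSize int = 1000000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) dbo.Orders
    SET Status = 'Archived'
    WHERE Status = 'Closed';

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to update

    CHECKPOINT;  -- under SIMPLE recovery, allows log space to be reused between batches
END
```

Because each batch commits separately, the log only ever needs to hold one batch's worth of changes rather than the entire multi-million-row update.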

Solution 3:

You're reinventing nibbling deletes/updates :)

Take a look at this approach; you can process bigger blocks than a single row:

http://www.sqlservervideos.com/video/nibbling-deletes/

http://sqladvice.com/blogs/repeatableread/archive/2005/09/20/12795.aspx
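The nibbling pattern described in those links looks roughly like this sketch, assuming a hypothetical dbo.AuditLog table (names, predicate, and block size are illustrative only):

```sql
-- Delete in small blocks, pausing between blocks so concurrent work
-- can proceed and the log can be backed up or reused.
DECLARE @BlockSize int = 5000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BlockSize)
    FROM dbo.AuditLog
    WHERE LoggedAt < '2020-01-01';

    IF @@ROWCOUNT = 0 BREAK;  -- done

    WAITFOR DELAY '00:00:01';  -- brief pause between blocks
END
```

The same shape works for updates: replace the DELETE with an UPDATE TOP (@BlockSize) whose WHERE clause excludes rows already updated.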
