The “SQL Guy” Post # 21: Efficiently Manage Large Data Modifications


Did you know that you can use the TOP operator with DELETE, INSERT, and UPDATE statements on SQL Server tables?

Using the TOP operator in DML statements lets you break very large data operations into smaller pieces. This can improve performance and also improve concurrency on large, heavily accessed tables. It is one of the best techniques for managing large data modifications in reporting or data warehouse applications.

When you update a large number of records in a single set-based statement, the transaction log can grow considerably. When you process the same operation in chunks, however, each chunk is committed as it completes, allowing SQL Server to potentially reuse the transaction log space. Another classic issue many of us have experienced: if you cancel a very large data update partway through, you then have to wait a long time while the entire transaction rolls back.

 

With this technique you can perform data modifications in smaller chunks and resume your updates more quickly. Chunking also allows more concurrency against the modified table: user queries can jump in between chunks instead of waiting several minutes for one large modification to finish.

Let’s take an example of deleting records in chunks of 1000 rows. Assume a table called LARGETABLE that has millions of records, and you want to delete them 1000 at a time:

DELETING LARGE TABLE IN CHUNKS

 

--CREATE A DEMO TABLE CALLED LARGETABLE

CREATE TABLE LARGETABLE (ID_COL INT IDENTITY(1,1), COL_A VARCHAR(10),COL_B VARCHAR(10))

GO

 

--INSERT THE DATA IN LARGETABLE.. NOTICE THE USE OF ‘GO 10000’

INSERT INTO LARGETABLE VALUES ('A','B')

GO 10000

 

--QUERY THE TABLE

SELECT COUNT(*) FROM LARGETABLE;

 

--PERFORM DELETION FROM LARGETABLE IN CHUNKS OF 1000 ROWS

WHILE (SELECT COUNT(*) FROM LARGETABLE) > 0

BEGIN

            DELETE TOP (1000) FROM LARGETABLE;

            SELECT LTRIM(STR(COUNT(*))) + ' RECORDS LEFT TO DELETE' FROM LARGETABLE; --PROGRESS MESSAGE ONLY; REMOVE FOR PRODUCTION USE

END

 

 

The above technique can also be used with INSERT and UPDATE commands. 
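For UPDATE there is one caveat worth noting: UPDATE TOP (n) picks an arbitrary n rows, so the statement needs a WHERE clause that excludes rows already changed, or each pass may touch the same rows again. Here is a minimal sketch against the demo table above (the value 'X' is just an illustrative new value, not anything from the original post):

```sql
--UPDATE LARGETABLE IN CHUNKS OF 1000 ROWS
--THE WHERE CLAUSE EXCLUDES ROWS ALREADY UPDATED, SO EACH PASS TOUCHES FRESH ROWS
WHILE 1 = 1
BEGIN
            UPDATE TOP (1000) LARGETABLE
            SET COL_A = 'X'
            WHERE COL_A <> 'X';

            IF @@ROWCOUNT = 0 BREAK; --NO MATCHING ROWS LEFT, WE ARE DONE
END
```

For INSERT, the analogous pattern is INSERT TOP (1000) INTO the target from a source query, combined with deleting or flagging the copied rows in the source so that each pass picks up new rows.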

One thing to remember: this approach is ideally suited to data warehouse workloads. It is not something to use when the table you are modifying is also being changed by other users in an OLTP type of database environment.

— DamirB

  • Nice one Damir, thank you for sharing this!