Delete many (multiple) records in bunch: SQL-Server 2005


I had a task a few months back: I was asked to delete 3 million records from a table containing 8 million records in total. You may think, "Oh, a delete... simple: execute a DELETE statement with a condition, then sit back in your chair and relax." It is not that simple. Had I done it that way, I could have lost my database for many reasons (transaction log growth and long-held locks among them). I don't want to go into detail about those reasons here, so let us focus on our main topic.

 

USE AdventureWorks
GO

-- Create a working copy of Purchasing.PurchaseOrderDetail from the
-- AdventureWorks database to perform the delete operation on.
SELECT * INTO AdventureWorks.dbo.TempPurchaseOrderDetail FROM Purchasing.PurchaseOrderDetail

-- The above query created 8,788 rows in my temp table.
GO

 

 

-- We want to delete records based on the WHERE condition of the
-- following query; it matches 6,624 records in my table.
SELECT * FROM TempPurchaseOrderDetail WHERE PurchaseOrderID < 3000
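As a quick aside (my own addition, not part of the original post): if you only want to confirm how many rows the delete will touch, a COUNT(*) with the same WHERE clause avoids returning all 6,624 rows.

-- Optional check: count the rows that the delete will remove.
SELECT COUNT(*) AS RowsToDelete FROM TempPurchaseOrderDetail WHERE PurchaseOrderID < 3000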

 

Now we will create a stored procedure that deletes, in batches, all records from the table that meet the specified condition.

 

 

CREATE PROC DeleteInBunch
@NumOfRow INT -- batch size: the number passed here is used with TOP in the DELETE statement
AS
SET NOCOUNT ON

-- Infinite loop: keep deleting batches until no matching records are left.
WHILE (1 = 1)
BEGIN
      DELETE TOP (@NumOfRow) FROM TempPurchaseOrderDetail WHERE PurchaseOrderID < 3000

      -- The loop will not break until @@ROWCOUNT = 0,
      -- i.e. the last DELETE found nothing left to remove.
      IF @@ROWCOUNT = 0
      BEGIN
            PRINT('All records have been deleted!')
            BREAK
      END
END
GO
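One optional refinement, which is my own addition and not part of the original procedure: putting a short pause between batches gives other sessions room to work and lets log backups keep the transaction log under control. A minimal sketch, using the hypothetical name DeleteInBunchWithDelay:

-- Hypothetical variation: the same batched delete, with a short pause between batches.
CREATE PROC DeleteInBunchWithDelay
@NumOfRow INT
AS
SET NOCOUNT ON
WHILE (1 = 1)
BEGIN
      DELETE TOP (@NumOfRow) FROM TempPurchaseOrderDetail WHERE PurchaseOrderID < 3000

      IF @@ROWCOUNT = 0
            BREAK

      WAITFOR DELAY '00:00:01' -- wait one second before the next batch
END
GO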

 

So, finally, we have created the stored procedure; now let us see whether it actually works!

 

-- The SP will delete all records matching the condition, in batches of 100 rows.
EXEC DeleteInBunch 100
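To double-check the result (a small sanity check of my own, not from the original post), re-run the count with the same condition; it should now return zero.

-- Verify: no rows matching the delete condition should remain.
SELECT COUNT(*) AS RemainingRows FROM TempPurchaseOrderDetail WHERE PurchaseOrderID < 3000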

 

Reference: Ritesh Shah

 


13 Responses to “Delete many (multiple) records in bunch: SQL-Server 2005”

  1. Rajesh Says:

    Really nice, it helped me a lot, as I was in the same situation as you. Thanks for such nice code.

  2. riteshshah Says:

    You are welcome. 🙂 Glad to help.

  3. Amha Says:

    I found it very helpful.

  4. ravi Says:

    Great code, thanks

  5. Samyuktha Says:

    I got a task: I need to insert 400,000 rows into an AdventureWorks DB table in order to see how proactive caching works in different scenarios.
    My question is how to insert 400,000 rows at a time into a table.
    Can you let me know the script to generate them?

    Please do help me out.

    Thanks in advance.

  6. Samyuktha Says:

    But how do I insert 400,000 rows? I don’t have the data.
    Can we do it by looping over a column or something like that?

  7. riteshshah Says:

    Have you seen the link I provided? I have already inserted 100,000 records there; you can change it to 400,000 and it will do the job for you.

  8. Samyuktha Says:

    I want to insert data into an AdventureWorks DW2008 R2 table. I need to test whether the cube is processing or not after I insert the data.

    My question is how to insert rows into the existing table. I need to see the changes in the cube.


  9. Samyuktha Says:

    I’m in the process of testing proactive caching on my data.
    So, for this I need to insert data into the table.

    • riteshshah Says:

      For that there is no generic method; either you have to write a script that generates the data you want, or preferably use a data generator tool.
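A minimal sketch of the kind of script that reply refers to (entirely hypothetical: the table, columns, and row target below are made up for illustration and do not appear in the original thread):

-- Hypothetical: create a throwaway test table and fill it with 400,000 generated rows.
CREATE TABLE dbo.ProactiveCacheTest (ID INT IDENTITY(1,1), SomeValue INT, LoadDate DATETIME)
GO
SET NOCOUNT ON
DECLARE @i INT
SET @i = 0
WHILE @i < 400000
BEGIN
      INSERT INTO dbo.ProactiveCacheTest (SomeValue, LoadDate)
      VALUES (@i, GETDATE())
      SET @i = @i + 1
END
GO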

