This blog post is more of a tip that I picked up at PASS 2009. Have you ever needed to copy the contents of an entire table into another table? Traditionally, we as developers use a SELECT INTO or an INSERT INTO statement to load the destination table. That is still a great way to accomplish the task at hand, but it is not nearly as fast as what I am about to show you. The method I am about to show you is not for all scenarios, but it can be very handy. I do not know how many Oracle guys are reading this, but I have one question for you: “can your RDBMS move 10 million rows of data in <= 1 millisecond?” I would be willing to bet that most will answer no, but I have to admit that this is not really a fair fight. Why is it a little unfair? Let’s have a look at how this works under the hood.
This method derives its power from the partitioning functionality introduced in SQL Server 2005. If you have used partitioning in SQL Server 2005, you probably have a good idea where I am going with this. If not, SQL Server 2005 has built-in functionality that allows a table to be split, or divided, into what I will call virtual tables, whose contents are determined by predefined boundaries. When the partitioning column is used in a query predicate, the optimizer knows which partition the data resides in, which makes queries more IO efficient. This is amazing functionality because it requires no application changes and significantly reduces the amount of data SQL Server has to sift through. The partitioning feature I will be focusing on is the one that allows SQL Server to switch, or trade, partitions out, aptly named SWITCH. It is commonly used in situations where you want to move data to a different partition, either because the boundaries have changed or because you need to phase data out. The real benefit of SWITCH is that SQL Server does not actually move the data; it updates the metadata pointers to the data. Because no data is physically moved, the operation completes nearly instantaneously, regardless of the number of rows. This is why I said it is not fair, but hey, what in life is fair :^)
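To give you a feel for the partitioning objects involved (they are not required for the demo below, since a non-partitioned table behaves like a single-partition table for SWITCH purposes), here is a minimal sketch. The names pfOrderYear, psOrderYear, and dbo.Orders are hypothetical:

```sql
-- Hypothetical example: partition a table by year so the optimizer
-- can touch only the relevant partition when OrderDate appears in
-- a query predicate.
CREATE PARTITION FUNCTION pfOrderYear (DATETIME)
AS RANGE RIGHT FOR VALUES ('2008-01-01', '2009-01-01');

-- Map every partition to the PRIMARY filegroup for simplicity.
CREATE PARTITION SCHEME psOrderYear
AS PARTITION pfOrderYear ALL TO ([PRIMARY]);

-- Create the table on the partition scheme, partitioned by OrderDate.
CREATE TABLE dbo.Orders (
    OrderId   INT      NOT NULL,
    OrderDate DATETIME NOT NULL
) ON psOrderYear (OrderDate);
```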
Okay let’s see an example. I will start by creating a sample table.
USE [tempdb]
GO

IF EXISTS (SELECT 1 FROM sys.tables WHERE name = 'TestData')
BEGIN
    DROP TABLE [dbo].[TestData];
END
GO

CREATE TABLE [dbo].[TestData](
    RowNum   INT PRIMARY KEY,
    SomeId   INT,
    SomeCode CHAR(2)
);
GO

INSERT INTO [dbo].[TestData]
SELECT TOP 10000000
    ROW_NUMBER() OVER (ORDER BY t1.NAME) AS RowNumber,
    ABS(CHECKSUM(NEWID())) % 2500 + 1 AS SomeId,
    CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65)
        + CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65) AS SomeCode
FROM Master.dbo.SysColumns t1,
     Master.dbo.SysColumns t2
GO

IF EXISTS (SELECT 1 FROM sys.tables WHERE name = 'NewTestData')
BEGIN
    DROP TABLE [dbo].[NewTestData];
END
GO

--Create new table to move data to
CREATE TABLE [dbo].[NewTestData](
    RowNum   INT PRIMARY KEY,
    SomeId   INT,
    SomeCode CHAR(2)
);
GO
Now the fun part… behold the power of SQL Server!
--Move data to the new table
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

ALTER TABLE [dbo].[TestData] SWITCH TO [dbo].[NewTestData];

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
GO

/*
SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 0 ms.

SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 1 ms.
*/
Next, I will verify the results.
SELECT COUNT(*) FROM [dbo].[TestData];    --0
SELECT COUNT(*) FROM [dbo].[NewTestData]; --10,000,000

/*
-----------
0

(1 row(s) affected)

-----------
10000000

(1 row(s) affected)
*/
There you have it. I have successfully moved 10 million rows into a new table in 1 ms, with no IO reported by STATISTICS IO; some IO is still incurred to update the metadata, but it is minimal. This method has limited use, but it can be extremely advantageous. There are stipulations that have to be met for SWITCH: for example, the table you are switching to must be empty, and both tables must have the same schema. For a comprehensive list of requirements, please refer to this link, http://technet.microsoft.com/en-us/library/ms191160.aspx .
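To see the empty-target requirement in action, here is a quick sketch of what happens if you try the switch against a non-empty destination; this assumes the TestData and NewTestData tables from the demo above still exist:

```sql
-- Put a single row in the destination table, then attempt the switch.
INSERT INTO [dbo].[NewTestData] VALUES (1, 1, 'AA');

-- This fails, because ALTER TABLE ... SWITCH requires the target
-- table (or partition) to be empty.
ALTER TABLE [dbo].[TestData] SWITCH TO [dbo].[NewTestData];

-- Empty the destination and the switch will succeed.
TRUNCATE TABLE [dbo].[NewTestData];
ALTER TABLE [dbo].[TestData] SWITCH TO [dbo].[NewTestData];
```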
Until next time, happy coding.