Monday, August 6, 2007

Easily insert huge amounts of test data

[This was originally posted at http://timstall.dotnetdevelopersjournal.com/easily_insert_huge_amounts_of_test_data.htm]

I've plugged the free MassDataHandler tool before - an open source tool that lets you use XML scripts to easily insert test data. One limitation of the tool (perhaps to be solved in future releases) is that it can't easily insert large volumes of data.

Say you want a table to have 10,000 rows so you can do some SQL performance tuning. Ideally you could specify an algorithm that dictates the number of rows and how each row is kept unique. For example, you may want to say "Insert 10,000 rows of company data, using different company names of the form 'co' + i.ToString(), such as co1, co2, co3, etc.".

You can easily do this. First you could use the MDH to insert the parent data. Then for specific high-volume tables, you could use a SQL WHILE loop to specify the insert strategy, like so:

Declare @i int
select @i = 1

WHILE (@i <= 10000)
BEGIN
    --Define the dynamic data to insert
    Declare @co varchar(10)
    select @co = 'co' + cast(@i as varchar(10))

    --Do the SQL insert
    Insert into MyTable (
        [co],
        [SomeColumn] )
    Values (
        @co,
        'someData' );

    --increment the counter
    select @i = (@i + 1)
END
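
Once it finishes, a quick sanity check against the same table confirms that all 10,000 rows landed:

select count(*) as RowsInserted from MyTable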

This would quickly insert 10,000 rows of test data into MyTable. You could customize the technique for other tables, adding multiple inserts inside the loop or adjusting the counter logic for a multi-column unique index, as sketched below.
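
For instance, say a child table's uniqueness spans two columns. The same pattern still works as long as each pass through the loop varies at least one of them. Here's a minimal sketch, assuming a hypothetical MyChildTable with a unique index on (CompanyId, Name) - those names are placeholders, not part of the example above:

Declare @i int
Declare @companyId int
Declare @name varchar(20)
select @i = 1

WHILE (@i <= 10000)
BEGIN
    --Spread rows across 10 hypothetical parent companies; the pair
    --(CompanyId, Name) stays unique because Name is unique on its own
    select @companyId = (@i % 10) + 1
    select @name = 'row' + cast(@i as varchar(10))

    Insert into MyChildTable (
        [CompanyId],
        [Name],
        [SomeColumn] )
    Values (
        @companyId,
        @name,
        'someData' );

    --increment the counter
    select @i = (@i + 1)
END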

 


Living in Chicago and interested in working for a great company? Check out the careers at Paylocity.
