How to efficiently load many, many SQL records


William Patterson

I have a need to efficiently load millions of fairly small records into SQL
Server 2000. I have some financial data with many, many observations (one
set has 37 million records) and I need to get it into a SQL server.

I create a comma-separated values (CSV) file using a VB.NET program acting
on the raw data. I have tried importing it into SQL Server using DTS
import, but it reports completion after about 18 million records, with
no error.

Using Visual Studio 2004 VB, I wrote a Windows program that uses a
SqlDataAdapter for the table I want to insert records into, sets the field
parameters on the generated SQL INSERT command, and then calls ExecuteNonQuery
to insert each record.
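
Stripped down, the per-record insert amounts to something like the following
(simplified here to a hand-written INSERT command; the table and column names
are just placeholders for my real ones):

    Imports System.Data
    Imports System.Data.SqlClient
    Imports System.IO

    ' ... inside the load routine ...
    Dim connStr As String = "server=MYSERVER;database=Finance;Integrated Security=SSPI"

    Dim conn As New SqlConnection(connStr)
    Dim cmd As New SqlCommand( _
        "INSERT INTO dbo.Observations (ObservationDate, Ticker, Price) " & _
        "VALUES (@ObservationDate, @Ticker, @Price)", conn)
    cmd.Parameters.Add("@ObservationDate", SqlDbType.DateTime)
    cmd.Parameters.Add("@Ticker", SqlDbType.VarChar, 10)
    cmd.Parameters.Add("@Price", SqlDbType.Decimal)

    Dim reader As New StreamReader("C:\data\observations.csv")
    Dim line As String = reader.ReadLine()

    conn.Open()
    Do While Not line Is Nothing
        Dim fields As String() = line.Split(","c)
        cmd.Parameters("@ObservationDate").Value = DateTime.Parse(fields(0))
        cmd.Parameters("@Ticker").Value = fields(1)
        cmd.Parameters("@Price").Value = Decimal.Parse(fields(2))
        cmd.ExecuteNonQuery()   ' one round trip to the server per record
        line = reader.ReadLine()
    Loop
    reader.Close()
    conn.Close()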

This process works, but it seems to take forever to insert this many records
this way.

I am fairly well versed in VS 2003 and VB.NET, but I am new to using these
tools to load so many records. I must be doing something wrong or not using
the appropriate technique to do this.

Any help will be appreciated.

Thanks!

Bill Patterson
UW-Madison School of Business
 
Bill,

The all-time favorite for this job would be bcp (Bulk Copy). You can search
SQL Server 2000 Books Online for more information on it.
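
For example, something along these lines (the database, table, file, and
server names here are only placeholders) loads a comma-delimited file in
character mode over a trusted connection, committing every 10,000 rows:

    bcp Finance.dbo.Observations in C:\data\observations.csv -S MYSERVER -T -c -t, -b 10000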

ADO.NET 1.1 will be insanely slow for this, so don't bother using it. ADO.NET
2.0 has a new class, SqlBulkCopy, which is much faster.
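
A rough sketch of how that might look (the connection string, file path,
table, and column names below are made up for illustration):

    Imports System.Data
    Imports System.Data.SqlClient
    Imports Microsoft.VisualBasic.FileIO

    Sub LoadObservations()
        Dim connStr As String = "server=MYSERVER;database=Finance;Integrated Security=SSPI"

        ' In-memory buffer whose columns mirror the destination table.
        Dim buffer As New DataTable()
        buffer.Columns.Add("ObservationDate", GetType(DateTime))
        buffer.Columns.Add("Ticker", GetType(String))
        buffer.Columns.Add("Price", GetType(Decimal))

        Using parser As New TextFieldParser("C:\data\observations.csv")
            parser.TextFieldType = FieldType.Delimited
            parser.SetDelimiters(",")

            Using bulkCopy As New SqlBulkCopy(connStr)
                bulkCopy.DestinationTableName = "dbo.Observations"
                bulkCopy.BulkCopyTimeout = 0    ' no timeout for very large loads

                While Not parser.EndOfData
                    Dim fields As String() = parser.ReadFields()
                    buffer.Rows.Add(DateTime.Parse(fields(0)), fields(1), Decimal.Parse(fields(2)))

                    ' Flush to the server in chunks so the buffer stays small.
                    If buffer.Rows.Count = 10000 Then
                        bulkCopy.WriteToServer(buffer)
                        buffer.Clear()
                    End If
                End While

                ' Send whatever is left over.
                If buffer.Rows.Count > 0 Then
                    bulkCopy.WriteToServer(buffer)
                End If
            End Using
        End Using
    End Sub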

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik
 
William,

You can also use BULK INSERT for this.
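
For example (again, the database, table, and file names are just placeholders),
run from Query Analyzer or any connection:

    BULK INSERT Finance.dbo.Observations
    FROM 'C:\data\observations.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', BATCHSIZE = 10000, TABLOCK)

Note that the file path is read on the SQL Server machine, not on the client.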


Regards,

Deepak
[I Code, therefore I am]
 