Bulk Inserts

  • Thread starter: James Autry

James Autry

Ok, I have to enter up to 3,000 records (56 bytes each) into a table in one
shot. It takes forever doing the standard DataAdapter Update. I changed to
a stored procedure to save each record, but the time saved was negligible.
I increased the connection packet size; no improvement. The only thing left is
a bulk transfer, but as I understand it, this is not supported in ADO. How
would one do it using ExecuteNonQuery on multiple records for the same
table? Is there a better way?

Thanks,
James
 
You already said it: bulk operations. ADO is not designed to do this (until
v2.0). Use BCP or DTS; you'll get your performance back. You can also use
transactions, and batch a number of INSERT/UPDATE statements together if
your provider supports it.

hth

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
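Batching the INSERTs inside one transaction, as suggested above, means paying for a single commit instead of one per row. ADO against SQL Server can't be run here, so this is a minimal sketch of the principle using Python's built-in sqlite3 module; the Records table and its columns are invented for the example.

```python
import sqlite3

# An in-memory database stands in for the real server; the schema is
# invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Records (Field1 INTEGER, Field2 INTEGER)")

rows = [(i, i + 1) for i in range(3000)]  # ~3,000 small records, as in the question

# One explicit transaction around all the INSERTs: a single commit at the
# end instead of 3,000 implicit ones.
with conn:  # commits on success, rolls back on error
    for f1, f2 in rows:
        conn.execute("INSERT INTO Records (Field1, Field2) VALUES (?, ?)", (f1, f2))

count = conn.execute("SELECT COUNT(*) FROM Records").fetchone()[0]
print(count)  # → 3000
```

In ADO.NET 1.x the equivalent is calling BeginTransaction on the connection, assigning the resulting transaction to the command, and committing once after the loop.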
 
Hola!
If I understand correctly, performance-wise a transaction is kind of
equivalent to a bulk transfer?

Is it possible to do something like this?

INSERT INTO Table (Field1, Field2) VALUES (1, 2)
SELECT @@IDENTITY
INSERT INTO Table (Field1, Field2) VALUES (2, 3)
SELECT @@IDENTITY
...... N times ......

But I can't call ExecuteReader for each statement, can I?
How do I get the identity value for each insert?
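On SQL Server, the batch above would return one result set per SELECT @@IDENTITY, and a single data reader can walk them with NextResult. That can't be run here, so as a runnable illustration of collecting the generated key for each insert, the sketch below uses SQLite's lastrowid (table and columns invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE T (Id INTEGER PRIMARY KEY AUTOINCREMENT, "
    "Field1 INTEGER, Field2 INTEGER)"
)

identities = []
with conn:  # one transaction around the whole batch
    cur = conn.cursor()
    for f1, f2 in [(1, 2), (2, 3)]:
        cur.execute("INSERT INTO T (Field1, Field2) VALUES (?, ?)", (f1, f2))
        identities.append(cur.lastrowid)  # SQLite's analogue of SELECT @@IDENTITY

print(identities)  # → [1, 2]
```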
 
Lloyd, transacted batched updates will not come close to bulk-update
performance. In v2.0 we have added a SqlBulkUpload class to bulk upload data
into SQL Server from managed code; until then I would follow Bill's advice.
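The class referred to here shipped in .NET 2.0 as SqlBulkCopy; its gain comes from streaming the whole row set to the server in one call rather than issuing per-row statements. As a runnable stand-in for that one-call pattern, Python's sqlite3 offers executemany (schema invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Records (Field1 INTEGER, Field2 INTEGER)")

rows = [(i, i + 1) for i in range(3000)]

# Hand the driver the entire row set at once, rather than looping over
# ExecuteNonQuery-style single inserts.
with conn:
    conn.executemany("INSERT INTO Records (Field1, Field2) VALUES (?, ?)", rows)

total = conn.execute("SELECT COUNT(*) FROM Records").fetchone()[0]
print(total)  # → 3000
```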
 