Hello,
I am trying to push 40,000 rows of data from an ADO.NET DataTable to a SQL Server table. It takes me about 5 minutes to read the data from an external data source (using a VB.NET project), but over 10 minutes to write the data to SQL Server using dataAdapter.Update(dataset1, "myTable").
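Here is roughly what the write side looks like now (the connection string, SELECT statement, and table name below are placeholders for my real ones):

Imports System.Data
Imports System.Data.SqlClient

Public Module PushData
    ' Sketch of my current approach; dataset1 was filled earlier
    ' from the external source.
    Public Sub PushRows(ByVal dataset1 As DataSet)
        Dim conn As New SqlConnection("Server=myServer;Database=myDb;Integrated Security=SSPI;")
        Dim adapter As New SqlDataAdapter("SELECT * FROM myTable", conn)
        Dim builder As New SqlCommandBuilder(adapter) ' auto-generates the INSERT command

        ' One INSERT round trip per added row - 40,000 of them here.
        adapter.Update(dataset1, "myTable")
    End Sub
End Module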
I was thinking maybe I could output the data to disk as an XML file and use DTS in SQL Server to retrieve it. (I considered writing out a plain text file instead, but there are over 100 fields/columns, some of them ntext, which are very hard to delimit with a single delimiter; I would need a 3-character column delimiter.) Would this be more efficient than dataAdapter.Update for 40,000 or more rows?
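If I go the XML route, I assume the dump itself would just be something like this (the file path is a placeholder; the DTS package on the server side would still have to be built separately):

Imports System.Data

Public Module XmlExport
    ' Minimal sketch: dump the DataSet to disk for DTS to pick up.
    ' WriteSchema embeds an inline XSD so column types (including
    ' ntext) come along - no delimiter headaches.
    Public Sub ExportToXml(ByVal dataset1 As DataSet)
        dataset1.WriteXml("C:\temp\myTable.xml", XmlWriteMode.WriteSchema)
    End Sub
End Module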
My other idea would be to loop through the DataTable and use a command object to insert each row into the SQL Server table, row by row (sketched below). Any suggestions appreciated.
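By row by row I mean something like this minimal sketch; only two columns are shown, and myTable, col1, col2, and the connection string are all made-up placeholders (my real table has over 100 columns):

Imports System.Data
Imports System.Data.SqlClient

Public Module RowByRow
    Public Sub InsertRowByRow(ByVal table As DataTable)
        Dim conn As New SqlConnection("Server=myServer;Database=myDb;Integrated Security=SSPI;")
        Dim cmd As New SqlCommand( _
            "INSERT INTO myTable (col1, col2) VALUES (@col1, @col2)", conn)
        cmd.Parameters.Add("@col1", SqlDbType.Int)
        cmd.Parameters.Add("@col2", SqlDbType.NText)

        conn.Open()
        Try
            Dim row As DataRow
            For Each row In table.Rows
                cmd.Parameters("@col1").Value = row("col1")
                cmd.Parameters("@col2").Value = row("col2")
                cmd.ExecuteNonQuery() ' one round trip per row
            Next
        Finally
            conn.Close()
        End Try
    End Sub
End Module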
TIA,
Rich