Updating a database table from a DataSet

Jake

Hi all,
I am trying to update a SQL Server table from a DataSet
populated from an Excel spreadsheet. Each record has a
check box so it can be excluded from the SQL Server table
population.

Does anyone know how I would achieve this? I don't want to
have to loop through each record in the dataset and do a
separate insert for each one, as there can be a lot of
records, and that would be a waste of resources.
I was hoping the DataSet might have a method that can
perform a bulk insert.

Thanks in advance.

Jake
 
You can do bulk updating using a DataAdapter. However, because you are
retrieving data from the Excel sheet and updating a different data source,
you probably can't do that here.

Also, the DataAdapter would do an inefficient update, as it tends to do more
joins than it needs to. You are probably better off, and more efficient,
looping through each record using a DataReader and doing the update. I think!
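Not anyone's actual code, just a rough sketch of that loop: filter out the checkbox-excluded rows first, then walk the remainder. The table and column names (`Products`, `Name`, `Exclude`) are made up for illustration; in the real code the body of the loop would run a parameterized INSERT over one open connection, ideally inside a transaction.

```csharp
using System;
using System.Data;

class FilterDemo
{
    // Build a small DataTable shaped like the one the Excel sheet
    // might produce: a data column plus a boolean "Exclude" flag
    // backing the check box. (Hypothetical schema.)
    static DataTable BuildSample()
    {
        var table = new DataTable("Products");
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Exclude", typeof(bool));
        table.Rows.Add("Widget", false);
        table.Rows.Add("Gadget", true);   // ticked: skip this one
        table.Rows.Add("Sprocket", false);
        return table;
    }

    // Select only the rows whose check box is not ticked.
    static DataRow[] IncludedRows(DataTable table)
    {
        return table.Select("Exclude = false");
    }

    static void Main()
    {
        DataRow[] rows = IncludedRows(BuildSample());
        foreach (DataRow row in rows)
        {
            // In the real code this is where the parameterized
            // INSERT would execute, reusing one open connection.
            Console.WriteLine(row["Name"]);
        }
    }
}
```

The point is that the connection is opened once and only the included rows ever reach the insert loop.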

Steve
 
Hi Steve,

Thanks for the feedback. It just seems there must be a
more efficient way; if there are 10,000 records
to be inserted, ASP.NET has to connect to the
DB 10,000 times. I have tried it and it takes 30 seconds to
run, which is not too bad. I was hoping that there was
some sort of bulk insert?

Cheers, Jake.
 
Hi,

You can do a bulk insert with a single command object:

string sql = "insert into a values('0');insert into a values('1')";
SqlCommand oCmd = new SqlCommand(sql, oConnection);
int i = oCmd.ExecuteNonQuery();

Natty Gur, CTO
Dao2Com Ltd.
34th Elkalay st. Raanana
Israel , 43000
Phone Numbers:
Office: +972-(0)9-7740261
Fax: +972-(0)9-7740261
Mobile: +972-(0)58-888377
 
Not as far as I'm aware. I think I'm correct in saying that even the
DataAdapter, which gives the impression of a bulk insert, is using a
DataReader behind the scenes to do the insert.
As the data adapter doesn't know the data model, it makes a rough guess at the
number of joins it needs to do, and therefore could be wrong, as it may
join on a number of different fields instead of perhaps just the one it needs.
It's been a while since I read all this, but I'm pretty sure I'm right.

I still think you'll need to loop through each record and do the insert; at
least that way you can put the inserts into a transaction.

Hope this helps.

Steve
 
You'd still have to loop through each record and build up a huge SQL
statement that would do the insert.

There are two problems with doing it this way, in my opinion:
1. The final SQL statement would be huge for 1,000 records. There might
even be problems trying to execute a massive SQL statement on the database.
2. I don't think you should run raw SQL statements built from user input
against a database, because you open yourself up to SQL injection issues.
You'd therefore have to sanitize each insert statement.

So I still reckon you're better off doing it the long way, which wouldn't
take too long anyway. The connection would only be opened once while you loop
through the records, and most of the time is taken up establishing the
connection.
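For what it's worth, the two concerns can be reconciled: build one batched command text but keep every value in a parameter, so user input never lands in the SQL string itself. This is only a sketch, assuming a made-up table `MyTable` with a single `Name` column; the real code would add the collected name/value pairs to `oCmd.Parameters` before calling ExecuteNonQuery.

```csharp
using System;
using System.Collections.Generic;
using System.Text;

class BatchBuilder
{
    // Build one command text covering many rows, so a single
    // ExecuteNonQuery call does the whole batch, while every value
    // travels as a parameter (no injection risk).
    static string BuildBatchSql(IList<string> values,
        out List<KeyValuePair<string, string>> parameters)
    {
        var sql = new StringBuilder();
        parameters = new List<KeyValuePair<string, string>>();
        for (int i = 0; i < values.Count; i++)
        {
            string name = "@p" + i;
            sql.Append("INSERT INTO MyTable (Name) VALUES (")
               .Append(name).Append(");");
            parameters.Add(new KeyValuePair<string, string>(name, values[i]));
        }
        return sql.ToString();
    }

    static void Main()
    {
        List<KeyValuePair<string, string>> parms;
        string sql = BuildBatchSql(
            new[] { "a'; DROP TABLE x;--", "b" }, out parms);
        Console.WriteLine(sql);
        Console.WriteLine(parms.Count);
        // The malicious text never reaches the SQL string itself:
        Console.WriteLine(sql.Contains("DROP"));
    }
}
```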

Steve
 
Steve,

You're absolutely right, but it's still more efficient to reduce calls to
the DB. You have to go through the records one way or another; at least this
way you can reduce calls to the DB from 1,000 times to 100 times. Check the
RPC counts and see it for yourself.

I'm not a SQL Server expert, but I know that Oracle can execute bulk inserts
in parallel; if the same holds true for SQL Server, this could also help your
performance.
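The 1,000-to-100 reduction Natty describes just means grouping the rows into fixed-size batches before building each command. A minimal sketch of the chunking, with row values simplified to ints (batch size 10 is an arbitrary choice here):

```csharp
using System;
using System.Collections.Generic;

class ChunkDemo
{
    // Split the rows into fixed-size batches; each batch becomes one
    // round trip to the server instead of one trip per row.
    static List<List<int>> Chunk(IList<int> rows, int batchSize)
    {
        var batches = new List<List<int>>();
        for (int i = 0; i < rows.Count; i += batchSize)
        {
            var batch = new List<int>();
            for (int j = i; j < Math.Min(i + batchSize, rows.Count); j++)
                batch.Add(rows[j]);
            batches.Add(batch);
        }
        return batches;
    }

    static void Main()
    {
        var rows = new List<int>();
        for (int i = 0; i < 1000; i++) rows.Add(i);

        // 1,000 rows in batches of 10 -> 100 commands instead of 1,000.
        Console.WriteLine(Chunk(rows, 10).Count);
    }
}
```

Each batch would then be turned into one parameterized command and executed, so the connection does 100 round trips rather than 1,000.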

Natty
 
There you go Jake, two possible, well-reasoned methods.

Take your pick.
Thanks Natty

Steve
 