SQL update slow

Guest

I am having difficulty sending an update of 20,000 rows to a SQL Server.
When I use SqlCommandBuilder to create the update command, the update takes
less than a minute. When I manually build the command and the parameters,
the update takes closer to 20 minutes. I have had the same experience
whether I use a stored procedure to perform the update or put the SQL UPDATE
statement in the SqlCommand object. Any thoughts on what could cause such a
difference in performance?
Some facts: I'm updating a single table with almost 20,000 rows and 10
columns. I am only updating two of the columns for each record. I am using
an identity column in my WHERE clause.
C# code.
//Stored procedure update command (dsCmd is the SqlDataAdapter, dtable the DataTable)
SqlCommand updateCmd = new SqlCommand("UpdateDeviceListStatus", this.getConnection());
updateCmd.CommandType = CommandType.StoredProcedure;
updateCmd.Parameters.Add(new SqlParameter("@deviceStatus", SqlDbType.VarChar));
updateCmd.Parameters["@deviceStatus"].SourceColumn = "deviceStatus";
updateCmd.Parameters["@deviceStatus"].Size = 250;
updateCmd.Parameters.Add(new SqlParameter("@deviceResponseTime", SqlDbType.BigInt));
updateCmd.Parameters["@deviceResponseTime"].SourceColumn = "deviceResponseTime";
updateCmd.Parameters.Add(new SqlParameter("@deviceID", SqlDbType.BigInt));
updateCmd.Parameters["@deviceID"].SourceColumn = "deviceID";
dsCmd.UpdateCommand = updateCmd;
//Update DB with dataset
dsCmd.Update(dtable);

SQL Stored Procedure.
CREATE PROCEDURE dbo.UpdateDeviceListStatus
(
@deviceID bigint, @deviceStatus varchar(250), @deviceResponseTime bigint
)
AS UPDATE deviceList
SET deviceStatus = @deviceStatus, deviceResponseTime = @deviceResponseTime
WHERE (deviceID = @deviceID)
RETURN
GO
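For comparison, the SqlCommandBuilder version that runs in under a minute is essentially this (a rough sketch from memory; the select list and adapter name are approximate, not my exact code):

// Fast path for comparison: let SqlCommandBuilder generate the UPDATE command.
// Select list and variable names here are approximations of my real code.
SqlDataAdapter builderAdapter = new SqlDataAdapter(
    "SELECT deviceID, deviceStatus, deviceResponseTime FROM deviceList",
    this.getConnection());
SqlCommandBuilder builder = new SqlCommandBuilder(builderAdapter);
builderAdapter.Update(dtable);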
 
Are you calling this sproc 20,000 times?

Or just once, against a table with 20,000 rows?

If you are calling this once, 20,000 rows should be updated in
milliseconds.

Does the table have a trigger on it?
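To be clear, by "calling it once" I mean one set-based statement that touches all 20,000 rows in a single round trip, e.g. bulk-load the new values into a staging table and join to it. A rough sketch, assuming System.Data.SqlClient and your column names (untested against your schema; connectionString is a placeholder):

// Push the changed rows to a temp table with SqlBulkCopy, then run ONE UPDATE
// joined to it, instead of 20,000 separate sproc calls.
// dtable is assumed to hold deviceID plus the two changed columns.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();

    new SqlCommand(
        "CREATE TABLE #stage (deviceID bigint PRIMARY KEY, " +
        "deviceStatus varchar(250), deviceResponseTime bigint)",
        conn).ExecuteNonQuery();

    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "#stage";
        bulk.ColumnMappings.Add("deviceID", "deviceID");
        bulk.ColumnMappings.Add("deviceStatus", "deviceStatus");
        bulk.ColumnMappings.Add("deviceResponseTime", "deviceResponseTime");
        bulk.WriteToServer(dtable);
    }

    new SqlCommand(
        "UPDATE d SET d.deviceStatus = s.deviceStatus, " +
        "d.deviceResponseTime = s.deviceResponseTime " +
        "FROM deviceList d JOIN #stage s ON d.deviceID = s.deviceID",
        conn).ExecuteNonQuery();
}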
 
sirfunusa said:
Are you calling this sproc 20,000 times?

Or just once, against a table with 20,000 rows?
I really didn't know there was a way to do it one way or the other. The
code I'm using is above, so it's whatever the default behavior of the data
adapter's Update() is. I'm certainly not going through a foreach loop on
my DataTable.
I thought that all SQL commands sent to the DB were executed one at a time
unless you increased the UpdateBatchSize property.
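For reference, this is roughly how I understand batching would be turned on; I haven't tried it yet, so treat it as a sketch that reuses the adapter and command from my code above:

// Send the per-row UPDATEs in batches instead of one round trip per row.
// Batching needs an explicitly assigned UpdateCommand with UpdatedRowSource = None.
updateCmd.UpdatedRowSource = UpdateRowSource.None;
dsCmd.UpdateCommand = updateCmd;
dsCmd.UpdateBatchSize = 100; // 1 = no batching (the default), 0 = largest batch the server can handle
dsCmd.Update(dtable);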
The table does not have a trigger on it.
BTW. I'm monitoring my network connection to the SQL server and I'm not
getting high utilization or any errors, so I don't believe that is the cause.
 
I don't know if this makes a difference, but the call to the subroutine that
updates the DB takes place in its own thread. I wouldn't think it should
matter, but ...
 
During one of my tests I changed the primary key of the table from
deviceID (which is a bigint) to deviceIP (which is a varchar(50))
and forgot to change it back. It seems this was slowing down my
update a lot. I'm not sure exactly why, since both columns held only
unique values, but my performance is much better now that I've changed
the primary key back to the deviceID column.
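My best guess at why: with the primary key on deviceID, every WHERE deviceID = @deviceID update is an index seek; with the key moved to deviceIP there was no index on deviceID at all, so each of the ~20,000 updates had to scan the table. If the key ever does have to live on deviceIP, I think an extra index on deviceID would bring the seek back; something like this (untested, index name made up):

// One-off: add a unique index on deviceID so per-row updates can still seek
// when the primary key is on another column. Index name is hypothetical, and
// this assumes getConnection() hands back a closed connection.
SqlConnection conn = this.getConnection();
conn.Open();
new SqlCommand(
    "CREATE UNIQUE NONCLUSTERED INDEX IX_deviceList_deviceID ON deviceList (deviceID)",
    conn).ExecuteNonQuery();
conn.Close();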
 