AdrianDams
I have a solution where I am receiving a stream of data from a device and I
want to insert it into a SQL Server table. I am using ADO.NET 2.0 and the
SqlBulkCopy class.
My test data amounts to about 50 tags a second, and I have configured my
system so that I can change the parameters of my bulk write operation. I
appreciate that this is hardly a lot of data, but my target system will be a
lot bigger. In short, though, I am disappointed with the results and hope
someone can point out how to improve them. I had been hoping that
SqlBulkCopy would be able to handle sustained high throughput at low CPU
load, not only bursty bulk inserts.
The code I am using does the following (a sketch of the pattern follows the
list):
1. Creates an ADO.NET DataTable
2. Creates an instance of the SqlBulkCopy class
3. Sets the DestinationTableName property of the SqlBulkCopy instance to the
destination SQL Server table (which has appropriate indexes)
4. Inserts rows into the DataTable once a second
5. When the SQLServerInsertInterval timer fires, calls the WriteToServer
method of the SqlBulkCopy instance to flush the buffer (DataTable) to the
destination SQL Server table
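
For reference, here is a minimal sketch of that pattern. The connection
string, the table name (dbo.TagValues), and the column layout are
placeholders I have assumed for illustration; they are not from my actual
system.

using System;
using System.Data;
using System.Data.SqlClient;
using System.Timers;

class TagBuffer
{
    // Assumed connection string and schema; substitute your own.
    const string ConnStr =
        "Data Source=.;Initial Catalog=Test;Integrated Security=SSPI";
    static readonly object Sync = new object();
    static readonly Random Rnd = new Random();
    static DataTable buffer;

    static void Main()
    {
        buffer = MakeTable();                  // step 1: the in-memory buffer

        // Step 4: simulate ~50 tag updates arriving once a second.
        System.Timers.Timer acquire = new System.Timers.Timer(1000);
        acquire.Elapsed += delegate
        {
            lock (Sync)
            {
                for (int tag = 0; tag < 50; tag++)
                    buffer.Rows.Add(tag, Rnd.NextDouble(), DateTime.Now);
            }
        };

        // Step 5: flush on the insert interval (here, every 60 seconds).
        System.Timers.Timer flush = new System.Timers.Timer(60000);
        flush.Elapsed += delegate { Flush(); };

        acquire.Start();
        flush.Start();
        Console.ReadLine();
    }

    static DataTable MakeTable()
    {
        DataTable t = new DataTable();
        t.Columns.Add("TagId", typeof(int));
        t.Columns.Add("Value", typeof(double));
        t.Columns.Add("Stamp", typeof(DateTime));
        return t;
    }

    static void Flush()
    {
        DataTable toWrite;
        lock (Sync)
        {
            if (buffer.Rows.Count == 0) return;
            toWrite = buffer;                  // swap rather than clear, so
            buffer = MakeTable();              // acquisition is never blocked
        }

        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))  // steps 2 and 3
        {
            // With no ColumnMappings set, columns are matched by ordinal
            // position, so the DataTable layout must match the target table.
            bulk.DestinationTableName = "dbo.TagValues";
            conn.Open();
            bulk.WriteToServer(toWrite);       // flush the buffered rows
        }
    }
}

The buffer is swapped under a lock rather than cleared in place, so a slow
WriteToServer call never stalls the once-a-second acquisition timer.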
These are the results of my test. I'd really appreciate any help here.
The tests were performed on my laptop:
Pentium, 2 GHz, 1 GB RAM, Windows XP SP2
There were a total of 50 tags changing once a second on the input stream.
In summary, these are the results:
Acquisition cycle | Insert method | Insert interval (WriteToServer) | Total avg CPU load | SQL Server CPU load
------------------+---------------+---------------------------------+--------------------+--------------------
1 second          | Normal Insert | 1 second                        | ~70%               | 27%
1 second          | Bulk Copy     | 1 second                        | ~55%               | ~15%
1 second          | Bulk Copy     | Never (disabled)                | ~55%               | ~16%
1 second          | Bulk Copy     | 60 seconds                      | ~55%               | ~17%
In short, SqlBulkCopy seems to give an improvement over normal inserts in
both total and SQL Server CPU load, but tweaking the parameters of the bulk
copy operation doesn't seem to make much difference. In particular, the fact
that it chews up about the same amount of CPU even when nothing is being
written to SQL Server is very hard to interpret.