SQL Server 2005/.NET 2.0 - Application performance deteriorates!

Hi guys,

I’m getting a really weird problem when accessing a SQL Server 2005 database
from .NET 2.0 code. I was wondering if anyone could make any suggestions as
to what is going wrong?!

Here is the problem (names and real world equivalents have been altered!):

I have a database that stores client information. There is a top level
client table (that stores all the usual stuff like name, address, etc…) and a
collection of child tables storing things like orders placed etc… The child
tables all have one-to-many relationships with the client table, based on
ClientId. There are around six child tables. Pretty standard stuff!

The database lives on a disk array (6 drives, RAID 10 configuration) and is
split into a number of files (one per processor) on each logical drive.

I have some code (fired off in a new thread) that does the following, as
part of a Windows Service. The code is run on a powerful multi-processor
server with A LOT of RAM:

* Opens a database connection.

* Opens a SqlDataReader (using a suitably configured command).

* Loops through the data reader. The data reader gets all ClientIds from the
client table, effectively allowing me to loop through all client ids. (Only
the one field is returned.)

* For each row returned by the reader, I populate a dataset with all the
information for that client. (The data set is basically an image of the
database tables, except that it only ever holds information for one client at
a time. Obviously, this is held in memory!) I fill the data set using a
number of data adapters, one for each table. Each data adapter accepts the
client id so that the right subset of data is returned.

* Once I have the complete set of client data, I serialise the data set and
plop it into another database table (tblList) (in a var-binary field), along
with the Id in another field. (A rough sketch of the whole loop follows this
list.)
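To make the shape of the loop concrete, here is a minimal sketch of the
process just described. tblList and ClientId come from the description
above; everything else (the Client and Orders tables, the Payload column,
the helper methods) is a hypothetical stand-in, since the real names have
been changed, and binary serialisation is just one plausible reading of
"serialise":

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

static class ClientExporter
{
    public static void ProcessAllClients(string connectionString)
    {
        using (SqlConnection readerConnection = new SqlConnection(connectionString))
        {
            readerConnection.Open();
            using (SqlCommand idCommand = new SqlCommand(
                       "SELECT ClientId FROM Client", readerConnection))
            using (SqlDataReader reader = idCommand.ExecuteReader())
            {
                while (reader.Read())
                {
                    int clientId = reader.GetInt32(0);

                    // Build the in-memory image of this one client's rows.
                    DataSet clientData = new DataSet("Client");
                    FillTable(clientData, "Orders",
                        "SELECT * FROM Orders WHERE ClientId = @ClientId",
                        clientId, connectionString);
                    // ... one Fill per child table (around six in all) ...

                    // Serialise the DataSet and store it, with the id, in tblList.
                    BinaryFormatter formatter = new BinaryFormatter();
                    using (MemoryStream stream = new MemoryStream())
                    {
                        formatter.Serialize(stream, clientData);
                        SaveToList(clientId, stream.ToArray(), connectionString);
                    }
                }
            }
        }
    }

    static void FillTable(DataSet target, string tableName, string sql,
        int clientId, string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, connection))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@ClientId", clientId);
            adapter.Fill(target, tableName); // adapter opens/closes the connection
        }
    }

    static void SaveToList(int clientId, byte[] payload, string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
                   "INSERT INTO tblList (ClientId, Payload) VALUES (@ClientId, @Payload)",
                   connection))
        {
            command.Parameters.AddWithValue("@ClientId", clientId);
            command.Parameters.Add("@Payload", SqlDbType.VarBinary, -1).Value = payload;
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}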

Please don’t dwell on why I’m actually doing the above. As I state above,
I’ve changed the names of tables etc… to protect intellectual property. The
requirement really does make sense in the real version of the application!

The outer loop, driven by the data reader, uses the following construct:

using (SqlDataReader reader = GetReader())
{
    while (reader.Read())
    {
        //Data adapter code
    }
}

The “//Data adapter code” section represents the bit where a data set is
filled from data adapters. Note that the using statement should ensure that
the reader is closed after use. (When I call the reader, I use the
CommandBehavior.CloseConnection option to ensure that the connection is
closed too. Although maybe I need to somehow dispose the connection as well?)
The data adapters used should manage their own connections, right?
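A minimal sketch of that construct with both safeguards in place (the query
is inlined here in place of GetReader(); the table and column names are
hypothetical stand-ins):

using System.Data;
using System.Data.SqlClient;

static class ReaderExample
{
    static void ReadClientIds(string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlCommand command = new SqlCommand(
                       "SELECT ClientId FROM Client", connection))
            // CloseConnection closes the connection when the reader closes,
            // and the using blocks still guarantee everything is disposed.
            using (SqlDataReader reader = command.ExecuteReader(
                       CommandBehavior.CloseConnection))
            {
                while (reader.Read())
                {
                    // Data adapter code
                }
            }
        }
    }
}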

As you can imagine, there is a lot of data adapter activity, but the order of
magnitude of rows returned is well within the capability of the data adapter
(i.e. only a few hundred rows in a table at the most). Surely SQL Server and
.NET should be more than able to handle this kind of task? It’s not exactly
out of the ordinary… is it?

Now, this is what happens when the code runs. To begin with, all is fine. I
get good speed (around 20 clients per second). Then, as time passes, the
number of clients processed per second begins to drop off, until the thing
comes to a grinding halt. There are no obvious signs that anything is going
wrong: the processor is not being stressed, the number of database
connections is normal, there is no I/O queue, the amount of memory being
consumed by the Windows Service is similar to the amount when I started it
up, etc… The slowdown is not being caused by the data itself, since all of
the clients are approximately equal in size (in terms of numbers of records).
It’s also not to do with the growing size of tblList, since if I clear this
table down when the system is running slowly, the thing does not start to
speed up. I really have no idea what is causing the application to stop
working. Note that there are around 3 million clients in the database.

Has anyone seen anything like this before? If so, what were the causes? What
additional information would I need to supply in order to help you guys make
suggestions as to what is going wrong?

Thanks in advance,

Steve.
 
To begin with, you might try using a .NET performance profiler (such as
AQTime) and see whether there are problems in the code.
 

First I would isolate whether the problem is in your client or whether
the server is showing degradation. What is the impact on memory and
CPU on the server? Any locking happening? If you open up SQL
Management Studio and run some queries, is it OK?
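To check for locking while the service is in its slow phase, one option (a
sketch, assuming the SQL Server 2005 DMVs) is to look for blocked sessions,
either with the query below in Management Studio or from a few lines of code:

using System;
using System.Data.SqlClient;

static class BlockingCheck
{
    static void ReportBlocking(string connectionString)
    {
        // sys.dm_exec_requests is a SQL Server 2005 DMV; any row with a
        // non-zero blocking_session_id is a request waiting on a lock.
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
                   "SELECT session_id, blocking_session_id, wait_type, wait_time " +
                   "FROM sys.dm_exec_requests WHERE blocking_session_id <> 0",
                   connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("Session {0} blocked by {1} ({2}, {3} ms)",
                        reader[0], reader[1], reader[2], reader[3]);
                }
            }
        }
    }
}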

If the server looks OK then it is your app, and Miha's suggestion is a
good one. There are lots of profilers, most of them with free trial
periods. Last time I had a problem I used ANTS, and was able to locate
the slowdown in about 10 minutes after downloading it!
 
I would load the .NET Data Provider for SqlServer performance counters and
verify that NumberOfActiveConnections is not going out of limits. This can be
cross-checked using the NumberOfFreeConnections counter.
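A minimal sketch of reading those counters from code follows (watching them
in perfmon works just as well). The category and counter names are from the
.NET 2.0 SqlClient provider; note that these two counters are off by default
and, as far as I recall, have to be enabled with the
ConnectionPoolPerformanceCounterDetail switch in the service's configuration
file:

using System;
using System.Diagnostics;

static class PoolCounters
{
    static void DumpPoolCounters()
    {
        // Instance names are process-specific (roughly "exename[pid]"),
        // so we enumerate them rather than hard-coding one.
        PerformanceCounterCategory category =
            new PerformanceCounterCategory(".NET Data Provider for SqlServer");
        foreach (string instance in category.GetInstanceNames())
        {
            using (PerformanceCounter active = new PerformanceCounter(
                       ".NET Data Provider for SqlServer",
                       "NumberOfActiveConnections", instance))
            using (PerformanceCounter free = new PerformanceCounter(
                       ".NET Data Provider for SqlServer",
                       "NumberOfFreeConnections", instance))
            {
                Console.WriteLine("{0}: active={1}, free={2}",
                    instance, active.NextValue(), free.NextValue());
            }
        }
    }
}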
 
Hello Steve,

To be frank, I haven't met such an issue before. I'm also afraid it is very
difficult for us to figure out the root cause of the issue without the actual
scenario.

> When I call the reader, I use the CommandBehavior.CloseConnection option
> to ensure that the connection is closed too. Although maybe I need to
> somehow dispose the connection as well? The data adapters used should
> manage their own connections, right?

Yes, it is necessary to dispose the connection even though you use the
CommandBehavior.CloseConnection option. Because of GC, the connection may not
be disposed even though it has been closed. The safest way to achieve this is
to put your connection in a using block.

Data adapters really can manage their own connections. However, I would also
suggest you open and close the connection explicitly. This is because a data
adapter will leave the connection open if the connection was already open
before the data adapter tried to open it. For example, when we use a data
adapter to fill data from the underlying database into a DataTable, it will
open its connection, fill the DataTable, and then close the connection. This
is true when the connection was closed before the data adapter used it.
However, if you leave the connection open, the data adapter will not close it
after filling data from the database. Please open and close the connection
explicitly.
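A minimal sketch of this open/close behaviour (the query, table name and
variables are hypothetical):

using System.Data;
using System.Data.SqlClient;

static class AdapterExample
{
    static void FillOrders(DataSet dataSet, int clientId, string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
                   "SELECT * FROM Orders WHERE ClientId = @ClientId", connection))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@ClientId", clientId);

            // Because we open the connection ourselves, Fill will NOT close
            // it afterwards; the using block (or an explicit Close) must.
            connection.Open();
            adapter.Fill(dataSet, "Orders");
            connection.Close();
        }
    }
}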

> As you can imagine, there is a lot of data adapter activity, but the order
> of magnitude of rows returned is well within the capability of the data
> adapter (i.e. only a few hundred rows in a table at the most). Surely SQL
> Server and .NET should be more than able to handle this kind of task? It's
> not exactly out of the ordinary… is it?

I think SQL Server and the .NET Framework are smart enough for this kind of
task; it is not out of the ordinary.
Did you receive any exception when your application stops working?
I agree with Philip and Miha. If the server looks fine, you may consider
using a .NET performance profiler to check your application. It is
helpful in such a scenario.

Hope this helps.
Sincerely,
Wen Yuan
Microsoft Online Community Support
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
 
Hi Steve,

Just want to check how the issue is going.
If there is any thing I can help with, please feel free to update here.
I'm glad to assist.

Have a great weekend,
Sincerely,
Wen Yuan
Microsoft Online Community Support
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
 
Hi guys,

I finally managed to get to the bottom of this.

I tried ANTS, but due to the speed the thing was running at (i.e. VERY
fast), ANTS killed the performance to a point where it took too long to get
to the part of the process where the slowdown occurred.

I ended up inserting some very simple time logging code into the process and
identified the section of code that slowed down as the number of iterations
increased. I then made some architectural changes and reduced the number of
client-to-server round trips. This solved the problem.
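For reference, the timing instrumentation was of roughly this kind (a
minimal sketch; the section contents and the report interval are
hypothetical):

using System;
using System.Data.SqlClient;
using System.Diagnostics;

static class TimingExample
{
    static void TimedLoop(SqlDataReader reader)
    {
        Stopwatch fillTimer = new Stopwatch();
        Stopwatch saveTimer = new Stopwatch();
        int processed = 0;

        while (reader.Read())
        {
            fillTimer.Start();
            // ... data adapter fills for this client ...
            fillTimer.Stop();

            saveTimer.Start();
            // ... serialise and insert into tblList ...
            saveTimer.Stop();

            processed++;
            if (processed % 1000 == 0)
            {
                Console.WriteLine("{0} clients: fill={1} ms, save={2} ms",
                    processed, fillTimer.ElapsedMilliseconds,
                    saveTimer.ElapsedMilliseconds);
                fillTimer.Reset();
                saveTimer.Reset();
            }
        }
    }
}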

So, from our point of view, things are fine. However, my original solution
does still expose some kind of slowdown within the Microsoft DataAdapter
implementation, although I have no practical reason to look into this further
now.

Thanks for all your suggestions. They helped to get me on the right track.

All the best,

Steve.
 
You are welcome, Steve,
It's great news that you have resolved the issue. :)
Glad to work with you.

Have a great day,
Wen Yuan
Microsoft Online Community Support
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
 