Exception - Connection pooling

  • Thread starter: Stefan Hellberg

Stefan Hellberg

Hello everyone,

I have a big problem with my ASP.NET project that I hope
someone knows something about.

Windows 2000
SQL Server 2000
.NET Framework 1.1

I get the below exception.

------
System.InvalidOperationException:

Timeout expired. The timeout period elapsed prior to
obtaining a connection from the pool. This may have
occurred because all pooled connections were in use and
max pool size was reached.
------

I'm using the Microsoft Data Access Application Block to
create DataReaders in my business class methods. I'm not
providing a connection object to the Data Application
Block method; instead I provide the connection string, so
a connection is created each time a DataReader is created
and closed when I call the DataReader's Close method.
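Roughly, the pattern looks like this (the class, stored procedure and
connection string are simplified placeholders, not the real project code,
and I'm assuming the SqlHelper class from the application block):

// Simplified sketch of one business method using the Data Access
// Application Block. Names are placeholders.
using System.Data;
using System.Data.SqlClient;
using Microsoft.ApplicationBlocks.Data;

public class ArticleData
{
    private const string ConnString =
        "server=(local);database=MySite;Integrated Security=SSPI;";

    public SqlDataReader GetArticles(int pageId)
    {
        // Passing the connection string (not an open connection) makes the
        // block open a pooled connection for every call; closing the
        // returned reader is what gives that connection back to the pool.
        return SqlHelper.ExecuteReader(
            ConnString,
            CommandType.StoredProcedure,
            "usp_GetArticlesByPage",               // hypothetical procedure
            new SqlParameter("@PageId", pageId));
    }
}

// The caller closes the reader when it is done:
//   SqlDataReader rdr = new ArticleData().GetArticles(42);
//   try { while (rdr.Read()) { /* ... */ } }
//   finally { rdr.Close(); }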

Some pages make a lot of calls to different business
object methods; let's say a normal number of calls is
10-20 per page load. On some pages I also have recursive
methods, for example a menu creation method that calls
itself to get child links.

If a few recursive methods are used on a page, then a lot
more connections to the database are made by a single
user, probably a total of 30-80.

--

Is it bad design that each method creates a new
connection and closes it?
(I thought that with connection pooling it doesn't really
matter even if this is done many times by a single user.)

Should I instead redesign so that I create a connection
object before any business classes are used, pass that
connection with each call to the Data Application Block,
and explicitly close that connection at the end of my
ASP.NET page when no more business objects are used?

Does anyone have any help for me on this?

If my approach is wrong, then where is the best place to
create and close a connection?

Lighten up my world please :)

All help will be very appreciated.

Regards
Stefan Hellberg
 
Stefan,

DataReaders are forward-only cursors that tie up a connection until you
properly close them. Looking at the error you get and the description you
give, you are opening loads of connections to the database and you are NOT
closing them quickly. 30-80 active connections to the database from a single
page call will almost certainly get you into trouble.

Here are a couple of pages for you to look at:
Q310369 - No method in the DataReader closes its underlying connection:
http://support.microsoft.com/default.aspx?scid=kb;en-us;310369
Especially note the "System.Data.CommandBehavior.CloseConnection" flag to
the ExecuteReader method. I suspect you are not using this one.
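For reference, this is what the flag looks like in plain ADO.NET (a minimal
sketch, the query is made up):

using System;
using System.Data;
using System.Data.SqlClient;

public class PageDumper
{
    public static void DumpPages(string connString)
    {
        SqlConnection conn = new SqlConnection(connString);
        conn.Open();
        SqlCommand cmd = new SqlCommand("SELECT PageId, Title FROM Pages", conn);

        // CloseConnection ties the connection's lifetime to the reader:
        // closing the reader also closes (returns to the pool) the
        // connection. The reader itself still has to be closed, though.
        SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.CloseConnection);
        try
        {
            while (rdr.Read())
            {
                Console.WriteLine(rdr["Title"]);
            }
        }
        finally
        {
            rdr.Close();   // closes the connection too, because of the flag
        }
    }
}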

Best practices for ADO.NET:
http://msdn.microsoft.com/vbasic/us...l=/library/en-us/dnadonet/html/adonetbest.asp
Pay special attention to the section about using the DataReader and how it
deals with connections.

Last but not least, have a look at the caching methods in .NET. It looks to
me that, for instance, those menu calls are perfect candidates to be cached
in your application. This will save you a lot of roundtrips to the database.

Your suggestion to open a connection once and then open all DataReaders on
it won't work, since only one DataReader can be open on a single connection.
This appears different from how it worked in ADO, but actually it isn't,
since in ADO a new connection was implicitly created (and closed) for you to
accomplish this. ADO.NET won't do this anymore.
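To make that concrete, a contrived sketch of what fails on a single
connection in ADO.NET 1.1 (table names are made up):

using System.Data.SqlClient;

public class TwoReaders
{
    public static void Demo(string connString)
    {
        SqlConnection conn = new SqlConnection(connString);
        conn.Open();

        SqlCommand cmdPages    = new SqlCommand("SELECT * FROM Pages", conn);
        SqlCommand cmdArticles = new SqlCommand("SELECT * FROM Articles", conn);

        SqlDataReader pages = cmdPages.ExecuteReader();

        // Throws InvalidOperationException on .NET 1.1: the connection
        // already has an open DataReader, which must be closed first.
        SqlDataReader articles = cmdArticles.ExecuteReader();
    }
}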

Hope this helps,
Edwin Kusters
Hot ITem Informatica
 
Thanks for your reply, Edwin.
(links were good reading)

The connection is closed for each DataReader directly
after I'm done with it, using the Close method of the
DataReader. The Microsoft Data Access Application Block
that creates the DataReaders in my case adds
CommandBehavior.CloseConnection.

My idea on how to minimize connections is this.

My PageInfo class has methods that return the page
title, the meta data, the number of articles, whether
it is published, and so on.

As I see it, I can add a Load() method to all my classes
that reads all the information and populates these values
into different properties. That way I only need to make
one database call (perhaps calling it from the
constructor, New()).

Doing this, I think I can prevent at least 5-10
connections from being made on each page load.
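Something like this is what I have in mind (the procedure and column
names are just placeholders, not the real schema):

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using Microsoft.ApplicationBlocks.Data;

public class PageInfo
{
    private string title;
    private string metaData;
    private int    articleCount;
    private bool   isPublished;

    public string Title        { get { return title; } }
    public string MetaData     { get { return metaData; } }
    public int    ArticleCount { get { return articleCount; } }
    public bool   IsPublished  { get { return isPublished; } }

    public PageInfo(int pageId)
    {
        Load(pageId);   // one database call per PageInfo instance
    }

    private void Load(int pageId)
    {
        // Hypothetical procedure returning everything this class
        // exposes, in a single row.
        SqlDataReader rdr = SqlHelper.ExecuteReader(
            ConfigurationSettings.AppSettings["ConnString"],
            CommandType.StoredProcedure,
            "usp_GetPageInfo",
            new SqlParameter("@PageId", pageId));
        try
        {
            if (rdr.Read())
            {
                title        = (string) rdr["Title"];
                metaData     = (string) rdr["MetaData"];
                articleCount = (int)    rdr["ArticleCount"];
                isPublished  = (bool)   rdr["IsPublished"];
            }
        }
        finally
        {
            rdr.Close();   // also returns the pooled connection
        }
    }
}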

If the connection is closed every time after I'm done
with my DataReader, do I really need to do this?

I have no idea how to generate my menus, categories and
so on without using a recursive function.

Any idea on a good approach?

By the way, thanks for the tip on caching. I will look
more into this.

Regards
Stefan Hellberg
-----Original Message-----
<snip>
 
Stefan,
If the connection is closed every time after I'm done
with my DataReader, do I really need to do this?

Yes! Creating a new connection is _really_ expensive; that's why we use
connection pooling. However, a roundtrip to the database is still not free,
and it will be orders of magnitude slower than reading data from memory on
the same machine.
I have no idea how to generate my menus, categories and
so on without using a recursive function.

Any idea on a good approach?

I can't judge how static your data is, but it sounds to me that a lot of it
is metadata that hardly ever changes. Have a look at HttpApplicationState.
When your app fires up you can use the Application_Start event handler to
retrieve all static metadata from your database and store it as objects
(DataSets perhaps?) in the application state. I'm assuming here that we
aren't talking about millions of records :) You can then retrieve it easily
in your pages, almost the same way you are doing now, but without the extra
overhead of going to the database.
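A minimal sketch of that idea (the query, key and appSettings names are made
up, and I'm again assuming the application block's SqlHelper):

// Global.asax code-behind: load rarely-changing metadata once at startup.
using System;
using System.Configuration;
using System.Data;
using System.Web;
using Microsoft.ApplicationBlocks.Data;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // One roundtrip when the application starts.
        DataSet siteMeta = SqlHelper.ExecuteDataset(
            ConfigurationSettings.AppSettings["ConnString"],
            CommandType.Text,
            "SELECT PageId, ParentId, Title, MetaData FROM Pages");

        Application["SiteMeta"] = siteMeta;
    }
}

// In a page: DataSet siteMeta = (DataSet) Application["SiteMeta"];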

As for the menus, the recursive part sounds OK to me. The issue is to avoid
a database roundtrip for each step, so you can keep the recursive logic but
apply it to data that is kept on the web server. If the menus are actually
static as well (the same for all users, or only appearing in 2 or 3
different versions) you could even opt to store the whole rendered menu in
the application state instead of the underlying data, which will save you
some CPU time (at the cost of more memory usage).
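For example, you could keep your recursion but walk the cached DataTable
instead of the database (a sketch; the column names and the convention that
root items have ParentId = 0 are assumptions, not your schema):

using System.Data;
using System.Text;
using System.Web;

public class MenuBuilder
{
    // Builds the whole menu from the DataTable kept in application state.
    // Same recursive idea as before, but it never touches the database.
    public static string BuildMenu(HttpApplicationState app)
    {
        DataTable pages = ((DataSet) app["SiteMeta"]).Tables[0];
        StringBuilder html = new StringBuilder();
        AppendChildren(pages, 0, html, 0);
        return html.ToString();
    }

    private static void AppendChildren(DataTable pages, int parentId,
                                       StringBuilder html, int depth)
    {
        foreach (DataRow row in pages.Select("ParentId = " + parentId))
        {
            html.Append(new string(' ', depth * 2));
            html.Append("<a href=\"page.aspx?id=").Append(row["PageId"]);
            html.Append("\">").Append(row["Title"]).Append("</a><br>");

            AppendChildren(pages, (int) row["PageId"], html, depth + 1);
        }
    }
}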

Using this approach you have to stop/start your application if such
metadata does change, or add some mechanism to detect that the metadata has
changed and repopulate your cache. Alternatively, you can look at storing
your data in ASP.NET's own Cache, where you can set a timeout value for it
and simply refresh it when it times out.
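Something along these lines, assuming a 10-minute timeout is acceptable
(names are again placeholders):

using System;
using System.Configuration;
using System.Data;
using System.Web;
using System.Web.Caching;
using Microsoft.ApplicationBlocks.Data;

public class CategoryData
{
    // Returns the category list from the ASP.NET cache, reloading it
    // from the database at most once every 10 minutes.
    public static DataSet GetCategories()
    {
        Cache cache = HttpRuntime.Cache;
        DataSet categories = (DataSet) cache["Categories"];

        if (categories == null)
        {
            categories = SqlHelper.ExecuteDataset(
                ConfigurationSettings.AppSettings["ConnString"],
                CommandType.Text,
                "SELECT CategoryId, ParentId, Name FROM Categories");

            cache.Insert("Categories", categories, null,
                         DateTime.Now.AddMinutes(10),     // absolute timeout
                         Cache.NoSlidingExpiration);
        }
        return categories;
    }
}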

Even for categories (which might change more often?) you can opt to send
only one query to the database, retrieving all of them in one resultset (a
stored procedure could be helpful here). If you are going to process almost
all the records in it, this will help you. If there is only one page where
you can add a category, you could even cache the whole list of categories
just like the metadata and refresh it from that one page where you modify
it.

The bottom line is that you should look carefully at the data you are
getting from the database and determine which part of that data should and
which part should not be cached on the web server. Also look carefully at
whether it isn't faster to get more data in one roundtrip than to have
multiple trips that return fewer rows. You can easily test this for each
scenario. Don't overdo the caching, though. It's one of those "use, don't
abuse" features that can give you a load of benefit when used correctly but
can haunt you forever when misused.

Good luck,
Edwin Kusters
Hot ITem Informatica

Stefan Hellberg said:
Thanks for your reply, Edwin.
(links were good reading)
<snip>
 
I might add that CommandBehavior.CloseConnection will NOT close the
Connection automatically unless the DataReader is closed. When binding a
DataReader to a complex bound control (like the DataGrid), the DataReader is
closed automatically. We've seen this exception before when DataReaders are
not closed, but I expect that if you're opening 20 connections per page you
might not support more than 5 users, as the default max pool size is 100
connections. I think this is going to be problematic in the long run--I
agree with Edwin--try to get away with fewer connections.

hth

--
____________________________________
Bill Vaughn
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 