Miha and Nakul have mentioned a couple of reasons (which are typically the problem), but consider that the number of connections in the pool is a function of the behavior of the applications that share the pool. In an ASP environment, the application instances can (usually) share a common pool. When the application opens a connection, the pooler looks for a pool with a matching connection string. If that pool contains a "dormant" connection, it's reused; if there is no dormant connection, a new one is created. When you close a connection, it's simply marked as dormant and the real connection to the database remains open. After 4-8 minutes, dormant connections that haven't been reused are closed and discarded.
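If it helps, here's a minimal C# sketch (assuming SQL Server and the SqlClient provider; the server name and connection string are placeholders) that shows the reuse in action: close a connection, open another with the same string, and SELECT @@SPID returns the same server process ID because the pooler handed back the same physical connection.

using System;
using System.Data.SqlClient;

class PoolReuseDemo
{
    // Hypothetical connection string -- substitute your own server and credentials.
    const string ConnString =
        "Data Source=MyServer;Initial Catalog=pubs;Integrated Security=SSPI";

    static void Main()
    {
        Console.WriteLine(GetSpid());   // first Open creates a real connection
        Console.WriteLine(GetSpid());   // second Open reuses the dormant one -- same SPID
    }

    static short GetSpid()
    {
        using (SqlConnection cn = new SqlConnection(ConnString))
        using (SqlCommand cmd = new SqlCommand("SELECT @@SPID", cn))
        {
            cn.Open();
            return (short)cmd.ExecuteScalar();   // Close/Dispose returns cn to the pool
        }
    }
}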
Problems occur when:
a.. The connections are not closed when the code is done with them--the Connection object simply falls out of scope without Close or Dispose being called, so the underlying connection isn't returned to the pool. This is a typical mistake and many folks have had trouble getting connections closed. It's oftentimes the fault of incorrectly using a DataReader, which requires manual connection handling (see the sketch after this list).
b.. The connections are held open too long. This can occur (especially with a full-sized production database) when the code is asked to perform too much processing or fetch too many rows. For example, if ASP instance "A" is still working on a fetch or other operation when instance "B" is launched, "B" can't reuse "A"'s connection.
c.. Two or more operations collide and block the application(s) from completing the assigned task in time. When updating or adding rows, you aren't allowed to change locked rows--rows held by other tasks. This sort of "deadly embrace" or locking contention is exacerbated by fetching large rowsets or doing too much processing while holding the connection open.
d.. Applications open connections whether they need them or not instead of opening connections "just in time". This means that connections are consumed for longer periods of time than necessary, which increases the load on the pool.
e.. An application fails or the server returns a fatal error (severity 20 or higher) that "breaks" the connection. This leaves a "dented" connection in the pool that can't be reused, and it's cleaned up only when some application tries to reuse it.
f.. The connection string changes. This is another (albeit related) problem that occurs when your application changes the connection string for some reason--as when changing the user credentials or the application name. This does not overflow the pool--it causes a separate pool to be created for each distinct connection string, potentially one per Open if the string keeps changing. This wastes time, as it adds unnecessary overhead to the Open Connection process. If you have to change the connection string on every open, turn off pooling.
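Regarding point (a), here's a minimal sketch, again assuming SqlClient and the pubs sample database: the DataReader holds the connection until the reader itself is closed, so either wrap everything in using blocks or pass CommandBehavior.CloseConnection so that closing the reader closes the connection for you.

using System;
using System.Data;
using System.Data.SqlClient;

class CloseYourConnections
{
    // Hypothetical connection string -- substitute your own.
    const string ConnString =
        "Data Source=MyServer;Initial Catalog=pubs;Integrated Security=SSPI";

    static void DumpAuthors()
    {
        SqlConnection cn = new SqlConnection(ConnString);
        SqlCommand cmd = new SqlCommand("SELECT au_lname FROM authors", cn);
        cn.Open();
        // CommandBehavior.CloseConnection ties the connection's fate to the reader:
        // when the reader is closed (here, by the using block), the connection is
        // closed too and goes back to the pool as a dormant connection.
        using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.CloseConnection))
        {
            while (rdr.Read())
                Console.WriteLine(rdr.GetString(0));
        }   // rdr.Close() -> cn.Close() -> connection returned to the pool
    }
}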
Use PerfMon to monitor the pool (if you're using SQL Server). In a heavily used ASP application, the number of pooled connections should rise over time, level off, and remain stable at 10 or so. If the number keeps rising, you'll hit the wall (the default maximum of 100 pooled connections) before long. In some cases you can simply increase the maximum pool size to match the demand being placed on the server. However, we've seen heavily used Web sites use fewer than a dozen connections, so increasing the size won't help if you're not releasing connections. Focus on that problem first: make sure that any connection that's opened is closed.
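You can also watch the pool from code rather than the PerfMon UI. A rough sketch follows; it assumes the SqlClient pooling counters live in the ".NET CLR Data" performance category under names like "SqlClient: Current # pooled connections"--verify the exact category, counter, and instance names on your machine in PerfMon before relying on them.

using System;
using System.Diagnostics;

class PoolWatcher
{
    static void Main()
    {
        // Enumerate the per-process instances of the SqlClient counters and
        // report how many pooled connections each one is currently holding.
        PerformanceCounterCategory cat = new PerformanceCounterCategory(".NET CLR Data");
        foreach (string instance in cat.GetInstanceNames())
        {
            using (PerformanceCounter counter = new PerformanceCounter(
                ".NET CLR Data", "SqlClient: Current # pooled connections", instance, true))
            {
                Console.WriteLine("{0}: {1}", instance, counter.NextValue());
            }
        }
    }
}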
If you have application components that take (dramatically) more time than others, use separate pools. This is easy to do: simply change the Application Name in the connection string. That way the applications don't fight over the pool (as much).
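A minimal sketch of that trick: two connection strings that differ only in Application Name are treated as two different pools by the pooler, so a long-running reporting component can't starve the quick transactional pages of connections. The server and Application Name values here are hypothetical placeholders.

using System.Data.SqlClient;

class SeparatePools
{
    const string OltpConnString =
        "Data Source=MyServer;Initial Catalog=pubs;Integrated Security=SSPI;" +
        "Application Name=MyApp.OLTP";

    const string ReportConnString =
        "Data Source=MyServer;Initial Catalog=pubs;Integrated Security=SSPI;" +
        "Application Name=MyApp.Reports";

    static SqlConnection GetOltpConnection()
    {
        return new SqlConnection(OltpConnString);     // draws from the OLTP pool
    }

    static SqlConnection GetReportConnection()
    {
        return new SqlConnection(ReportConnString);   // draws from the Reports pool
    }
}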
I discuss this in more depth in my books.
hth
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________