Hi all,
I am having a very strange performance problem with a web application after
deploying it to the production server.
The following is the scenario:
1. My development environment: very quick responses
2. Testing environment (very poor hardware): acceptable response times
3. Production environment (plenty of resources: 4 CPUs + 32 GB RAM): response
times are simply not acceptable at all. Note: this is the setup phase, so there
is no user load on the machine.
While investigating, I checked that the code is identical in the testing and
production environments (precompiled locally and then copied to the target
sites).
Also, the IIS/ASP.NET configuration (machine.config) is identical in the Testing
and Production environments, and both are set to the setup defaults.
I decided not to modify anything before having the full picture of the issue,
and I will tune the min and max thread parameters only once this initial
performance difference has been explained.
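For clarity, by thread parameters I mean the processModel attributes in
machine.config. A rough sketch of what I would eventually tune (the values
below are placeholders only; both servers are still running on the
out-of-the-box settings):

<!-- machine.config - sketch only, nothing like this has been applied yet -->
<system.web>
  <processModel
      autoConfig="false"
      minWorkerThreads="1"
      maxWorkerThreads="100"
      minIoThreads="1"
      maxIoThreads="100" />
</system.web>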
Using both Fiddler and the IIS logs, I found a huge difference in the
"time-taken" field (IIS logs) and in the elapsed time between the
"ServerGotRequest" and "ServerBeginResponse" timestamps (Fiddler).
The Testing environment turns out to be 2 to 3 times faster than the
Production environment, despite the resources of the two platforms:
from 3-5 seconds per page in the Testing environment, the response time
falls to at least 12-15 seconds in the Production environment.
The application architecture is quite simple:
Security
1. Windows authentication (with impersonation)
2. Roles managed using the "AspNetWindowsTokenRoleProvider"
Data
1. SQL Server 2005 SP3 Std
2. Hardcoded databinding
3. Very limited use of Dynamic Data (just for the administrative configuration
pages). A simplified configuration sketch follows below.
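To make the setup concrete, the security and data-access parts of web.config
look roughly like this (a simplified sketch: the connection name, server name
and database name are illustrative, not the exact production entries):

<connectionStrings>
  <add name="AppDb"
       connectionString="Data Source=PRODSQL;Initial Catalog=AppDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
<system.web>
  <authentication mode="Windows" />
  <identity impersonate="true" />
  <authorization>
    <deny users="?" />
  </authorization>
  <roleManager enabled="true" defaultProvider="AspNetWindowsTokenRoleProvider" />
</system.web>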
Since the application is the same and the IIS configuration is the same, can
anybody suggest what is likely to be affecting performance so badly?
Thank you very much in advance
Alberto