Juan Carlos Rico Gil
I am developing a 'Web Service with Data Server Access' benchmarking
application. The Web Service and the Data Server are located remotely on my
network. The benchmarking (client) application runs locally on XP
Professional. It was developed with Visual Studio, Enterprise Edition. It
creates and runs 8 threads to simulate user concurrency and apply load to the
Web Service for about one minute.
The problem is: when I run the client under Visual Studio, the process is 4
times quicker than when I run it outside Visual Studio, for example from my
desktop. And if I run the client from a Server 2003 machine, I again get the
Visual Studio result: 4 times quicker.
The .exe file is exactly the same in all cases. In the two local tests, under
VS and outside VS, it is even the very same file, in the same directory.
The machines' load is near zero outside the test periods, even when running
from Visual Studio. So the only load is my application, the Web Service and
the Data Server.
The CPU load is very low in the problematic case: around 10%. Under VS, the
CPU load is around 60% (limited by the remote Web Service and Data
processes). Under Server 2003 the load is 100%, as all processes (client,
Web Service and Data) run on the same machine.
So something limits my application under XP Professional, outside Visual
Studio. Does anybody know where this comes from?
The application can also access the data directly, providing a comparison
with the Web Service. In that mode the load reached is identical in all 3
cases (local under VS, local outside VS, Server 2003), and of course the run
is much quicker than with the Web Service (3.3 times quicker), so no problem
there.
The only apparent difference between the two situations (Web Service /
Direct Access) seems to be related to the threads... But why?
Is there something I can do to correct this?
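One thing I have been wondering about, though I have not verified it, so this is only an assumption: I understand that by default .NET limits a client application to two concurrent HTTP connections per host, which could throttle my 8 threads when they call the remote Web Service while leaving direct data access untouched. If that were the cause, a connectionManagement section in the client's app.config should raise the limit, roughly like this:

```xml
<!-- Hypothetical app.config fragment for the benchmarking client.
     Raises the default per-host HTTP connection limit (2 for client
     applications) so all 8 threads can call the Web Service in parallel. -->
<configuration>
  <system.net>
    <connectionManagement>
      <!-- address="*" applies the limit to every remote host -->
      <add address="*" maxconnection="8" />
    </connectionManagement>
  </system.net>
</configuration>
```

The same limit can supposedly also be set in code via ServicePointManager.DefaultConnectionLimit, but I have not tried either approach yet.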