Guest
We have a client-server application written in C# that makes a series of calls to the database through a third-party .NET provider. The provider resides on the client workstation and the database resides on a different machine. The .NET code loops through the DataReader to create application-level objects, which are then accessed by the UI (a simplified sketch of the loop is at the end of this post).

As the network delays increase, the data access degrades, which is to be expected. But what I also see is that the application's object-creation and execution times increase as well; I infer this from the logs maintained by the application. It is as if every part of the client application has adapted to the "slowness" of the network, and I am looking for clues as to why these other regions have gotten slow. The application does not use threads and is fully synchronous.

Any pointers as to the cause of the issue would be greatly appreciated.
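For reference, here is a minimal sketch of the read loop. The Order type, column indices, and class names are made up for illustration and are not our actual code:

using System.Collections.Generic;
using System.Data;

// Illustrative only: a simple application-level object built from each row.
public class Order
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
}

public static class OrderLoader
{
    public static List<Order> Load(IDataReader reader)
    {
        var orders = new List<Order>();

        // reader.Read() can block while the provider fetches the next rows
        // over the network; the Get* calls then read from the current row.
        while (reader.Read())
        {
            var order = new Order
            {
                Id     = reader.GetInt32(0),
                Amount = reader.GetDecimal(1)
            };
            orders.Add(order);   // objects are handed to the UI after the loop
        }

        return orders;
    }
}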