High CPU usage in Sockets

Anatoly

In our application I need to determine whether a valid internet connection is
available, so I built a Windows service that, every minute, opens a TCP
connection (a telnet-style probe) to a known host and port.
If the connection succeeds, I know the internet connection is OK.
The problem: after a few days, CPU usage for this service climbs to around 50%.
After a restart the problem goes away for another two or three days.

This is the code:
public static bool IsBrowseable(string IP, int Port)
{
    bool Result = false;
    try
    {
        IPEndPoint localEP = new IPEndPoint(IPAddress.Parse(IP), Port);
        Socket m_Socket = new Socket(AddressFamily.InterNetwork,
                                     SocketType.Stream, ProtocolType.Tcp);
        m_Socket.Connect(localEP);
        Result = m_Socket.Connected;
        m_Socket.Close();
    }
    catch (Exception E)
    {
        Utils.ProcessError("something wrong", E);
        Result = false;
    }
    return Result;
}



Please help me find the problem.

Thanks
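
One detail worth checking in the code above: when Connect() throws (the usual
case when the host is unreachable), the Socket is never closed, so each failed
probe leaves an unmanaged handle alive until the garbage collector finalizes
it. A minimal sketch of the same check with guaranteed cleanup follows; the
using block closes the socket even when Connect() throws, and error logging is
omitted for brevity:

using System;
using System.Net;
using System.Net.Sockets;

public static bool IsBrowseable(string IP, int Port)
{
    try
    {
        IPEndPoint localEP = new IPEndPoint(IPAddress.Parse(IP), Port);
        // using guarantees the socket is released even if Connect() throws
        using (Socket socket = new Socket(AddressFamily.InterNetwork,
                                          SocketType.Stream, ProtocolType.Tcp))
        {
            socket.Connect(localEP);
            return socket.Connected;
        }
    }
    catch (Exception)
    {
        // unreachable host, refused connection, timeout, bad IP string, etc.
        return false;
    }
}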
 
Sorry, I don't have an answer, but could you please post a follow-up to
this thread if you find one? I am experiencing a similar problem with a
simple sockets application that I wrote. After a few days the CPU usage goes
up to around 50% even though the service isn't really doing anything...
 
I am seeing this behavior in a simple Ping application that we wrote... it
would be wonderful to find a solution.

Brandon
 

What operating system are you using? I'm running my sockets app on
Win2000. I'm wondering if upgrading to Win2003 would help.
 
Windows 2003.... sorry. :(

Brandon


Bummer. Have you ever stepped into the code during one of the CPU "fits"
to see what's going on? It's kind of hard for me to do this since the
problem is taking place on a production server. I guess I'm going to have to
start running this app on a development server and step into the code
somehow.
 
I looked through my sockets code again today. I use the async socket
methods BeginReceive() and EndReceive(). When data is received, EndReceive()
is called, and then BeginReceive() is immediately called in order to
immediately start receiving more data.

One thing I forgot to do is to handle the situation in which
EndReceive() returns zero. This can happen when the client closes his
connection. In my client/server scenario, when a client closes a connection,
EndReceive() on the server typically throws an exception. That's what my
server is designed to handle. It never occurred to me that this method could
return zero until reading the documentation this morning.

I'm wondering if every now and then EndReceive() was returning zero, which
then led to a BeginReceive() call, which then led to another EndReceive()
returning zero, etc., and perhaps this tight loop was causing the CPU to get
hammered. I'm still not sure; this is all speculation at this point.

I changed my code so that when EndReceive() returns zero, the socket is
closed out (as opposed to calling BeginReceive()). I'll see if this fixes
the problem.

David
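
For reference, here is a minimal sketch of the receive loop David describes,
with the zero-byte case handled by closing the socket instead of re-arming
BeginReceive(). The class, field, and buffer names are illustrative, not taken
from his actual code:

using System;
using System.Net.Sockets;

class Connection
{
    private readonly Socket _socket;                  // an already-connected socket
    private readonly byte[] _buffer = new byte[4096];

    public Connection(Socket socket)
    {
        _socket = socket;
    }

    public void StartReceiving()
    {
        _socket.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None,
                             new AsyncCallback(OnReceive), null);
    }

    private void OnReceive(IAsyncResult ar)
    {
        try
        {
            int bytesRead = _socket.EndReceive(ar);
            if (bytesRead == 0)
            {
                // Graceful close by the peer: EndReceive() returns zero.
                // Calling BeginReceive() again would complete immediately with
                // zero bytes again, producing the tight loop suspected above.
                _socket.Close();
                return;
            }

            // ... process _buffer[0 .. bytesRead) here ...

            // Re-arm the receive only after real data arrived.
            _socket.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None,
                                 new AsyncCallback(OnReceive), null);
        }
        catch (SocketException)
        {
            // An abortive close (connection reset) surfaces here instead.
            _socket.Close();
        }
        catch (ObjectDisposedException)
        {
            // The socket was already closed elsewhere; nothing to do.
        }
    }
}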
 