Keith Langer
Maybe someone has come across this situation and has a way to handle
it.
If I have a TcpClient object and do a GetStream.Read on it after
setting a ReceiveTimeout, I'm getting different behavior in .NET 2.0
than in 1.0 when the read times out. In both frameworks I get a
System.IO.IOException, but in 2.0 the socket also disconnects when
this happens (in 1.0 it did not). The 2.0 framework has made some
improvements to the TcpClient class, such as exposing the underlying
socket, so I'd prefer to keep using TcpClient. Any idea how I can
prevent this disconnect? Right now I'm checking DataAvailable and
using Thread.Sleep as a workaround (sketched below), but I'd prefer
to do a blocking read call.
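
For reference, here's roughly what the workaround looks like in C#.
This is just a sketch, not my actual code: the method name, the 50 ms
poll interval, and the buffer handling are placeholders.

    using System;
    using System.Net.Sockets;
    using System.Threading;

    class TimeoutReadSketch
    {
        // Polling workaround: check DataAvailable and sleep instead of
        // blocking in Read, so a timeout never surfaces as the
        // IOException that (in 2.0) also drops the connection.
        // Returns 0 to signal a timeout.
        static int ReadWithPolling(TcpClient client, byte[] buffer, int timeoutMs)
        {
            NetworkStream stream = client.GetStream();
            int waited = 0;

            while (!stream.DataAvailable)
            {
                if (waited >= timeoutMs)
                    return 0;            // timed out without any data
                Thread.Sleep(50);        // arbitrary 50 ms poll interval
                waited += 50;
            }

            // Data is already waiting, so this Read returns immediately
            // rather than blocking until ReceiveTimeout expires.
            return stream.Read(buffer, 0, buffer.Length);
        }
    }

I'd rather drop the polling loop and just block in Read with
ReceiveTimeout set, if the disconnect can be avoided.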
Thanks,
Keith