Losing data sent through network socket

Guest

I have a server and client that I've written in .NET using the System.Net.Sockets objects, and I am having a bit of a problem. First let me describe what the programs do. The server takes an arbitrary message, encrypts it using a CryptoStream into a MemoryStream, gets an array of bytes from the MemoryStream, and then transmits the length of the encrypted data, followed by the encrypted data, down the socket. The client does the exact opposite: it reads a 4-byte integer to get the length of the encrypted data, then reads that much data off the socket.
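The sending side of the framing described above can be sketched as follows. This is a minimal illustration, not the poster's actual code; the `SendMessage` name is hypothetical, and it assumes both ends share the machine's byte order (BitConverter uses the native endianness):

```csharp
using System;
using System.IO;

static class Sender
{
    // Hypothetical helper: write a 4-byte length prefix, then the payload.
    public static void SendMessage(Stream stream, byte[] encrypted)
    {
        // BitConverter.GetBytes(int) produces 4 bytes in native byte order.
        byte[] lengthPrefix = BitConverter.GetBytes(encrypted.Length);
        stream.Write(lengthPrefix, 0, lengthPrefix.Length);
        stream.Write(encrypted, 0, encrypted.Length);
    }
}
```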

On the server, the messages that are being pumped out are put on a queue so that the code using the server will not block while waiting for the message to be sent. So I try filling up the queue with a few hundred messages, and the server starts firing them off as fast as it can. The client keeps up with this for a short duration but inevitably crashes at some point because of an IOException or serialization error.

The only way I have been able to keep a lid on things so far is to have the server wait for the client to send an acknowledgement byte back once it has successfully read the message from the socket. If I do this I get every message flawlessly, but it seems like there is a better way to handle this; I just have not been able to find it.

Any help would be GREATLY appreciated
 
Scott Crumpler said:
I have a server and client that I've written in .NET using the
System.Net.Sockets objects, and I am having a bit of a problem. First
let me describe what the programs do. The server takes an arbitrary
message, encrypts it using a CryptoStream into a MemoryStream, gets
an array of bytes from the MemoryStream, and then transmits the
length of the encrypted data, followed by the encrypted data down the
socket. The client does the exact opposite, reading a 4-byte integer
to get the length of the encrypted data, then reading that much data
off the socket.

Are you attempting to read it in one go, by any chance? You may not
read it all in one call...
On the server, the messages that are being pumped out are put on a
queue so that the code using the server will not block while waiting
for the message to be sent. So I try filling up the queue with a few
hundred messages, and the server starts firing them off as fast as it
can. The client keeps up with this for a short duration but
inevitably crashes at some point because of an IOException or
serialization error.

It sounds like you could be sending the data too fast - are you by any
chance sending the data with BeginWrite several times on the same
socket? If so, you'll end up with potentially overlapping writes.
The only way I have been able to keep a lid on things so far is to
have the server wait for the client to send an acknowledgement byte
back to the server once it has successfully read the message from the
socket. If I do this I get every message flawlessly, but it seems
like there is a better way to handle this; I just have not been able
to find it.

Any help would be GREATLY appreciated!

Could you post a short but complete program which demonstrates the
problem?

See http://www.pobox.com/~skeet/csharp/complete.html for details of
what I mean by that.
 
Below is some code that pretty much sums up the problem. Any ideas?

// include all relevant using directives
using System;
using System.IO;
using System.Net.Sockets;
using System.Runtime.Serialization.Formatters.Binary;
using System.Threading;

class Class1
{
    static void ListenAndSend()
    {
        TcpListener listener = new TcpListener(20000);
        listener.Start();

        TcpClient client = listener.AcceptTcpClient();

        string testMessage = "This is a test message.";

        BinaryFormatter bf = new BinaryFormatter();
        MemoryStream ms = new MemoryStream();
        bf.Serialize(ms, testMessage);

        NetworkStream ns = client.GetStream();
        for (int i = 0; i < 1000; i++)
        {
            ns.WriteByte((byte)ms.Length);
            ns.Write(ms.ToArray(), 0, (int)ms.Length);
        }

        client.Close();
        listener.Stop();
    }

    [STAThread]
    static void Main(string[] args)
    {
        Thread listenerThread = new Thread(new ThreadStart(Class1.ListenAndSend));
        listenerThread.Start();

        TcpClient client = new TcpClient("localhost", 20000);

        BinaryFormatter bf = new BinaryFormatter();
        NetworkStream ns = client.GetStream();
        for (int i = 0; i < 1000; i++)
        {
            byte len = (byte)ns.ReadByte();
            byte[] bytes = new byte[len];

            ns.Read(bytes, 0, (int)len);

            MemoryStream ms = new MemoryStream(bytes);

            string str = (string)bf.Deserialize(ms);

            Console.WriteLine(str);

            bytes = null;
        }

        client.Close();
    }
}
 
Scott Crumpler said:
Below is some code that pretty much sums up the problem. Any ideas?

Yup - for a start, your reading code is broken:

ns.Read(bytes, 0, (int)len);

there's no guarantee this will actually return having read "len" bytes.
Instead, you should loop, something like this:

int read = 0;
while (read < len)
{
    int chunk = ns.Read(bytes, read, len - read);
    if (chunk == 0)
    {
        // Error somewhere... not enough data sent.
        // Throw here rather than loop forever on a closed stream.
        throw new IOException("Stream closed before all data was received");
    }
    read += chunk;
}

You should also have a using block round your stream (in both places),
so it gets closed automatically, whether or not there's an exception:

using (Stream ns = client.GetStream())
{
...
}

(You then don't need to call Close explicitly.)
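Putting both suggestions together, the client's read path might look something like this. It is a sketch only: `ReadExactly` is a hypothetical helper (not part of the framework), and it assumes the 4-byte length prefix described earlier in the thread, written in the machine's native byte order:

```csharp
using System;
using System.IO;
using System.Net.Sockets;

static class Receiver
{
    // Stream.Read may return fewer bytes than requested, so loop until
    // exactly 'count' bytes have been read.
    static byte[] ReadExactly(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int read = 0;
        while (read < count)
        {
            int chunk = stream.Read(buffer, read, count - read);
            if (chunk == 0)
            {
                throw new IOException("Stream closed before all data was received");
            }
            read += chunk;
        }
        return buffer;
    }

    public static void ReceiveMessages(TcpClient client, int messageCount)
    {
        // The using block closes the stream even if an exception is thrown.
        using (Stream ns = client.GetStream())
        {
            for (int i = 0; i < messageCount; i++)
            {
                byte[] lengthPrefix = ReadExactly(ns, 4);
                int length = BitConverter.ToInt32(lengthPrefix, 0);
                byte[] payload = ReadExactly(ns, length);
                // ... decrypt/deserialize payload here ...
            }
        }
    }
}
```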
 