Serializing / Deserializing binary data via a NetworkStream

  • Thread starter: Jonathan Jones

Jonathan Jones

All,

I am writing some C# client / server stuff that interfaces with hardware via
some physical port (CAN or RS232). I am receiving messages via the physical
port, and I have a message class that parses the data and stuffs it into a
new message object so it is available to any clients that
want it. I understand how to use the BinaryFormatter class but my question
is regarding getting the bytes sent out via the NetworkStream and having it
deserialized properly on the other side.

So I create my BinaryFormatter and then call
BinaryFormatter.Serialize(stream, msg) where stream is the underlying
NetworkStream from the TcpListener and msg is my object. How does the client
know how much data to use to deserialize the object? Do I have to send some
sort of synch message that includes data length so it knows there is a
message coming and how long it is? If that's the case, that's fine; I just
didn't know if there was something I was missing in my understanding.
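
For what it's worth, the sync-message-plus-length idea above can be sketched as a length-prefix frame: serialize into a MemoryStream first, send a 4-byte length, then the payload, and have the reader pull exactly that many bytes before deserializing. This is only a sketch, not anything from the thread; the `Framing` class and its method names are hypothetical, and the methods take a plain `Stream` so they work against a NetworkStream or anything else.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

static class Framing
{
    // Serialize msg to a MemoryStream first, then send a 4-byte
    // little-endian length prefix followed by the payload.
    public static void WriteMessage(Stream stream, object msg)
    {
        var ms = new MemoryStream();
        new BinaryFormatter().Serialize(ms, msg);
        byte[] payload = ms.ToArray();
        byte[] prefix = BitConverter.GetBytes(payload.Length);
        stream.Write(prefix, 0, prefix.Length);
        stream.Write(payload, 0, payload.Length);
    }

    // Read the 4-byte prefix, then exactly that many payload bytes,
    // then deserialize the payload.
    public static object ReadMessage(Stream stream)
    {
        byte[] prefix = ReadExactly(stream, 4);
        int length = BitConverter.ToInt32(prefix, 0);
        byte[] payload = ReadExactly(stream, length);
        return new BinaryFormatter().Deserialize(new MemoryStream(payload));
    }

    // Stream.Read (and NetworkStream.Read in particular) may return
    // fewer bytes than requested, so loop until the buffer is full.
    static byte[] ReadExactly(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0) throw new IOException("Connection closed");
            offset += read;
        }
        return buffer;
    }
}
```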

Thanks!

Jonathan Jones
 
Ok,

I did some examples and I have another question. Because I have a class
that wraps the TcpClient and NetworkStream stuff, I don't plan on writing
directly to the NetworkStream. I want to write to a MemoryStream and then use
the ToArray() method to pass the returned byte array to my Write() method
(which uses NetworkStream.BeginWrite()).
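
Something like the following, roughly — the `Sender` class and method names here are made up for illustration, not taken from the actual wrapper; it accepts a plain `Stream` so the same code works whether the wrapped stream is a NetworkStream or something else:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class Sender
{
    readonly Stream stream;  // e.g. the wrapped NetworkStream

    public Sender(Stream stream) { this.stream = stream; }

    // Serialize into a MemoryStream first, then pass the resulting
    // byte array to an asynchronous BeginWrite on the stream.
    public void Send(object msg)
    {
        var ms = new MemoryStream();
        new BinaryFormatter().Serialize(ms, msg);
        byte[] data = ms.ToArray();
        stream.BeginWrite(data, 0, data.Length, OnWriteDone, stream);
    }

    // Completion callback: finish the asynchronous write.
    static void OnWriteDone(IAsyncResult ar)
    {
        ((Stream)ar.AsyncState).EndWrite(ar);
    }
}
```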

As a test, I created a simple 6 byte array and passed it to the
BinaryFormatter.Serialize(stream, object) as the object to be serialized.
The resulting MemoryStream was 35 bytes long! I know this is a poor example
because I could have just sent the 6-bytes via a direct Write() call, but is
that kind of bloat normal? And more importantly, does it save time if you are
sending 5-6 times the amount of data you would normally be sending?
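
The test I ran was basically this (the exact serialized size depends on the runtime version, so I wouldn't take the 35-byte figure as fixed; the overhead is a header plus type metadata that the formatter writes in front of the payload):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class SizeCheck
{
    static void Main()
    {
        byte[] raw = new byte[6];
        var ms = new MemoryStream();
        // BinaryFormatter prepends a serialization header and type
        // metadata, so the stream is larger than the 6-byte payload.
        new BinaryFormatter().Serialize(ms, raw);
        Console.WriteLine("raw: {0} bytes, serialized: {1} bytes",
                          raw.Length, ms.Length);
    }
}
```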

Jonathan
 