Jonathan Jones
All,
I am writing some C# client / server stuff that interfaces with hardware via
some physical port (CAN or RS232). I am receiving messages via the physical
port and then I have a message class that parses the data and stuffs the
parsed data into a new message object so it is available to any clients that
want it. I understand how to use the BinaryFormatter class, but my question
is about getting the bytes sent out via the NetworkStream and having the
object deserialized properly on the other side.
So I create my BinaryFormatter and then call
BinaryFormatter.Serialize(stream, msg) where stream is the underlying
NetworkStream from the TcpListener and msg is my object. How does the client
know how much data to use to deserialize the object? Do I have to send some
sort of synch message that includes data length so it knows there is a
message coming and how long it is? If that's the case, that's fine. I
didn't know if there was something I was missing in my understanding.
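For reference, here is a rough sketch of the length-prefix idea I have in mind, with HardwareMessage just standing in for my actual message class:

    using System;
    using System.IO;
    using System.Net.Sockets;
    using System.Runtime.Serialization.Formatters.Binary;

    // Placeholder for my real parsed-hardware-message class.
    [Serializable]
    public class HardwareMessage
    {
        public byte[] Payload { get; set; }
        public DateTime Timestamp { get; set; }
    }

    public static class MessageFraming
    {
        // Sender side: serialize to a buffer first, then write a 4-byte
        // length prefix followed by the payload, so the receiver knows how
        // many bytes belong to this message.
        public static void Send(NetworkStream stream, HardwareMessage msg)
        {
            var formatter = new BinaryFormatter();
            using (var buffer = new MemoryStream())
            {
                formatter.Serialize(buffer, msg);
                byte[] payload = buffer.ToArray();
                byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);
                stream.Write(lengthPrefix, 0, lengthPrefix.Length);
                stream.Write(payload, 0, payload.Length);
            }
        }

        // Receiver side: read the 4-byte length, then read exactly that many
        // bytes before handing them to BinaryFormatter.Deserialize.
        public static HardwareMessage Receive(NetworkStream stream)
        {
            byte[] lengthPrefix = ReadExactly(stream, 4);
            int length = BitConverter.ToInt32(lengthPrefix, 0);
            byte[] payload = ReadExactly(stream, length);

            var formatter = new BinaryFormatter();
            using (var buffer = new MemoryStream(payload))
            {
                return (HardwareMessage)formatter.Deserialize(buffer);
            }
        }

        // NetworkStream.Read can return fewer bytes than requested, so loop
        // until the full count has arrived.
        private static byte[] ReadExactly(NetworkStream stream, int count)
        {
            byte[] data = new byte[count];
            int offset = 0;
            while (offset < count)
            {
                int read = stream.Read(data, offset, count - offset);
                if (read == 0) throw new EndOfStreamException();
                offset += read;
            }
            return data;
        }
    }

Is something along these lines necessary, or does the formatter handle the framing on its own?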
Thanks!
Jonathan Jones