Sending Chr(255) to Serial Port


Hi

I need to send a Chr(255) to a serial port. When I send it through
comm.Write(Chr(255)) it sends a Chr(63) ... in hex, I write Chr(&HFF) and it
actually sends Chr(&H3F) ... why does this happen, and how can I send it
right?

I'm using vb.net 2005 express with framework 2.0

Thanks for an answer ...

Mike
 
Hi,

You cannot send it as a character; you must send it like this (or some
variation; there are several, some with more compact syntax. I've spelled
it out to make it clear what is needed):

Dim My255(0) As Byte
My255(0) = 255
SerialPort.Write(My255, 0, 1)

Take a look at the overloads for the Write method.
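The byte-array approach sidesteps text encoding entirely, which is exactly why it works. A quick sketch of the difference in Python (used here only as a neutral illustration of the codec behaviour; the actual .NET call is the Write shown above):

```python
# A byte with value 255 is just a number; no text encoding is involved.
payload = bytes([255])

# Pushing the same value through an ASCII encoder is where the 63 comes
# from: ASCII stops at 0x7F, so U+00FF gets the replacement character '?'.
# (.NET's Encoding.ASCII substitutes '?' silently; Python needs
# errors='replace' to do the same instead of raising an error.)
replaced = chr(255).encode('ascii', errors='replace')

print(payload)   # b'\xff'
print(replaced)  # b'?'  -- ord('?') is 63, the Chr(63) the poster saw
```

So the 63 is not corruption on the wire; it is the encoder's substitution character, produced before the data ever reaches the port.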

Dick

--
Richard Grier, MVP
Hard & Software
Author of Visual Basic Programmer's Guide to Serial Communications, Fourth
Edition,
ISBN 1-890422-28-2 (391 pages, includes CD-ROM). July 2004, Revised March
2006.
See www.hardandsoftware.net for details and contact information.
 
Well, actually you can send it as a character, as long as you set the
stream encoding to ASCII
 
Interesting. Have you actually tried this? What syntax do you suggest?
Write does not accept a Stream, so I'm puzzled.

Dick

 
Vincent said:
Well, actually you can send it as a character, as long as you set the
stream encoding to ASCII

I'd hope that wouldn't work, as ASCII doesn't have a character 255.
 
Yes, ASCII is a 7-bit encoding, but .NET does not include a 7-bit
encoding, so setting a stream to "ASCII" lets one write anything up to
0xFF

Yes it does work. You have to learn there are many things in life that
shouldn't work, but do work, and vice versa.

Anyway, I'm 'sure' that by ASCII, MS really means 'extended ASCII',
which indeed goes all the treasured way up to FF.

Oh, and is there any particular reason you would hope something doesn't
work? Maybe we should do away with OO programming altogether. After
all, there is no such thing as an 'object' as far as a microprocessor is
concerned.
 
The fact IS, that SerialPort IS A STREAM: it has a property called
"Encoding", which you would have seen if you had bothered to look in the
list that appears after you press the dot. Remember, the dot requires
you to use ONE finger, by placing it above the dot key and moving it
down and up, a distance of less than half an inch. So difficult for
someone who wrote a book on the subject. I'm surprised, really;
considering you wrote the book, you should know this????


mySerialPort.Encoding = Encoding.ASCII;
mySerialPort.Write(((char)255).ToString());  // Write has no single-char overload

Hook it up to a computer with a nice terminal programme (e.g. one of
the ones that were actually good before GUIs lip-sticked everything)
and you will see a nice little <FF> appear. Magical! Wow!

Although I agree with the poster that one should use bytes...

...because it makes things simpler. Though of course, things are simpler
if you use a real programming language like C, where byte and char are
pretty much the same thing. Why didn't they keep char and make a new
one called unchar (e.g. Unicode character)? I don't know; just to
confuse those of us who considered a char to be 8 bits, but now it is
32 bits.
 
I'm wrong; .NET includes 7-bit encoding in places. But their so-called
'ASCII' encoding is actually 8-bit, so not ASCII but extended ASCII.
For every correct thing, there is an untrue one.
 
Hi,

Try "hooking this up" on a computer that has a DBCS enabled OS (such as
Chinese). Does it work?

Dick

 
Hi,

Yes, it works, if you do it properly. Let me explain:

It depends what you mean by "does it work". Does it still send the 255?
Does the Chinese system receive the 255? Yes. Does the program interpret
it properly? Yes, if its StreamReader encoding is also set to ASCII. The
fact that it is a DBCS-enabled system makes no difference. Also, that is
becoming irrelevant: Windows NT (including Windows 2000 and XP) has used
Unicode natively since its inception, whereas the 3.1-95-98-ME line
always used the relevant codepage for the area. This is only a problem
when programmes are made by something like VB6 (which for some reason
stores everything in Unicode, but converts it to the local codepage
whenever it is accessed by code, e.g. reading or concatenation) and are
run on XP, because they will then be using the deprecated non-Unicode
API functions (and the default codepage for those can be set up in the
Regional Settings).

(Which is why one still gets text files in downloads from Taiwanese
manufacturers with rubbish text: what was an apostrophe in the BIG-5
encoding is actually two separate characters in the
ISO8859-1/Windows-1252/Latin-1 encoding, neither of which is an
apostrophe.)

If you were using VB6 then you would have to read the data only as
arrays of bytes, because reading strings means writing a programme on a
Latin system and then moving it to an MBCS system. (There are more than
40,000 characters in Chinese, though most educated Chinese know only
about 3,000-4,000 of them, and DBCS allows for at most 2^15 = 32,768,
and in practice fewer because of a whole bunch of reserved control
codes.) Anyway, the point is that in GB2312 2003, one character can be
as many as three bytes.

Anyway, in serial communications, everything should be done with proper
control codes (which are always 7- or 8-bit regardless of whatever
codepage a computer is using), using bytes. The parts that are supposed
to be strings should be extracted separately and then interpreted in
the appropriate codepage. Anyway, I am against using stupid things like
GB2312 (but I'm forced to), or EUC, or UHC, or indeed ISO; it is much
better just to do any string in UTF-8. (In my opinion, but it doesn't
matter if you aren't bothered by that.)

If you want to know more: IBM, yonks ago, made a really nice bunch of
protocols for serial communication, and in fact many of the ASCII
control characters are named aptly for this use. All the information is
plastered all over the Internet.

It is up to the programmer(s) to make sure that the sending and
receiving systems are on the same code page.

So does it work? If the programmer is good, then of bl**dy course it
does.

By the way, .NET allows streams to be read (and written) in any
codepage. If you used extended ASCII and then converted to, say,
GB2312, then all of the characters over 7F would be lost.

To sum up: in .NET, everything can be read as a string if the encoding
is set to 'ASCII'; in VB6 you have to read everything as byte arrays
and extract the strings yourself.
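The point above about framing on bytes and decoding the string parts separately can be sketched in Python (used only as a neutral illustration; the codec names are Python's, not .NET's):

```python
# '中' (U+4E2D) under three codecs: the byte lengths differ, so a protocol
# must delimit its fields at the byte level and only then decode the text
# fields with the agreed codec.
zhong = '中'
as_gb2312 = zhong.encode('gb2312')    # DBCS: 2 bytes
as_utf8 = zhong.encode('utf-8')       # 3 bytes
as_utf16 = zhong.encode('utf-16-le')  # 2 bytes, but may contain zero bytes

# Decoding with the wrong codec yields mojibake: the "rubbish text" effect
# described above for BIG-5 files read as Windows-1252/Latin-1.
mojibake = as_gb2312.decode('latin-1')

print(as_gb2312, as_utf8, as_utf16)  # b'\xd6\xd0' b'\xe4\xb8\xad' b'-N'
print(mojibake)                      # ÖÐ
```

The same bytes, three different readings: this is why control codes and field boundaries belong at the byte level, with text decoded as a separate, final step.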

Cheers,
Vincent.
 
Vincent said:
Yes, ASCII is a 7-bit encoding, but .NET does not include a 7-bit
encoding, so setting a stream to "ASCII" lets one write anything up to
0xFF

What do you mean by ".NET does not include 7 bit encoding"? It includes
ASCII, which is 7-bit...
Yes it does work. You have to learn there are many things in life that
shouldn't work, but do work, and vice versa.

I know there are - and that's a bad thing, IMO, and should be
discouraged.
Anyway, I'm 'sure' that by ASCII, MS really mean 'extended ASCII',
which indeed goes all the treasured way up to FF.

No they don't - partly because there's no single encoding called
"extended ASCII". It's a blanket term which covers lots of code pages.

By the way, you should try doing Encoding.ASCII.GetBytes("\u00ff") - it
returns a byte 63, not 255.
Oh, and is there any particular reason you would hope something doesn't
work? Maybe we should do away with OO programming altogether. After
all, there is no such thing as an 'object' as far as a microprocessor is
concerned.

If something works now, even though it shouldn't, people may well end
up relying on it. That's not a good move.
 
Vincent said:
The fact IS, that SerialPort IS A STREAM, it has a property called
"Encoding" which you would have seen if you bothered to look in the
list that appears after you press the dot. Remember, the dot requires
you to use ONE finger, by placing it about the dot key, and moving it
down and up, a distance of less than half an inch. So difficult for
someone who wrote a book on the subject. I'm surprised really,
considering you wrote the book you should know this????


mySerialPort.Encoding = Encoding.ASCII;
mySerialPort.Write(((char)255).ToString());  // Write has no single-char overload

Hook it up to a computer with a nice terminal programme (e.g. one of
the ones that were actually good before GUIs lip-sticked everything)
and you will see a nice little <FF> appear. Magical! Wow!

Even if it does work, it won't be guaranteed to work in the future
though. MS would be quite within their rights to change Encoding.ASCII
to properly reflect the fact that ASCII is a 7-bit encoding.

Why suggest something which shouldn't work and might not in the future?
Although I agree with the poster that one should use bytes...

...because it makes things simpler.

That's not the main reason - the main reason is because it's guaranteed
to work, whereas your suggestion isn't.
Though of course, things are simpler
if you use a real programming language like c, where byte and char are
pretty much the same thing.

That doesn't make things simpler at all. It confuses the ideas of text
and binary data.
Why didn't they keep char, and make a new
one called unchar (e.g. unicode character)? I don't know, just to
confuse those of us who considered a char to be 8 bits, but now it is
32 bits.

Well, in .NET a char is actually 16 bits, not 32...
 
Yes. You are correct.

 
Yes, you are correct, except I still think that the C char is simpler.

Instead of ASCII encoding, I should have written

Encoding.GetEncoding("Windows-1252")

which is defined as being 8-bit. Thank you for correcting me :-)

And yes, as you said, char is actually 16-bit. So does that mean .NET
doesn't handle Unicode characters with codepoints above FFFF? Such as
Old English runes, recently added Chinese characters, Old Italic,
Gothic, etc.?

Cheers,
Vincent.
 
Sorry for my earlier sarcasm. I was drunk, and I actually respect you
very much for the work you have done.
 
Vincent said:
Yes, you are correct, except I still think that the c char is simpler.

You find it simpler to use an encoding needlessly, rather than just
writing out the binary data required?
Instead of ASCII encoding, I should have written
Encoding.GetEncoding("Windows-1252")

And at that point, you find that if you want to write out byte 129 (for
instance) you have issues, as Unicode character 129 isn't represented
in CP 1252.
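The byte-129 objection is easy to check. A sketch in Python (assuming Python's cp1252 and latin-1 codecs, which mirror the Windows code page tables):

```python
# Windows-1252 leaves bytes 0x81, 0x8D, 0x8F, 0x90 and 0x9D undefined,
# so U+0081 (character 129) has no CP-1252 byte and the encoder refuses it.
try:
    '\u0081'.encode('cp1252')
    cp1252_refused = False
except UnicodeEncodeError:
    cp1252_refused = True

# ISO-8859-1 (latin-1) is the one single-byte codec that round-trips every
# value 0-255; any other text detour can drop or mangle bytes, which is
# why raw byte arrays remain the right tool for binary data.
all_bytes = bytes(range(256))
round_trip = all_bytes.decode('latin-1').encode('latin-1')

print(cp1252_refused)           # True
print(round_trip == all_bytes)  # True
```

So even the "8-bit" Windows-1252 detour has holes; only a true 0-255 round-trip codec, or plain bytes, covers every value.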
Which is defined as being 8 bit. Thank you for correcting me :-)

And yes as you said, char is actually 16-bit. So does that mean .net
doesn't handle unicode characters after codepoints above FFFF? Such as
Old English Runes, recently added Chinese characters, old Italic,
Gothic etc. etc.??

It handles them as surrogate pairs. Effectively, a .NET char is a UTF-
16 code unit.
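The surrogate-pair behaviour can be seen by encoding one of the very scripts mentioned above. A Python sketch (Gothic letter AHSA, U+10330; Python is used only to expose the UTF-16 code units a .NET Char would hold):

```python
# U+10330 is above U+FFFF, so UTF-16 represents it as two 16-bit code
# units (a surrogate pair); each unit is what one .NET Char stores.
ahsa = '\U00010330'
utf16 = ahsa.encode('utf-16-be')

high = int.from_bytes(utf16[:2], 'big')
low = int.from_bytes(utf16[2:], 'big')

print(len(utf16))                # 4 bytes = two 16-bit code units
print(hex(high), hex(low))       # 0xd800 0xdf30
print(0xD800 <= high <= 0xDBFF)  # True: high-surrogate range
print(0xDC00 <= low <= 0xDFFF)   # True: low-surrogate range
```

One user-visible character, two 16-bit units: which is why a single .NET char cannot always hold a whole character.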
 
And at that point, you find that if you want to write out byte 129 (for
instance) you have issues, as Unicode character 129 isn't represented
in CP 1252.

lol

in that case, the only way is to use byte arrays.
You find it simpler to use an encoding needlessly, rather than just
writing out the binary data required?

Definitely not. Writing out the binary data is most definitely the
simplest way.

I would never condone using my suggested method to output bytes; I was
simply saying that it could be done. But if it doesn't work for 129,
then my suggested method obviously doesn't work with that code page
(though it worked for 255 when I tried it out).

I agree completely with the first suggestion that was made. I was just
pointing out that something "can be done" not that it is/was or ever
will be the 'correct' way to do it!

You can rest assured that I have never actually done serial comms
this way.
It handles them as surrogate pairs. Effectively, a .NET char is a UTF-
16 code unit.
OK, so a char still isn't really one character.

Thanks for your reply,
Vincent.
 
Hi,

The SerialPort object is not a Stream object (try Write with a Stream, or
Read with a Stream). See below.

You may or may not be able to send binary data using 8-bit ASCII encoding
(for a Char, not for a String). THIS IS NOT RECOMMENDED. You cannot
(reliably) receive binary data using 8-bit ASCII encoding.

What works (always)?

Write(ByteArray, offset, count)
Read(ByteArray, offset, count)

(For a single byte, use ReadByte on receive, or Write a one-element
array.)

IMO, it is a mistake to try to use Strings for/with binary data. Naturally,
you can do what you want, and if that works, fine. My approach is to try to
find the "write" way. Sometimes, that is "right." Sometimes, not.

BTW, I worked with the Microsoft BCL team on the SerialPort class prior to
the Whidbey Alpha (and during Alpha). The SerialPort object did change
several internals during Beta testing, when it became System.IO.Ports
(including the spurious addition in its internals of BaseStream as
System.IO.Stream).

Dick

 
Hi All :)
Sorry for my lack of news, I've been a bit busy with other stuff ...
thanks for your input on the matter. After a while I was able to find
the solution (I'm not entirely convinced, but it works well, as I
wanted).

The solution is like

SerialPort.Encoding = Encoding.Default
SerialPort.Write(Chr(255) & Chr(15) & ...)

Using Encoding.ASCII I only got Chr(63) on the other side, but with
Default it works like a charm.

Notes on my System: Windows XP Home SP2 Portuguese with VB.Net 2005 Express
(Framework 2.0).

Once again, thanks for the input :)
Cheers

Mike
 
.... said:
Sorry for my lack of news, I've been a bit busy with other stuff ...
thanks for your input on the matter. After a while I was able to find
the solution (I'm not entirely convinced, but it works well, as I
wanted).

It's definitely not a *good* solution - it may well fail when someone
tries it with a different default encoding, and it may well also fail
on some data even with your own default encoding.
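The machine-dependence is concrete: the "default" code page differs per machine, so the same character can leave the port as different bytes. A Python sketch (cp1252 and cp850 chosen as examples of two legacy Windows defaults; Python is used only to illustrate the code page tables):

```python
# The same character, two legacy code pages, two different bytes on the
# wire; a receiver using the other "default" would mis-decode the stream.
ch = 'ÿ'                       # the character behind Chr(255)
western = ch.encode('cp1252')  # Windows Western (ANSI) code page
oem = ch.encode('cp850')       # OEM code page used by DOS-era consoles

print(western)  # b'\xff'
print(oem)      # b'\x98'
```

A program relying on Encoding.Default therefore only works for as long as both ends happen to share the same regional settings.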
The solution is like

SerialPort.Encoding = Encoding.Default
SerialPort.Write (char(255)+char(15)+...)

Using Encoding.ASCII I only got char(63) on the other side, but with Default
it works like a charm.

Use:

SerialPort.Write(new byte[] { 255, 15 }, 0, 2);

instead. Why are you so keen to use characters?
 