Tony Johansson
Hello!
Here is some text, and somewhere in the middle it says that "a
code point is encoded into a sequence of one or more 16-bit values".
I mean: if you use UTF-16, isn't a code point encoded as a single
16-bit value, not, as the text says,
a sequence of one or more 16-bit values?
The Unicode Standard identifies each Unicode character with a unique 21-bit
scalar number called a code point, and defines the UTF-16 encoding form that
specifies how a
code point is encoded into a sequence of one or more 16-bit values. Each
16-bit value ranges
from hexadecimal 0x0000 through 0xFFFF and is stored in a Char structure.
The value of a
Char object is its 16-bit numeric (ordinal) value.
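The "one or more" in the quoted text refers to surrogate pairs: code points up to U+FFFF fit in one 16-bit value, but code points above U+FFFF need two. A quick sketch in Python (the `utf16_units` helper is just defined here for illustration) shows both cases:

```python
def utf16_units(ch):
    """Return the 16-bit UTF-16 code units for a single character."""
    data = ch.encode("utf-16-be")  # big-endian so no BOM is emitted
    return [int.from_bytes(data[i:i + 2], "big") for i in range(0, len(data), 2)]

# BMP characters take exactly one 16-bit unit:
print([hex(u) for u in utf16_units("A")])            # ['0x41']
print([hex(u) for u in utf16_units("\u20ac")])       # ['0x20ac'] (euro sign)

# A code point above U+FFFF takes two units (a surrogate pair):
print([hex(u) for u in utf16_units("\U0001d11e")])   # ['0xd834', '0xdd1e']
```

So for U+1D11E (musical G clef) a .NET string would likewise contain two Char values, 0xD834 and 0xDD1E.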
//Tony