Tony Johansson
Hi!
First of all, this statement is from MSDN: ".NET Framework uses Unicode UTF-16
to represent characters. In some cases, the .NET Framework uses UTF-8 internally."
A Unicode representation is used for the char data type: for example, if I check
the char type with sizeof, it reports 2 bytes.
A char is an unsigned 16-bit integer (a UTF-16 code unit).
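To show what I mean, here is a minimal sketch (the class and variable names are just for illustration):

```csharp
using System;

class CharSize
{
    static void Main()
    {
        // sizeof(char) reports the size of one UTF-16 code unit.
        Console.WriteLine(sizeof(char));        // prints 2
        // char is unsigned: its minimum value is 0, not a negative number.
        Console.WriteLine(char.MinValue == 0);  // prints True
    }
}
```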
Now look at the string data type, which is a reference type.
Here I store the string literal "s" in a byte array.
The size of the byte array is 2 bytes because UTF-16 uses 2 bytes per code unit.
byte[] myByte = Encoding.Unicode.GetBytes("s");
Now to the question that I find hard to understand. Remember the statement
I quoted at the start of this mail:
".NET Framework uses Unicode UTF-16 to represent characters. In some cases,
the .NET Framework uses UTF-8 internally."
So if I write this simple statement
string myString = "s";
will Unicode be used to store the single string literal "s"?
If the answer is no, then I have a follow-up question: since a string is
a sequence of chars, and as we saw earlier a char uses Unicode with 2 bytes,
why doesn't each char in a string use 2 bytes?
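To make the follow-up concrete, here is a small sketch checking the assumption that each char in a string contributes 2 bytes when encoded with Encoding.Unicode (UTF-16 little-endian); the names are just for illustration:

```csharp
using System;
using System.Text;

class StringBytes
{
    static void Main()
    {
        string myString = "st";
        // Encoding.Unicode is UTF-16 (little-endian): 2 bytes per char.
        byte[] bytes = Encoding.Unicode.GetBytes(myString);
        Console.WriteLine(bytes.Length);                    // 2 chars -> prints 4
        Console.WriteLine(myString.Length * sizeof(char));  // also prints 4
    }
}
```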
//Tony