Andrew Falanga
Hi,
I have some code that uses a dictionary with string keys and short
values. As an example:
Dictionary<string, short> shortMap = new Dictionary<string, short>();
shortMap.Add("first", (short)0xef43);
When I look at the contents of this dictionary in the debugger, I see
"0xffffef43" instead of "0xef43". Why is this? It makes me think the
system is treating the value as a 32-bit quantity rather than a 16-bit
one.
Thanks,
Andy
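For reference, the same effect can be reproduced outside the debugger. The value 0xef43 (61251) exceeds the maximum of a signed 16-bit short (32767), so the cast produces a negative value, and widening that value to 32 bits sign-extends the top bit, which is why the debugger shows 0xffffef43. Here is a minimal sketch in Java rather than C# (Java's short is likewise a signed 16-bit type, so the arithmetic carries over); the class name is illustrative:

```java
public class SignExtensionDemo {
    public static void main(String[] args) {
        // 0xef43 = 61251, which is above Short.MAX_VALUE (32767),
        // so the cast wraps to a negative 16-bit value: -4285.
        short s = (short) 0xef43;
        System.out.println(s);                               // -4285

        // Widening the short to int sign-extends the high bit,
        // reproducing what the debugger displays.
        System.out.println(Integer.toHexString(s));          // ffffef43

        // Masking to 16 bits recovers the original bit pattern.
        System.out.println(Integer.toHexString(s & 0xFFFF)); // ef43
    }
}
```

The underlying 16 bits are still 0xef43; only the hex rendering of the widened value shows the extra 0xffff.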