What size should a short be when viewed from the debugger

  • Thread starter: Andrew Falanga

Andrew Falanga

Hi,

I have some code that uses a dictionary that stores shorts as the
values for its keys. As an example:

Dictionary<string, short> shortMap = new Dictionary<string, short>();
shortMap.Add("first", (short)0xef43);

When I look at the contents of this dictionary in the debugger, I see
"0xffffef43" instead of "0xef43". Why is this? This is making me
think that the system is treating the value as a 32-bit quantity rather
than a 16-bit one.

Thanks,
Andy
 
Are you sure you aren't running previously compiled code? 0xef43 can't be
converted to a signed short.

Also, what do you use to show the value? For example, the watch window should
display decimal values by default, not hexadecimal...
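For what it's worth, a minimal sketch of the compile-time behavior being described (assuming the code from the thread, compiled as C#):

```csharp
using System;

// The constant 0xef43 is 61251, which is outside short's range
// (-32768..32767), so a plain cast of the constant is a compile error:
// short bad = (short)0xef43;        // error CS0221

// Wrapping the cast in unchecked truncates the constant instead:
short ok = unchecked((short)0xef43); // 0xEF43 reinterpreted as two's complement

Console.WriteLine(ok); // -4285
```

So the cast is possible, but only with the unchecked keyword; without it the compiler rejects the constant conversion.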
 
Andrew said:
Hi,

I have some code that uses a dictionary that stores shorts as the
values for its keys. As an example:

Dictionary<string, short> shortMap = new Dictionary<string, short>();
shortMap.Add("first", (short)0xef43);

When I look at the contents of this dictionary in the debugger, I see
"0xffffef43" instead of "0xef43". Why is this? This is making me
think that the system is treating the value as a 32-bit quantity rather
than a 16-bit one.

It's important to keep in mind that the debugger will do some conversion
of data before displaying it.

In this case, it's just an artifact of having your display settings set to
show hexadecimal. The debugger automatically extends values to 32 bits
before displaying when you do that.

Unfortunately, I don't know of a way to suppress that behavior. I agree
that it would be nice if the debugger would show primitive data types in
their actual format.
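A quick sketch of the widening at work (my own example, not the OP's code): a negative short, sign-extended to 32 bits, gains leading FF bytes in its hex representation, which is exactly what the watch window shows.

```csharp
using System;

short s = unchecked((short)0xef43);  // -4285 as a 16-bit value
int widened = s;                     // implicit conversion sign-extends

Console.WriteLine(s.ToString("X4"));       // EF43     (the 16-bit view)
Console.WriteLine(widened.ToString("X8")); // FFFFEF43 (what the debugger shows)
```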

Pete
 
Are you sure you aren't running previously compiled code? 0xef43 can't be
converted to a signed short.

I think it can be done... in two's complement, 0xef43 read as a signed
16-bit value is a nice negative -4285 (you'd only get -28483 by wrongly
treating the top bit as a sign-and-magnitude sign over the remaining
0x6f43)... something about binary 16-bit numbers... might have changed
since I last bothered to look at them.
 
Are you sure you aren't running previously compiled code? 0xef43 can't be
converted to a signed short.

Also, what do you use to show the value? For example, the watch window should
display decimal values by default, not hexadecimal...

I have my watch window set to display hexadecimal. The program I'm
debugging compares on hex values rather than decimal, or other,
values. So, I'm displaying as they'll be compared.

Andy
 
I'm using C# 2008.

0xef43 is an int constant whose value, 61251, is positive and outside the
range of short, and for this reason it can't be cast to a short (unless you
use the unchecked keyword). The provided code doesn't compile here. Do you
use C# 2005?
 
It's important to keep in mind that the debugger will do some conversion
of data before displaying it.

In this case, it's just an artifact of having your display settings set to
show hexadecimal.  The debugger automatically extends values to 32 bits
before displaying when you do that.

Unfortunately, I don't know of a way to suppress that behavior.  I agree
that it would be nice if the debugger would show primitive data types in
their actual format.

Pete

After some debugging, I was beginning to suspect this. Thanks for
confirming it.

I did eventually find the answer to my problem. I inherited the code
and found that, although the system is supposed to check 16 bit
quantities, the code was calling into other supporting libraries that
take int-sized data rather than short-sized. Because these other
functions are used in many other areas, I just updated my dictionary
to use int-sized data. So, now I'm doing this:

Dictionary<string, int> integerMap = new Dictionary<string, int>();
integerMap.Add("first", 0xef43);

instead of what I had before. This seems to work well.

Thanks everyone,
Andy
 
Andrew said:
I have my watch window set to display hexadecimal. The program I'm
debugging compares on hex values rather than decimal, or other,
values. So, I'm displaying as they'll be compared.

Surely you simply mean that hex literals are used in the code.

The code itself compares _numbers_, which are neither decimal nor hex.

In any case, all that happens is that the original is sign-extended; for
debugging purposes, it should not matter whether the debugger displays
"0xffffef43" or "0xef43". You can just mentally ignore the four most
significant digits.
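If mentally dropping those digits is error-prone, masking with 0xFFFF recovers the 16-bit view explicitly (a sketch of my own, usable as a watch expression too):

```csharp
using System;

short s = unchecked((short)0xef43);

// The & promotes s to int (0xFFFFEF43) and then clears the top 16 bits:
int low16 = s & 0xFFFF;              // 0x0000EF43

Console.WriteLine(low16.ToString("X4")); // EF43
```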

Pete
 
OK, so:
- for the compilation (C# 2008) I've got: Constant value '61251' cannot be
converted to a 'short' (use 'unchecked' syntax to override)
- for the debugger I've got the same behavior in the watch window and it is
still the same in VS 2010.

It could be done on purpose, as it reminds me that AFAIK an expression
always returns an int when shorter values are used. So the value will be
used this way anyway as soon as you use it in an expression.

--
Patrice


"Andrew Falanga" <[email protected]> wrote in the message:
Are you sure you aren't running previously compiled code? 0xef43 can't be
converted to a signed short.

Also, what do you use to show the value? For example, the watch window should
display decimal values by default, not hexadecimal...

I have my watch window set to display hexadecimal. The program I'm
debugging compares on hex values rather than decimal, or other,
values. So, I'm displaying as they'll be compared.

Andy
 
Patrice said:
Ok, so :
- for the compilation (C# 2008) I've got: Constant value '61251' cannot
be converted to a 'short' (use 'unchecked' syntax to override)

Depending on context, unchecked might be the default. Or, the OP might
not really have a literal in his code but instead just a variable or
something. Hard to say for sure if the OP doesn't post a complete code
example.

But yes, ordinarily, you'd need "unchecked((short)0xef43)" to get the
OP's code to compile. A cast by itself isn't legal, because by default
compile-time conversions are checked (run-time default is unchecked).
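A small sketch of that compile-time/run-time distinction (my example, not from the thread): the same narrowing that is rejected for a constant compiles fine for a variable, because run-time narrowing is unchecked by default.

```csharp
using System;

int value = 0xef43;            // 61251, fine as an int
short narrowed = (short)value; // compiles; truncates at run time

Console.WriteLine(narrowed);   // -4285
```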
- for the debugger I've got the same behavior in the watch window and it
is still the same in VS 2010.

It could be done on purpose, as it reminds me that AFAIK an expression
always returns an int when shorter values are used.

Sort of correct. Even if you just add two shorts, the expression is int
and will have to be cast back to short to be used as a short. But a
short by itself is still just a short. It's only promoted if the
context requires it.
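That promotion is easy to see in a two-line sketch (hypothetical variables of my own):

```csharp
using System;

short a = 1, b = 2;

// short c = a + b;        // error CS0266: a + b has type int
short c = (short)(a + b);  // must cast the int result back down

Console.WriteLine(c);      // 3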

Still, IMHO it would be a nice refinement to be able to tell the
debugger to take into account the _actual_ size of a variable when
displaying it to the user. :)

Pete
 