Divide error during debug

I'm running Visual Studio .NET and am experiencing a problem during debugging (I assume this will also be the case during normal operation). If the variable "test" is a double and "denom" a finite integer, the operation test = 1 / denom; results in test having a value of 0.000000! However, if denom is also a double, test has the correct value. Dumb question perhaps, but am I missing something?
 
mudman said:
I'm running Visual Studio .NET and am experiencing a problem during debugging (I assume this will also be the case during normal operation). If the variable "test" is a double and "denom" a finite integer, the operation test = 1 / denom; results in test having a value of 0.000000! However, if denom is also a double, test has the correct value. Dumb question perhaps, but am I missing something?

If you have something like this

int denom = 4;      /* any integer greater than 1; 4 is just an example */
double test;

test = 1 / denom;   /* both operands are ints, so this is integer division */

then test will be assigned a value of zero (for any denom greater than 1). That's because the division is carried out in integers: 1 / denom yields the integer 0, and that integer 0 is then converted to the double 0.0.

If, on the other hand, the expression contains both integers and doubles, then the integers are promoted to doubles before the division and the assignment happen.
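
For instance, a minimal program along these lines (the value 4 for denom is just an example) should show both behaviours:

#include <cstdio>

int main()
{
    int denom = 4;          // any integer greater than 1 will do
    double test;

    test = 1 / denom;       // both operands are int: integer division gives 0
    printf("%f\n", test);   // prints 0.000000

    test = 1.0 / denom;     // one operand is a double: denom is promoted
    printf("%f\n", test);   // prints 0.250000

    return 0;
}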

Is that what you see?

Regards,
Will
 
This would indeed explain what I am experiencing, although it is somewhat illogical; essentially, the expression is being interpreted as:

test = (double)((int)(1 / denom));

Thanks.
 
mudman said:
This would indeed explain what I am experiencing, although it is somewhat illogical; essentially, the expression is being interpreted as:

test = (double)((int)(1 / denom));

Is it that, or essentially that?

Your C style cast above

(int)

is actually redundant: the division is already done in integers, since both operands are ints, and it is that zero result which is cast to double.

Why not try

test = (double) 1 / denom;
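
The cast has higher precedence than the division, so it applies only to the 1: 1 becomes a double, denom is then promoted, and the division itself happens in doubles. A quick check along these lines (again using 4 for denom purely as an illustration) should print 0.250000:

#include <cstdio>

int main()
{
    int denom = 4;               // example value
    double test;

    test = (double) 1 / denom;   // cast applies to 1, so the division is done in doubles
    printf("%f\n", test);        // prints 0.250000

    return 0;
}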

You are welcome.

Regards,
Will
 