Guest
I'm running Visual Studio .NET and am experiencing a problem during debugging
(I assume this will also be the case during normal operation). If the
variable "test" is a double and "denom" an integer greater than 1, the
operation test = 1 / denom; results in test having a value of 0.000000!
However, if denom is also a double, test has the correct value. Dumb question
perhaps, but am I missing something?
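
A minimal sketch that reproduces what I mean, assuming C++ under Visual
Studio .NET (the variable names match the above; the sample values are mine).
The truncation happens in the integer division itself, before the result is
ever converted to double for the assignment:

    #include <cstdio>

    int main()
    {
        int denom = 4;
        double test = 1 / denom;   // 1 and denom are both ints: 1/4 truncates to 0
        std::printf("test = %f\n", test);  // prints test = 0.000000

        double ddenom = 4.0;
        test = 1 / ddenom;         // the int 1 is promoted to double; result is 0.25
        std::printf("test = %f\n", test);  // prints test = 0.250000

        test = 1.0 / denom;        // a double literal also forces floating-point division
        std::printf("test = %f\n", test);  // prints test = 0.250000
        return 0;
    }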