Double data type subtraction changes precision


neerajb

I have three variables, all of the Double data type. When I subtract dblA - dblB to get dblC, what should be a simple number comes out with a long string of decimals. For example, 2.04 - 1.02 becomes 1.0199999999999999.... Why is this? It seems that if dblA and dblB have only 2 decimal places, the result should have 2 decimal places or fewer. It's critical in the application I'm working on that this number comes out with the expected 2 decimal places, but it can vary to any number of decimal places depending on what values are passed into dblA and dblB.

This is ridiculous. This is happening in both .NET 1.1 and .NET 2.0.
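
For reference, here is a minimal C# sketch of the kind of representation error being described, plus two common ways to get a clean two-decimal result. The 0.3 - 0.1 pair is just an illustrative example known to expose the effect (whether a given pair shows a long tail depends on the exact binary approximations), and the Math.Round / Decimal approaches are suggestions, not the poster's code:

    using System;

    class DoubleSubtractionDemo
    {
        static void Main()
        {
            // Double stores base-2 (binary) fractions, and most two-decimal
            // values (0.1, 0.3, 1.02, 2.04, ...) have no exact binary
            // representation. Subtraction can then expose the tiny
            // representation error. 0.3 - 0.1 reliably shows it:
            double dblA = 0.3;
            double dblB = 0.1;
            double dblC = dblA - dblB;

            // The "R" (round-trip) format shows the full stored value;
            // this typically prints 0.19999999999999998 rather than 0.2.
            Console.WriteLine(dblC.ToString("R"));

            // One common cleanup (assuming two decimal places are always
            // wanted): round the result back to 2 decimals before using it.
            Console.WriteLine(Math.Round(dblC, 2));   // 0.2

            // Alternative: the Decimal type stores base-10 digits exactly,
            // so two-decimal inputs keep their two decimal places.
            decimal decA = 2.04m;
            decimal decB = 1.02m;
            Console.WriteLine(decA - decB);           // 1.02
        }
    }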
 