Guest
When I query my Oracle database for a decimal value, I explicitly specify a
ROUND to one decimal place. When the value is filled into the dataset, the
value is stored with two decimal places. For instance, if the value I am
querying (and rounding in the SQL statement itself) is 8.5, the actual value
in the dataset is 8.50.
Now, I'm not going to argue that 8.5 isn't the same value as 8.50, but when
that value is shown in a grid, it is displayed as 8.50. This is a problem
because I have validation code checking to make sure the user didn't enter a
value with more than one decimal place.
Note, I am not looking for a workaround to this issue. I would like to solve
the problem at the source. I would like the value filled in the dataset to be
the value retrieved from the database.
What is really odd is that if the user types in 8.5 directly, the dataset
stores the value as 8.5. It isn't until the value is retrieved back from the
database that it gets stored as 8.50.
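For what it's worth, this smells like the column's declared scale (e.g. NUMBER(*, 2)) being applied by the driver when it materializes the fetched value, regardless of the ROUND(..., 1) done in the SELECT. Here is a small illustration of that behavior using Python's decimal module (the quantize call stands in for the driver applying the column's scale; the column definition and driver behavior are assumptions, not something confirmed here):

```python
from decimal import Decimal

# What ROUND(val, 1) computes on the server side:
rounded_in_sql = Decimal("8.5")

# Hypothetical driver behavior: the fetched value is coerced to the
# column's declared scale of 2, keeping the trailing zero.
fetched = rounded_in_sql.quantize(Decimal("0.01"))

print(fetched)                    # 8.50 -- numerically equal, two places shown
print(fetched == rounded_in_sql)  # True -- equality ignores trailing zeros

# A "no more than one decimal place" check has to look at the stored
# exponent (scale), not the numeric value:
print(-fetched.as_tuple().exponent <= 1)  # False for 8.50
```

If that is what's happening, the trailing zero is coming from the field's metadata rather than from the value itself, which would explain why a hand-typed 8.5 survives but a round-tripped one doesn't.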