mystery 2.3

Vince13 via DotNetMonster.com

I am working on a page with textboxes that can only accept a certain number
of decimal places. So I wrote a small function to check the number of
decimals (not counting trailing zeros), range-check the entry, and then
return the result, based on a parsing method I learned in C++ involving
integer division:

public bool entryCheck(int decimals, double min, double max, TextBox field,
    Label error)
{
    bool valid = false;
    if (field.Text != "")
    {
        error.Text = "";
        double entry = double.Parse(field.Text);
        // decimal check
        if ((double)((int)(entry * Math.Pow(10, (decimals + 1))) / 10) !=
            entry * Math.Pow(10, decimals))
        {
            field.ForeColor = Color.Red;
            error.Text = "Too Many Decimals";
        }
        // range check
        else if (entry < min || entry > max)
        {
            field.ForeColor = Color.Red;
            error.Text = "Invalid Range";
        }
        else
        {
            field.ForeColor = Color.Blue;
            valid = true;
        }
    }
    else
        error.Text = "Required Field"; // error if left blank

    return valid;
}

It works beautifully ALMOST all of the time. For some reason, when the entry
is 2.3, the computer multiplies it by, say, 10000 and gets 22999.9999, which
obviously won't work and isn't correct. Does anyone know why this is?
Thanks!
 
Because floating point is done in base 2, not base 10. Just as 1/3 is
0.33333... in base 10, 2.3 is a repeating fraction in base 2.

Switch to the decimal datatype, which uses base 10 (actually it's integer
only with an implied decimal point).
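
For example, here is a quick sketch of the difference (the class and variable
names are just for illustration):

using System;

class Demo
{
    static void Main()
    {
        double d = 2.3;     // nearest binary fraction, slightly below 2.3
        decimal m = 2.3m;   // stored exactly as a scaled integer (23, scale 1)

        Console.WriteLine(d * 100 == 230);   // False -- the double product lands just under 230
        Console.WriteLine((int)(d * 100));   // 229   -- the int cast truncates, losing the last digit
        Console.WriteLine(m * 100 == 230m);  // True  -- decimal keeps base-10 values exact
        Console.WriteLine((int)(m * 100));   // 230
    }
}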


-- bruce (sqlwork.com)
 
Thanks a lot guys!

**in case anyone else is wondering:

if ((int)(dentry * (decimal)Math.Pow(10, (decimals + 1))) / 10 !=
    dentry * (decimal)Math.Pow(10, decimals))

gave me the correct result.
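
For anyone slotting that back into the original method, a minimal sketch of
how it might look (assuming the entry is parsed with decimal.Parse rather
than double.Parse, with the same field/error names as above):

decimal dentry = decimal.Parse(field.Text);   // parse straight to decimal, no binary rounding

// same structure as the original check, just in decimal arithmetic
if ((int)(dentry * (decimal)Math.Pow(10, decimals + 1)) / 10 !=
    dentry * (decimal)Math.Pow(10, decimals))
{
    field.ForeColor = Color.Red;
    error.Text = "Too Many Decimals";
}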
 

What I would suggest is to use a NumericUpDown control instead (see the
sketch below). This yields several benefits:
- Less code
- Less headache
- Simpler to understand
- Intuitive for the users
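
A minimal sketch, assuming a WinForms form (NumericUpDown is a Windows Forms
control; the names here are illustrative):

using System.Windows.Forms;

NumericUpDown amountInput = new NumericUpDown();
amountInput.DecimalPlaces = 2;   // show and edit two decimal places
amountInput.Increment = 0.01m;   // the up/down buttons step by 0.01
amountInput.Minimum = 0m;        // the control keeps the value inside this range
amountInput.Maximum = 100m;

// Value is already a decimal, so there is no parsing or rounding error to check
decimal entry = amountInput.Value;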
 