Problem storing 0.000 in DataRow

  • Thread starter: Carl G

Carl G

I have a DataSet with a couple of tables. I have noticed that when I
try to store a Decimal 0.000 in a DataRow, the value is changed
to 0, without the information about its decimal places.

Storing 3.000 works fine but not 0.000. Why is that?
 
Hi Carl,

Err, data in a DataSet is stored in its native format and is not formatted at
all (3.000 is stored as 3 and 0.000 is stored as 0).
Formatting is supposed to happen at the UI level (in a DataGrid, perhaps).

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
miha at rthand com
www.rthand.com
 
Hi Miha,

Yes, it is not formatted, but the decimal value still contains
information about how many decimal places it was created with. I want to
use this information at the UI level, but I can't, since 0.000 gets
transformed to 0 without the .000. All other decimal values keep the
information about their decimal places except for 0.

Look at this example

Decimal dec1 = new Decimal( 0, 0, 0, false, 3 ); //0.000
Decimal dec2 = new Decimal( 3000, 0, 0, false, 3 ); //3.000

myRow["dec1"] = dec1;
myRow["dec2"] = dec2;

When looking at the flags field in QuickWatch:

dec1 has 196608, ie. 3 decimals
dec2 has 196608, ie. 3 decimals
myRow["dec1"] has 0, ie. 0 decimals !!!!
myRow["dec2"] has 196608, ie. 3 decimals
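
Outside the debugger, the same check can be scripted with decimal.GetBits,
which returns the four 32-bit words of the decimal; the scale occupies bits
16-23 of the flags word, so 196608 is 3 << 16. A minimal sketch (the ScaleOf
helper is illustrative, not a framework API):

```csharp
using System;

class ScaleDemo
{
    // Extract the scale (number of decimal places) from the flags word,
    // which is the fourth element returned by decimal.GetBits.
    public static int ScaleOf(decimal d)
    {
        int[] bits = decimal.GetBits(d);
        return (bits[3] >> 16) & 0xFF; // scale lives in bits 16-23
    }

    static void Main()
    {
        decimal dec1 = new decimal(0, 0, 0, false, 3);    // 0.000
        decimal dec2 = new decimal(3000, 0, 0, false, 3); // 3.000

        Console.WriteLine(ScaleOf(dec1)); // 3
        Console.WriteLine(ScaleOf(dec2)); // 3
    }
}
```

Both values report a scale of 3 before being assigned into the row; it is
only after the DataRow assignment that the zero's flags were observed (on
.NET 1.1) to come back as 0.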

 
Miha Markic said:
Err, data in dataset is stored in native format and not formatted at all
(3.000 is stored as 3 and 0.000 is stored as 0).

Not as of v1.1 of the framework - decimals remember their precision for
non-zero numbers.

Try this code:
using System;

class Test
{
    static void Main()
    {
        decimal d0 = 1m;
        decimal d1 = 1.0m;
        decimal d2 = 1.00m;
        decimal d3 = 1.000m;
        decimal d4 = 1.0000m;
        Console.WriteLine(d0); // 1
        Console.WriteLine(d1); // 1.0
        Console.WriteLine(d2); // 1.00
        Console.WriteLine(d3); // 1.000
        Console.WriteLine(d4); // 1.0000
    }
}

Now the curious thing is that if you change the "1" to a "0", you lose
the information. Here's another program to show what I mean:

using System;

class Test
{
    static void Main()
    {
        ShowDecimal(1m);
        ShowDecimal(1.0m);
        ShowDecimal(0m);
        ShowDecimal(0.0m);

        // 65536 == 1 << 16: a zero with scale 1, i.e. logically 0.0
        int[] bits = new int[] { 0, 0, 0, 65536 };
        ShowDecimal(new decimal(bits));
    }

    static void ShowDecimal(decimal d)
    {
        // GetBits returns the three mantissa words plus the flags word
        int[] bits = decimal.GetBits(d);
        Console.WriteLine("{0} {1} {2} {3} {4}",
            d, bits[0], bits[1], bits[2], bits[3]);
    }
}

The last entry shows the construction of something which is *logically*
0.0, but still gets displayed as 0. It looks like -0 also gets
displayed as 0 by default.
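
If the goal is simply to show a fixed number of decimal places in the UI
regardless of what scale survived storage, formatting at display time
sidesteps the problem entirely. A minimal sketch using the standard "F"
fixed-point format string (invariant culture chosen here just to make the
output deterministic):

```csharp
using System;
using System.Globalization;

class FormatDemo
{
    static void Main()
    {
        decimal zero = 0m;      // scale lost (or never present)
        decimal three = 3.000m; // scale 3

        // "F3" always renders exactly three decimal places, regardless of
        // the decimal's stored scale, so 0 and 0.000 display identically.
        Console.WriteLine(zero.ToString("F3", CultureInfo.InvariantCulture));  // 0.000
        Console.WriteLine(three.ToString("F3", CultureInfo.InvariantCulture)); // 3.000
    }
}
```

This does mean the number of decimal places comes from the UI code rather
than from the value itself, which is exactly the division of labour Miha
suggested.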
 
Hi Jon and Miha

I am not sure of this (so don't hold me to it), but as far as I remember
reading in the language.vb group, this behavior changed between versions
1.0 and 1.1.

Cor
 
Cor Ligthert said:
I am not sure of this, but as far as I remember reading in the language.vb
group, this behavior changed between versions 1.0 and 1.1.

It definitely changed between 1.0 and 1.1. In 1.0, 1.0m would have been
stored as 1, whereas in 1.1, 1.0m is stored as 1.0. Just to confuse
things ;)
 