Guest
I do not understand the behavior I see when using SqlDecimal.Round. Here is a simple test program:
// ------------- begin snippet ----------------
using System;
using System.Data.SqlTypes;

public class RRK1
{
    public static void Main()
    {
        decimal prob = 13.7744065m;
        int dig = 4;

        // The decimal is implicitly converted to SqlDecimal before rounding.
        SqlDecimal srounded4 = SqlDecimal.Round(prob, dig);
        Console.WriteLine("rounded {0} is {1}", prob, srounded4);
    }
}
// --------------- end snippet ------------------------
I expect to see 13.7744, but in fact I get 13.7745. This is not a question of Banker's Rounding or anything like that -- this number is much closer to 13.7744 than it is to 13.7745 no matter how you slice it. What am I misunderstanding here?
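For comparison, here is a minimal sketch that rounds the same value through plain System.Decimal (Math.Round) right next to the SqlDecimal.Round call; the class name RoundCompare is just for illustration. The Math.Round path gives the result I expect, so the difference seems to be specific to the SqlDecimal path.

// ------------- begin snippet ----------------
using System;
using System.Data.SqlTypes;

public class RoundCompare
{
    public static void Main()
    {
        decimal prob = 13.7744065m;

        // Plain System.Decimal rounding: prints 13.7744, since the first
        // dropped digit (the fifth decimal place, 0) is well below the midpoint.
        Console.WriteLine(Math.Round(prob, 4));

        // SqlDecimal path: the decimal is implicitly converted to SqlDecimal
        // first; this is the call that prints 13.7745 on my machine.
        Console.WriteLine(SqlDecimal.Round(prob, 4));
    }
}
// --------------- end snippet ------------------------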
Thanks.