Why does a C# System.Decimal remember trailing zeros?

Posted by Rob Davey on Stack Overflow. Published on 2010-06-08.

Is there a reason that a C# System.Decimal remembers the number of trailing zeros it was entered with? See the following example:

public void DoSomething()
{
    decimal dec1 = 0.5M;
    decimal dec2 = 0.50M;
    Console.WriteLine(dec1);            //Output: 0.5
    Console.WriteLine(dec2);            //Output: 0.50
    Console.WriteLine(dec1 == dec2);    //Output: True
}

The two values compare as equal, yet dec2 remembers that it was entered with an additional trailing zero. What is the reason or purpose for this?
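One way to see where the extra zero lives is `decimal.GetBits`, which exposes the struct's internal representation: a 96-bit integer coefficient plus a scale factor (a power of ten held in bits 16–23 of the flags word). `0.5M` and `0.50M` store different coefficient/scale pairs that denote the same numeric value, and `ToString` formats the coefficient at the stored scale. A minimal sketch (the class and variable names below are illustrative):

```csharp
using System;

public class DecimalScaleDemo
{
    public static void Main()
    {
        // GetBits returns four ints: bits[0..2] hold the 96-bit
        // integer coefficient, bits[3] is a flags word whose
        // bits 16-23 hold the scale (the power-of-ten divisor).
        int[] bits1 = decimal.GetBits(0.5M);   // coefficient 5,  scale 1 => 5  / 10^1
        int[] bits2 = decimal.GetBits(0.50M);  // coefficient 50, scale 2 => 50 / 10^2

        int scale1 = (bits1[3] >> 16) & 0xFF;
        int scale2 = (bits2[3] >> 16) & 0xFF;

        Console.WriteLine(scale1);  // 1
        Console.WriteLine(scale2);  // 2
    }
}
```

Because equality compares the numeric value (coefficient scaled by the exponent) rather than the raw bit pattern, `dec1 == dec2` is still true even though the two structs are not bit-identical.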
