We ran into a magic decimal number that broke our hashtable. I boiled it down to the following minimal case:
decimal d0 = 295.50000000000000000000000000m;
decimal d1 = 295.5m;
Console.WriteLine("{0} == {1} : {2}", d0, d1, (d0 == d1));
Console.WriteLine("0x{0:X8} == 0x{1:X8} : {2}", d0.GetHashCode(), d1.GetHashCode()
, (d0.GetHashCode() == d1.GetHashCode()));
This gives the following output:
295.50000000000000000000000000 == 295.5 : True
0xBF8D880F == 0x40727800 : False
What is really peculiar is that changing, adding, or removing any of the digits in d0 makes the problem go away, even adding or removing one of the trailing zeros. The sign doesn't seem to matter, though.
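To illustrate the trailing-zero behaviour, here is a small sketch (not part of the original repro; the loop and variable names are mine) that parses 295.5 with a varying number of trailing zeros, so each variant stores the same value with a different scale, and compares hash codes against the plain 295.5m:

using System;
using System.Globalization;

class HashCheck
{
    static void Main()
    {
        decimal reference = 295.5m;

        // Parse "295.5" followed by 0..26 extra zeros; decimal.Parse preserves
        // the scale, so each variant is numerically equal but stored differently.
        for (int zeros = 0; zeros <= 26; zeros++)
        {
            string literal = "295.5" + new string('0', zeros);
            decimal variant = decimal.Parse(literal, CultureInfo.InvariantCulture);

            Console.WriteLine("{0,-32} equal: {1}, same hash: {2}",
                literal,
                variant == reference,
                variant.GetHashCode() == reference.GetHashCode());
        }
    }
}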
Our fix is to divide the value to get rid of the trailing zeroes, like so:
decimal d0 = 295.50000000000000000000000000m / 1.000000000000000000000000000000000m;
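In case it helps anyone else, here is roughly how we apply that workaround before decimals are used as dictionary keys. This is a sketch rather than our production code, and NormalizeForHashing is just a name I picked for the divide-by-one trick above:

using System;
using System.Collections.Generic;

static class DecimalKeys
{
    // Strips the extra trailing zeros via the divide-by-one trick from above,
    // so equal decimals end up with the same representation and hash code.
    public static decimal NormalizeForHashing(decimal value) =>
        value / 1.000000000000000000000000000000000m;
}

class Demo
{
    static void Main()
    {
        var counts = new Dictionary<decimal, int>();

        decimal d0 = 295.50000000000000000000000000m;
        decimal d1 = 295.5m;

        // Normalize every key on the way in; with the workaround both
        // literals should land in the same dictionary entry.
        foreach (var d in new[] { d0, d1 })
        {
            decimal key = DecimalKeys.NormalizeForHashing(d);
            counts[key] = counts.TryGetValue(key, out var n) ? n + 1 : 1;
        }

        Console.WriteLine(counts.Count);
    }
}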
But my question is: how is C# getting this wrong?
Edit: Just noticed this has been fixed in .NET Core 3.0 (possibly earlier, I didn't check): https://dotnetfiddle.net/4jqYos