I wonder why this happens. C# .NET 3.5:
int a = 256 * 1024 * 1024;
int b = 8;
long c = b * a;
Console.WriteLine(c);//<-- result is -2147483648
Where does this minus come from?
Where does this minus come from?
From the integer overflow. Note that your code is equivalent to:
int a = 256 * 1024 * 1024;
int b = 8;
int tmp = b * a;
long c = tmp;
Console.WriteLine(c);
I've separated out the multiplication from the assignment to the long variable to emphasize that they really are separate operations: the multiplication is performed using Int32 arithmetic, because both operands are Int32. The fact that the result is assigned to an Int64 afterwards is irrelevant.
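To see that the overflow really does happen inside the 32-bit multiplication, here is a minimal sketch (the variable names wrapped and tmp are mine) that evaluates the same product in unchecked and checked contexts. The default unchecked behaviour silently wraps to Int32.MinValue, while a checked context makes the same Int32 multiplication throw:
int a = 256 * 1024 * 1024;
int b = 8;
int wrapped = unchecked(b * a);      // default behaviour: wraps to Int32.MinValue
Console.WriteLine(wrapped);          // -2147483648
try
{
    int tmp = checked(b * a);        // checked Int32 multiplication throws instead
    Console.WriteLine(tmp);
}
catch (OverflowException)
{
    Console.WriteLine("Int32 multiplication overflowed");
}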
If you want to perform the multiplication in 64-bit arithmetic, you should cast one of the operands to long (i.e. Int64):
int a = 256 * 1024 * 1024;
int b = 8;
long c = b * (long) a;
Console.WriteLine(c); // 2147483648
(It doesn't matter which operand you cast to long; the other one will be implicitly converted to long anyway.)
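For completeness, a small sketch (the names viaCast, aAsLong and viaLongVariable are mine) showing two equivalent ways to force the 64-bit multiplication; either works because the remaining int operand is implicitly widened to long before the multiply:
int a = 256 * 1024 * 1024;
int b = 8;
long viaCast = (long) b * a;         // cast the other operand instead - same result
long aAsLong = a;                    // or widen one variable up front
long viaLongVariable = b * aAsLong;
Console.WriteLine(viaCast);          // 2147483648
Console.WriteLine(viaLongVariable);  // 2147483648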