Why do Java and C# differ in simple addition?


I have two snippets, one in Java and one in C#.

float a = 1234e-3f;
float b = 1.23f;
float ca = 1.234e3f;
float d = 43.21f;
long e = 1234L;
int f = 0xa;
int g = 014;
char h = 'Z';
char ia = ' ';
byte j = 123;
short k = 4321;
System.out.println(a + b + ca + d + e + f + g + h + ia + j + k);

The Java snippet prints 7101.674,

and the equivalent snippet in C#

float a = 1234e-3f;
float b = 1.23f;
float ca = 1.234e3f;
float d = 43.21f;
long e = 1234L;
int f = 0xa;
int g = 014;
char h = 'Z';
char ia = ' ';
byte j = 123;
short k = 4321;
Console.WriteLine(a + b + ca + d + e + f + g + h + ia + j + k);

produces a result of 7103.674.

Why am I off by 2, and which result is correct?

 


The difference is in this line:

int g = 014; 

Java interprets an integer literal with a leading zero as octal, so 014 == 12 (1×8 + 4). C# has no octal integer literals; the leading zero is simply ignored and 014 is parsed as decimal 14. That accounts exactly for your difference of 2, and both outputs are correct by each language's own rules.
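
A minimal Java sketch that isolates the literal in question (the class name OctalDemo is just for illustration):

public class OctalDemo {
    public static void main(String[] args) {
        int g = 014;             // leading zero marks an octal literal in Java: 1*8 + 4 = 12
        System.out.println(g);   // prints 12
        System.out.println(0xa); // hex literals behave the same in both languages: prints 10
        // C# reads 014 as decimal 14 (leading zeros are ignored there),
        // so the C# sum ends up larger by exactly 14 - 12 = 2.
    }
}

If you want the two programs to agree, spell the value as a plain decimal literal (12 or 14); that makes the intent unambiguous in either language.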
