I'm looking for an explanation of Java's behavior in the following scenarios. I understand that the ASCII table is arranged so that the value of the character '5' is five positions greater than '0'. This allows calculations to be done on the chars without explicitly converting them to ints, as seen in the first example.
What I don't understand is why Java seems to handle these cases inconsistently: sometimes it substitutes a value from the ASCII table, and sometimes it calculates on the chars as though they were integers.
    int x = '5' - '0'; // x == 5
    int x = '5';       // x == 53
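For reference, here is a self-contained version of those two lines (the class name is arbitrary, and the variables are renamed so both compile together):

    public class CharIntDemo {
        public static void main(String[] args) {
            // Subtracting two chars: both are converted to their int values first.
            int x = '5' - '0'; // 53 - 48
            System.out.println(x); // prints 5

            // Assigning a single char to an int just widens it to its code point.
            int y = '5';
            System.out.println(y); // prints 53
        }
    }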
Now for some examples that introduce confusion.
    int x = '0' + 1 - '5'; // x == -4
    int y = '5' - '0' + '1';
    int y = '5' - 0 + '1';
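Here are the same statements as a runnable snippet (variables renamed so they compile side by side; the printed results are what the arithmetic works out to):

    public class ConfusingCharMath {
        public static void main(String[] args) {
            int x  = '0' + 1 - '5';   // 48 + 1 - 53
            int y1 = '5' - '0' + '1'; // 53 - 48 + 49
            int y2 = '5' - 0 + '1';   // 53 - 0  + 49
            System.out.println(x);  // -4
            System.out.println(y1); // 54
            System.out.println(y2); // 102
        }
    }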
Java seems to be doing an implicit type conversion, but how does Java decide which representation of the char (the character itself or its integer value) to use?
Just write out each char's conversion to its ASCII code below your statements:
    int x = '0' + 1 - '5'   // 48 + 1 - 53  = -4
    int y = '5' - 0 + '1'   // 53 - 0  + 49 = 102
    int y = '5' - '0' + '1' // 53 - 48 + 49 = 54
Notice that it's consistent: each int stays an int, and each char is converted to its ASCII code before the arithmetic is done.
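If you want to see the conversion explicitly, a cast shows the same code points the arithmetic uses (a minimal sketch; the class name is just for illustration):

    public class PromotionDemo {
        public static void main(String[] args) {
            // An explicit cast reveals the value each char contributes.
            System.out.println((int) '0'); // 48
            System.out.println((int) '1'); // 49
            System.out.println((int) '5'); // 53

            // Java applies the same conversion automatically inside arithmetic,
            // so these expressions evaluate as plain int arithmetic.
            System.out.println('0' + 1 - '5');   // 48 + 1 - 53  = -4
            System.out.println('5' - 0 + '1');   // 53 - 0  + 49 = 102
            System.out.println('5' - '0' + '1'); // 53 - 48 + 49 = 54
        }
    }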