Why can't I do array[-1] in JavaScript?


In Python, you can do that:

arr = [1, 2, 3]
arr[-1]  # evaluates to 3

But in JS, you can't:

let arr = [1, 2, 3];
arr[-1]; // evaluates to undefined

The question is: why?

I know about the tricks to get around it (arr[arr.length - 1], modifying the array prototype, etc.), but that is not the point.

I'm trying to understand why the ECMAScript standard still doesn't interpret negative array indices as counting from the end, even though it seems easy for a JS engine to support (and the whole Python community is having a blast with this notation).

What am I missing?


You're missing that arrays are objects (exotic objects), and -1 is a valid property key.

var array = [1, 2, 3];
array[-1] = 42;

console.log(array);     // still a three-element array
console.log(array[-1]); // 42
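To see concretely why that assignment doesn't behave like an array element: the index -1 is coerced to the string key "-1", which is an ordinary object property, not an array index. It therefore doesn't affect length, iteration, or serialization:

```javascript
const array = [1, 2, 3];
array[-1] = 42; // stored under the string key "-1"

console.log(array.length);          // 3 — negative keys don't count as indices
console.log(array[-1]);             // 42 — it's an ordinary object property
console.log(Object.keys(array));    // ["0", "1", "2", "-1"]
console.log(JSON.stringify(array)); // "[1,2,3]" — serialization ignores it
```

So making array[-1] mean "last element" would change the meaning of an expression that is already valid (a plain property lookup), which is exactly the kind of backwards-incompatible change the standard avoids.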


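For completeness: since ES2022, Array.prototype.at accepts negative indices, so [1, 2, 3].at(-1) === 3. And the question's claim that this would be easy to implement is true even in userland: here is a rough sketch (my own illustration; the helper name withNegativeIndexing is made up) using a Proxy to translate negative integer keys, Python-style:

```javascript
// Sketch: wrap an array in a Proxy so negative indices read from the end.
// The helper name is hypothetical, not a standard API.
function withNegativeIndexing(arr) {
  return new Proxy(arr, {
    get(target, prop, receiver) {
      // Property keys arrive as strings (or symbols); detect negative integers.
      if (typeof prop === "string") {
        const n = Number(prop);
        if (Number.isInteger(n) && n < 0) {
          return target[target.length + n];
        }
      }
      return Reflect.get(target, prop, receiver);
    }
  });
}

const arr = withNegativeIndexing([1, 2, 3]);
console.log(arr[-1]); // 3
console.log(arr[0]);  // 1
```

This only intercepts reads; a full implementation would also handle set, and every access now pays the Proxy trap overhead, which is one practical reason engines don't do this by default.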