That's right! All JavaScript numbers (both primitive numbers and Number objects, and don't even get me started on how stupid it is that there is even a *distinction* between the two) are actually double floats!

Now, in any Lisp-derived language, you don't expect to have immediate-integers with the full bit-width of the native word size, since a few bits need to be consumed by runtime tagging. If the language implementor is really clever, you can get away with losing only 2 bits, but mostly people are less clever than that, and you lose 5-8 bits, so it's reasonable to expect that MAXINT on a modern computer is *at least* 2^56, with 2^59 being more reasonable.

(It would also be reasonable to assume that any sane language runtime would have integers transparently degrade to BIGNUMs, making the choice of accuracy over speed, but of course that almost never happens, because the painful transition from 32-bit architectures to 64-bit architectures apparently taught the current crop of CS graduates no lesson better than, *"Oh, did I say that 32 bits should be enough for everybody? I meant 64 bits."* But again I digress.)

Anyway, the happy-fun-time result of JavaScript dodging the MAXINT problem by making all numbers floats is that while there is technically no MAXINT, you can't actually do meaningful integer arithmetic on anything bigger than 2^53! Not because 11 bits are being used for tags, but because the underlying IEEE-754 doubles can't unambiguously represent integers larger than that!

It's like an AI Koan: "One day a student came to Moon and said, *'I understand how to avoid using BIGNUMs! We will simply use floats!'* Moon struck the student with a stick. The student was enlightened."

Even better is that this behavior is explicitly laid out in the language specification: ECMA 262, 8.5:

> The finite nonzero values are of two kinds: 2^64-2^54 of them are normalised, [...] The remaining 2^53-2 values are denormalised [...] Note that all the positive and negative integers whose magnitude is no greater than 2^53 are representable in the Number type.

This means that if you write a JavaScript implementation that does not faithfully reproduce the bug that arithmetic on integers greater than 2^53 silently does something stupid, then your implementation of the language is non-conforming.
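Concretely: a conforming implementation has to round the literal below to the nearest representable double, with no warning and no error anywhere along the way:

```javascript
// 9007199254740993 is 2^53 + 1, which has no exact double representation,
// so the parser silently rounds the literal to 2^53.
console.log(9007199254740993 === 9007199254740992);  // true
```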

Oh, also, bitwise logical operations only have defined results (per the spec) up to 32 bits. And the result for out-of-range inputs is not an error condition: the inputs are silently reduced mod 2^32.
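A quick sketch of that wrap-around behavior:

```javascript
// Bitwise operators first convert their operands to 32-bit integers
// (ToInt32 in the spec), which silently reduces them mod 2^32:
console.log(Math.pow(2, 32) | 0);        // 0: 2^32 wraps all the way around to zero
console.log((Math.pow(2, 32) + 5) | 0);  // 5
console.log(Math.pow(2, 31) | 0);        // -2147483648: the top bit becomes a sign bit
```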

I swear, it's a wonder anything works at all.