Every day I learn something new... and stupid.

Hey kids, did you know that JavaScript doesn't have integers?

That's right! All JavaScript numbers (whether primitive numbers or Number objects, and don't even get me started on how stupid it is that there's even a distinction between the two) are actually double-precision floats!

Now, in any Lisp-derived language, you don't expect to have immediate integers with the full bit-width of the native word size, since a few bits get consumed by runtime tagging. If the language implementor is really clever, you can get away with losing only 2 bits, but mostly people are less clever than that and lose 5-8 bits, so it's reasonable to expect that MAXINT on a modern 64-bit machine is at least 2^56, with 2^59 being more likely.

(It would also be reasonable to assume that any sane language runtime would have integers transparently degrade to BIGNUMs, making the choice of accuracy over speed, but of course that almost never happens, because the painful transition from 32-bit architectures to 64-bit architectures apparently taught the current crop of CS graduates no lesson better than, "Oh, did I say that 32 bits should be enough for everybody? I meant 64 bits." But again I digress.)

Anyway, the happy-fun-time result of JavaScript dodging the MAXINT problem by making all numbers floats is that while there is technically no MAXINT, you can't actually do meaningful integer arithmetic on anything bigger than 2^53! Not because 11 bits are being used for tags, but because the underlying IEEE-754 doubles can't unambiguously represent integers larger than that!
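
To see it, paste this into any JavaScript console; past 2^53, the doubles simply run out of mantissa bits:

var big = Math.pow(2, 53);  // 9007199254740992
big + 1 === big             // true: 2^53 + 1 rounds back down to 2^53
big + 3                     // 9007199254740996: off by one, no error
9007199254740993            // 9007199254740992: even the literal silently rounds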

It's like an AI Koan: "One day a student came to Moon and said, 'I understand how to avoid using BIGNUMs! We will simply use floats!' Moon struck the student with a stick. The student was enlightened."

Even better, this behavior is explicitly laid out in the language specification, ECMA-262, section 8.5:

The finite nonzero values are of two kinds: 2^64-2^54 of them are normalised, [...] The remaining 2^53-2 values are denormalised [...] Note that all the positive and negative integers whose magnitude is no greater than 2^53 are representable in the Number type.

This means that if you write a JavaScript implementation that does not faithfully reproduce the bug whereby arithmetic on integers greater than 2^53 silently does something stupid, then your implementation of the language is non-conforming.

Oh, also bitwise logical operations only have defined results (per the spec) up to 32 bits. And the result for out-of-range inputs is not an error condition: the input is silently reduced mod 2^32.
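
For example, the spec's ToInt32 conversion in action:

(Math.pow(2, 32) + 1) | 0   // 1: the high bits just vanish
Math.pow(2, 31) | 0         // -2147483648: reduced mod 2^32, then reinterpreted as signed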

I swear, it's a wonder anything works at all.



86 Responses:

  1. ahruman says:

    Spidermonkey traditionally does use two bits for tagging of integers, but with a 30-bit payload regardless of architecture. (Last time I checked, they were working on switching to a 128-bit value type.) Numbers are converted between 30-bit integer and pointer-to-double representation as needed. Not that this is helpful in any way.

    ECMAScript Harmony will probably add a separate type system for "value types", which is specifically motivated by IBM's desire to support decimal floats and Mozilla's desire not to. The details haven't been worked out, but the general idea is a category of types that have operator overloading but no properties. It should be possible for a host app to implement sane integers on top of this. So as long as you don't need to do anything before 2013 and will control your host app, you're home free!

    • jwz says:

      Wow, I was unaware that anyone more recent than, like, Charles Babbage thought that decimal floats were a good idea...

      Operator overloading. I'd have to file that under "now you have two problems"...

      • ahruman says:

        I'm kinda on the fence about operator overloading. It leads to much stupidity, but on the other hand, doing maths on non-primitive types (vector algebra and so forth) without it is painful. Separating extensible-things-with-operators from objects might actually be a good idea. I kinda do this already with Objective-C++, but using ObjC++ gives me at least five problems anyway. :-)

        (Obviously I'll defer to Brendan on all the on-topic stuff.)

      • Actually decimal floats have a very specific use case: computations on monetary values.

        Regular floats may do some funky things in a complex computation (especially one involving lots of inputs, like an average). This is fine so long as you have all your decimal places, but falls apart when you are restricted to two or so decimal places (like money). To compound the problem, two different CPUs might come up with different answers for the same algorithm and inputs.

        With decimal floats (or BCD) you avoid the whole issue by doing the calculations the way an old fashioned hand-crank calculator would.

        Given IBM's customer base, this is a pretty important consideration for them.

        • BrendanEich says:

          https://bugzilla.mozilla.org/show_bug.cgi?id=5856 is the most-dup'ed JS bug, last I checked.

          Try javascript:alert(.1+.2) and consider people doing non-monetary arithmetic, say for CSS style computation.
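
          For anyone not near a console just now, that alert shows:

          .1 + .2         // 0.30000000000000004
          .1 + .2 === .3  // false: neither 0.1 nor 0.2 is exactly representable in binary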

          It's a problem, and IBM is right to want a solution, but IEEE754r is a glorious, decade-long, multi-format (last I heard; to placate two competing hardware vendors) mess. The standards body was reduced to using Robert's Rules in committee to prevent bloodshed at one time.

          Some on Ecma TC39 (JavaScript) are warm to bignums, but the whole decimal jam-job has made everyone a bit gun-shy.

          /be

          • jered says:

            I did some time in standards (SNIA, in this case), and AFAICT the biggest hobby-horse in IBM is fixed-point arithmetic. We chose the absolute minimal set of useful primitive types for our data abstraction (unlike JavaScript, apparently) and at the last minute the IBM folks came back with a fixed-point data requirement.

            Also, everything in our standard had to be a superset of all of their products, but that wasn't a problem unique to IBM.

        • flipzagging says:

          When it comes to financial software, every decent developer already uses integer millis (1000x the smallest currency unit).
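
          A minimal sketch of that convention (the variable names and the 8.5% tax rate are made up for illustration):

          var price = 1999 * 1000;              // $19.99 as an integer count of millis of a cent
          var tax = Math.round(price * 0.085);  // round exactly once, explicitly
          var total = price + tax;              // exact integer arithmetic throughout
          (total / 100000).toFixed(2);          // "21.69": floats appear only at display time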

          • BrendanEich says:

            or vampire-squid banks:

            $ bc
            bc 1.06
            Copyright 1991-1994, 1997, 1998, 2000 Free Software Foundation, Inc.
            This is free software with ABSOLUTELY NO WARRANTY.
            For details type `warranty'.
            2^53
            9007199254740992
            ./1000
            9007199254740
            ./365.25
            24660367569
            ./24
            1027515315
            ./3600
            285420

            The last three division steps show how JS's Date object fits time since the epoch in a double and still covers a very large extrapolated-Gregorian calendar.

            /be

        • jp_larocque says:

          Or you could use rational numbers, which are more general, more elegant, and possibly even more efficient. And hey, Lisp does that by default, too.

          • BrendanEich says:

            Sam Tobin-Hochstadt, private correspondence:

            "Bignums are great. There's no reason for the performance problems to stop people from switching to bignums in any non-C language, and it's fairly straightforward to optimize to use fixnum operations in many cases. I think [trace-based JITting] will do well here, also. Exact rationals are a whole other can of worms. I admit to liking having them around in Scheme, but I rarely use them.
            ...
            I think [fast enough and better semantics than double] is the case for bignums, but not for rationals. I don't know of any other awesome solutions for rationals, though. Fast, precise, rational - pick any two."

            /be

      • Decimal floats are very useful for avoiding rounding/representation errors when data moves back and forth between pen and paper and computerized systems. Some businesses have financial processes where this is essential.

      • Actually, William Kahan said "A major decrease in avoidable silly errors can be achieved by letting most (not all) scientists, engineers, ... and other computer users perform most (not all) of their floating-point arithmetic in decimal; therefore that is the way we must go."

    • BrendanEich says:

      SpiderMonkey used one bit for ages, not two. Now we use NaN boxing.
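
      (Roughly: every value is one 64-bit word; actual doubles are stored as themselves, and everything else hides in the enormous space of NaN bit patterns. A sketch of the layout, using JS typed arrays on a little-endian machine rather than the engine's actual C++:)

      var buf = new ArrayBuffer(8);
      var f64 = new Float64Array(buf);
      var u32 = new Uint32Array(buf);
      u32[1] = 0xFFF80001;  // all-ones exponent + quiet bit + a made-up type tag
      u32[0] = 42;          // 32 bits of payload: an integer, or a pointer
      isNaN(f64[0]);        // true: the tagged word still reads as a single double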

      It was not Mozilla who stopped IBM's mad IEEE754r decimal jam attempt. We actually had a prototype patch from Sam Ruby. All the other browser vendors, plus Doug Crockford of Yahoo!, were solidly opposed. In many ways Mozilla was IBM's best friend on this point, and we are still paying for it.

      Value types are a dark horse, but if based on proxies they may turn out well.

      The bignum strawman could just be a new primitive type, and happen quickly. IBM would not be happy, but many others would. Right now web sites are doing crypto in JS using ints (bitwise ops) and double multiplies!
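
      (The trick those sites rely on: split 32-bit words into 16-bit halves so that every partial product stays exact within a double's 53-bit mantissa. A sketch; ES2015 eventually baked essentially this dance into the language as Math.imul:)

      function mul32lo(a, b) {
        var ah = a >>> 16, al = a & 0xFFFF;
        var bh = b >>> 16, bl = b & 0xFFFF;
        // ah*bh shifts entirely out of the low 32 bits, so three terms remain;
        // each fits exactly in a double, and >>> 0 takes the sum mod 2^32.
        return (al * bl + (((ah * bl + al * bh) & 0xFFFF) << 16)) >>> 0;
      }
      mul32lo(0xFFFFFFFF, 0xFFFFFFFF);  // 1, matching uint32 multiplication in C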

      /be

  2. inoah says:

    Is this Brendan's fault, or did this creep in sometime later?

    • jwz says:

      No idea, but it sounds like the sort of 4am shortcut of which that first implementation was entirely composed, so I'd guess it's been in there from the start...

      • BrendanEich says:

        Yes, it was there from the start. But bignums were not in the cards. JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.

        So double by default, int under the hood, and bitwise ops are 32-bit int (uint if you use >>>). I blame Java.
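
        That is:

        -8 >> 1    // -4: >> propagates the sign bit
        -8 >>> 1   // 2147483644: >>> treats its operand as uint32
        -1 >>> 0   // 4294967295: the idiomatic ToUint32 cast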

        /be

        • duskwuff says:

          So, since I have to ask: What would we be stuck with if JS hadn't happened?

          • BrendanEich says:

            Something like PHP only worse, which Netscape's VP Eng killed fast (in early July 1995, if memory serves; I did JS in early-mid-May) since it was "third" after Java and JS. Two new web programming languages were hard enough to justify...

            /be

            • dasht says:

              I think JS was mostly a great thing to come along, in its way, so please take this in that spirit. I respect a lot of what you've done -- but I also have a gripe that relates to this thread.

              As respectfully as I can... I'm sorry that this will be harsh, but this bit, where you say:

              "Yes, it was there from the start. But bignums were not in the cards. JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened."

              WTF IS THE MATTER WITH YOU!?!?!

              By which I mean... we all know the creation story of Jscript but in that story and in your reiteration of it here... basically you bent over for your masters. You had the option of a few of you standing up, not bending over... and quite possibly crashing the damn company (then running to the press and working on setting up your own separate thing). You might have lost, in other words... You might collectively have been tossed out on your butt, blacklisted, replaced, and have had no impact... but I don't think it was terribly likely. With a collective spine you could have won. Better quality AND well-earned riches. Instead, hackers sold out. I think you could have made that "F.U." threat and leveraged it for more say about "Doing Things Right" in the engineering department. Instead, y'all wussed out, rushed, and took lots of money for it. At least that's how it looked from nearby.

              From where I sat, back then (just down the road)... the silly valley execs and money people were talking about you hackerish types there behind your back all up and down the valley. They played you guys. They were mostly bluffing. They knew all along you were their highest cards.... they just wanted to be able to brag about how they got your labor on the cheap and in a rush. And, no, they didn't really appreciate what you were trying to do. They heaped a bunch of needless stress on y'all with the effect of reduced quality essentially so they could justify their existence to their friends.

              • jwz says:

                Brendan's house and my nightclub thank us for selling out early and often.

                • dasht says:

                  And, I do too. I tried to convey that but just to be really clear... I want both. I want the world where you win big and the world where you don't blame it all on Teh Man but rather sometimes stand up to him and win for the team, not just your accounts. You are both, so far as I can tell, smart, hard working, socially beneficial people on balance and perhaps it is my brain bug or perhaps I'm right but my perception is you screwed up in your relation to big money back then, around stuff like the topic. Younger hackers reading this should wonder if maybe they can do better. My impression from what I saw around the corridors of power back then is that you guys folded like a house of cards compared to what you could have gotten away with. (Maybe that's why they paid you the big bucks. :-)

              • flipzagging says:

                In my experience, when it comes to the very best engineers, what you're paying for is not that they will do everything perfectly. It's their ability to know what to neglect, in favor of getting the important work done.

                • Joel Webber says:

                  +1. This song should be automatically played each time someone busting ass gets accused of selling out by an armchair quarterback!

              • BrendanEich says:

                It's a good question how much we sold out, and although jwz was earlier than I (dumbass me had an offer in spring '94 to join Netscape on the first floor, but I stayed loyal to MicroLunacy^H^H^H^H^H^HUnity), I am pretty sure we didn't trade technical merit for money.

                I think you have us wrong. We didn't sell out -- others (later-comers, who did far less work) made much more than we did. We were naive, in point of fact. If we had it all to do over again, I think we would (in hindsight and with your advice) use our leverage to get better technical and financial outcomes.

                But at the time, mostly we felt the need to move very quickly, not to make money but because we knew Microsoft was coming after us. Microsoft low-balled Netscape in late '94, and the Jims (Clark and Barksdale) told them to pound sand. After that, we felt the monster truck riding up on our little Yugo's rear bumper, month by month.

                If you appreciate this and it is accurate, consider that JavaScript (please, not "JScript") saved you from VBScript.

                As far as us not selling out: live and learn. I'm not sure I'll ever have the opportunity to apply this knowledge, but here it is for you kids who may end up in a rocket-like startup.

                Either get your unfair share, or get your technical goods, or both (in your dreams) -- but don't just work hard to try to invent something new in a hurry, to improve the nascent Web. You might not have the good grace I've had with Mozilla and Firefox to make up for it.

                /be

            • jwz says:

              I'm still bummed that I failed to talk you in to making #!/usr/bin/javascript work back then, because I think that we were still in the window where we had a shot at smothering Perl in the crib...

        • I don't begrudge you the lack of bignums; the state of bignum libraries back then was not terrific. What I don't get is the lack of proper tail calling. Once you understand it, it's not any harder than doing it wrong. How did that get missed?

          • BrendanEich says:

            Ten days to implement the lexer, parser, bytecode emitter (which I folded into the parser; required some code buffering to reorder things like the for(;;) loop head parts and body), interpreter, built-in classes, and decompiler. I had help only for jsdate.c, from Ken Smith of Netscape (who, per our over-optimistic agreement, cloned java.util.Date -- Y2K bugs and all! Gosling...).

            Sorry, not enough time for me to analyze tail position (using an attribute grammar approach: http://wiki.ecmascript.org/doku.php?id=strawman:proper_tail_calls). Ten days without much sleep to build JS from scratch, "make it look like Java" (I made it look like C), and smuggle in its saving graces: first class functions (closures came later but were part of the plan), Self-ish prototypes (one per instance, not many as in Self).

            I'll do better in the next life.

            /be

            • I hear you, and I've been in the "ten days to do the impossible" situation before. The real shame is that it never got fixed up as the language was reimplemented (many times), and now it's more difficult to do than it would have been, say, 10 years ago.

              • BrendanEich says:

                After the beginning came a painful death dance, largo non troppo. Netscape used its IPO mad-money to binge on acquisitions and over-invest in for-diminishing-profits servers, "enterprise groupware" (which killed the Netscape browser as much as MSFT did), and Java, while under-investing in HTML and JS (never mind CSS).

                You're right: it would have been better to make Proper Tail Calls a rule of the language's semantics, an asymptotic space guarantee. Not clear it would have stuck in the market, though. Most JS devs didn't know Scheme and did not write tail-recursive or intentionally tail-calling code.
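
                (Concretely, PTC would make this run in constant stack space; an engine without it just overflows:)

                function countdown(n) {
                  return n === 0 ? "done" : countdown(n - 1);  // a call in tail position
                }
                countdown(1e6);  // with PTC: "done"; without: RangeError or "too much recursion"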

                We have a chance for Harmony to require PTCs, though. See the wiki link in my earlier comment, and please advocate on es-discuss@mozilla.org (mailman subscribe protocol).

                /be

        • lloydwood says:

          32-bit uint? Wait, Java doesn't do uints.

          As far as I can tell as a networking guy, not supporting uints has pretty much doomed Java for network programming, since going the next int size up and masking is way inefficient for parsing protocol headers.

          C still wins.

          • BrendanEich says:

            32-bit uint shows up in several places in JS. The >>> operator (same as in Java); the Array.length property and array indexes (property names that act like uint32 but equate to strings by the standard conversion).

            I'm a C hacker, so you won't hear me arguing back. unsigned is useful for systems programming chores, and not just for parsing packed structs. It's hard to strength-reduce integer arithmetic by hand without something like unsigned (>>> suffices for div, but the bitwise-logicals are not enough for all cases without a sign test and conditional negation).
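
            A few of those places, concretely:

            var a = ['x', 'y', 'z'];
            a[1] === a['1']                  // true: array indexes are string property names
            a.length                         // 3: a uint32, highest index + 1
            (7 >>> 1) === Math.floor(7 / 2)  // true: >>> as a poor man's unsigned divide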

            See http://wiki.ecmascript.org/doku.php?id=strawman:binary_data for something we hope to get into JS, which needs packed structs and arrays for at least WebGL (and better than WebGL's typed arrays in JS, which are like aliased Fortran common vectors -- get me off this Karmic wheel!).

            /be

  3. BrendanEich says:

    JS has a lot of stupid in it. News at 11, as chouck used to say.

    There's hope:

    http://wiki.ecmascript.org/doku.php?id=strawman:bignums

    Convincing all the browser vendors will be tough. Some don't want to do much more than be super-fast at the current version of the language (or the last one).

    IBM voted "no" on ECMA-262 Edition 5 just because we did not jam IEEE754r decimal (which has finite precision, rounds base 10 numbers well compared to base 2 numbers printed in base 10, but lacks hardware support, is slow even on hardware that has support, and is different enough that no one could agree on whether and how to integrate it) into what was supposed to be a "no new syntax" update to the standard.

    The rounding bug is still the most-dup'ed JS bug in bugzilla.mozilla.org. We want to fix it, and right now bignums have a shot. Write your browser vendors!

    /be

    • strspn says:

      My browser vendors are writing me. What should I tell them?

      • BrendanEich says:

        What are they writing to you?

        Ask them for bignums in JS if you agree they would help compared to alternative courses of evolution, and compared to doing nothing for numeric types in JS.

        Heck, ask for anything that you think needs doing. I'm interested in cross-browser evolutionary jumps that the vendors can actually swallow (unlike boil-the-ocean schemes and effectively-single-vendor managed-[native-or-not-]code stacks), so feel free to keep me posted too.

        /be

        • lloydwood says:

          "ask for anything that you think needs doing"? I've been asking browser vendors for an implementation of Javascript that won't crash for over fifteen years.

          Not holding my breath. (Most crashes these days are associated with Flash, which is ActionScript, which is still ultimately your fault, dammit.)

          • BrendanEich says:

            Not crashing is a research problem. We're on it though -- results will be sent backward in time via tachyons as soon as we have them.

            Browsers all crash, often in the other hairy parts of the codebase not implemented in memory-safe languages: HTML, CSS, DOM, crypto, HTTP, img, video, graphics, etc., etc. JS counts as causal for only some of the crash bugs. Even Chrome's process isolation (in other browsers too, variously) doesn't completely save them from lack of memory safety and control flow integrity in the main implementation language: C++.

            If you meant "DoS" instead of "machine crash", that's a different research problem, but we have watchdogs and quotas.

            I can't take direct blame for ActionScript, but sure: that's my fault too. The good parts of JS go back to Scheme and Self, so all credit to Guy, Gerry, David, Craig, et al.

            /be

        • strspn says:

          They usually write to tell me about some new layout thing or a fancy new feature that uses the mouse or the keyboard. I tell them it would be great to be able to upload a microphone recording. They write back about what a great new audiovisual codec they have. It's been going on for more than a decade. Flash finally came through a few years ago, but HTML5 has been spinning its wheels on bidirectional audio for at least a year now.

  4. harryh says:

    You learned about this because of how it recently screwed Twitter, I assume?

  5. lhoriman says:

    I don't get your outrage. Yes, all JavaScript numbers are floats, and no, JavaScript doesn't have a builtin BigInt type. Congrats, you made it through chapter 1 of JavaScript For Dummies.

    Yes it's *nice* that some languages have builtin arbitrary precision integers and automatically convert on overflow. But most don't - not Java, not C/C++, not even Python. Ruby does. The only thing that seems weird is that you expect this behavior.

    There are plenty of weird things about JavaScript, but this isn't on my list. You might want to try this out instead: http://jashkenas.github.com/coffee-script/

    • jwz says:

      1) This is not what passes for outrage, sonny.

      2) It may shock you to learn (from your response, I guess it may actually shock you to learn) that I don't read books like JavaScript for Dummies. I am, in fact, highly ignorant of the ins and outs of Javascript, because most of the time I just don't care.

      But hey, if it makes you happy to have known this piece of pointless trivia before I stumbled across it and decided to point and laugh, then I'm glad to have brought that sliver of joy into your day.

      Seriously, replies that just say "so what, I knew that" are barely above "first porst!" Try harder.

      "Most languages" blah blah blah. With such low expectations, I guess you get the languages you deserve. That's the only think I "expect", not bignums.

      • lhoriman says:

        While I'm enjoying my sliver and you're busy not caring enough about JavaScript to write blog entries on the subject, don't overlook the link to CoffeeScript. It may not read your mind and polish your knob while you code, but it does elevate JS to a real programming language with some significant advantages over other dynamic languages like python and ruby.

        • rane500 says:

          Beyond the fact this is just a fun jwz "WTF" post, I'm mildly amused your suggestion to someone noting a language flaw is to recommend something with a disclaimer that includes "Until it reaches 1.0, there are no guarantees that the syntax won't change between versions."

    • darius says:

      Python does have proper integers. It didn't originally, but that was long ago.

      • lhoriman says:

        Python 3.x ints behave the way JWZ wants (arbitrary precision, automatically scaling up). Python 2.6 (the default on most linux distros and OSX) & 2.7 have separate integers (32 bits, possibly more) and longs (arbitrary precision) and do not automagically scale up if you overflow the normal int size.

        Don't get me wrong, programming languages should hide this crap. But I sure wouldn't expect it in an old language.

        • darius says:
          Python 2.6.1 (r261:67515, Feb 11 2010, 00:51:29)
          [GCC 4.2.1 (Apple Inc. build 5646)] on darwin
          Type "help", "copyright", "credits" or "license" for more information.
          >>> x = 2**29
          >>> x
          536870912
          >>> x**10
          1989292945639146568621528992587283360401824603189390869761855907572637988050133502132224L
          • lhoriman says:

            I stand corrected. Ruby *and* Python get a gold star today.

            • scullin says:

              Not that your bluff hasn't already been called - but what would you consider a not old language? Certainly not Ruby?

              • lhoriman says:

                How about this - why don't you name a commonly-used language other than Python or Ruby that automatically converts integers to arbitrary precision types on overflow?

                I'm correct about C, C++, Java, C#, ObjectiveC, Ada, Modula2, and Pascal. I'm less familiar with VB, Fortran, and Cobol, but I don't think they have magic integer types either. I'd be curious to know what languages set JWZ's expectation that you can pay no attention to the storage characteristics of numeric types, because I suspect he "grew up" using the same statically typed languages I did.

        • teferi says:

          On the other hand, Python ints are all boxed (no fixnums) and heap-allocated. The interpreter pre-allocates 0-100 on startup, I think, which I always thought was pretty grody.

          • gen_witt says:

            It's worse than that: small numbers are de-duped, big numbers are not.

            Python 2.6.5 (release26-maint, Aug 20 2010, 17:50:24)
            [GCC 4.4.3] on linux2
            Type "help", "copyright", "credits" or "license" for more information.
            >>> 0 is 0 + 0
            True
            >>> 256 is 256 + 0
            True
            >>> 257 is 257 + 0
            False
            >>>

        • edouardp says:

          My favourite 30-year-old programming language handles this stuff just fine. It's the modern languages that are made of fail.

    • quadhome says:

      jwz complains about Javascript numbers and you link to a source-level compiler that exhibits the same flaw?

    • Scott Graham says:

      Python does have long.

      More salient to the current conversation, I wrote (most of) JS bigint in http://www.skulpt.org/

      Of course, they're slower than when implemented at C-level, but *shrug*.

  6. wtfwtf_ok says:

    ActionScript 3 has ints, but still *doesn't have integer math*.

    So:

    var x :int = 3;
    myArray[x/2] = "foo";
    // Congratulations, you've just corrupted your array by storing an element at index 1.5
    // Yes. Really.
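
    JS proper has the same trap, along with the usual workarounds:

    var a = ['a', 'b', 'c', 'd'], x = 3;
    a[x / 2] = "foo";  // sets property "1.5"; a.length is still 4
    a[(x / 2) | 0]     // 'b': truncate with |0 ...
    a[x >> 1]          // 'b': ... or shift, for non-negative ints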

    • antifuchs says:

      Wow, that is just... Beautiful. Almost as nice as the bit in FORTRAN that lets you modify the value of constant numbers.

      • scullin says:

        FORTRAN? How quaint. Try this in Python:

        True=False
        True
        ????

        • That's beautiful.
          >>> True = False
          >>> True
          False
          >>> True is False
          True

        • ciphergoth says:

          FWIW, you're not overwriting True there; you're creating a variable in that scope with the same name.

          >>> foo = True
          >>> True = False
          >>> True, foo
          (False, True)

          • Brilliant. And what's the chance that this, of all things, is what the programmer intended? Emit a diagnostic.

            This is why we can't have nice things.

            • gryazi says:

              This is why stdout and stderr are separate streams even though creating a sane default interface to review both at once requires too much cooperation between shell and GUI authors [or technically too much intelligence from either, in UNIXland - #!/usr/bin/javascript should announce exceptions over dbus or Growl if it wants the user to see them, right?].

              (I mean sure, shell scripts + named pipe + [pick save-default-session-feature in your favorite tabbed xtermalike] but when's the last time you bought a car that left you to install the mirrors and the seats yourself?)

              Or are all you rich kids just using Xcode?

        • fanf says:

          I love this C code. It breaks the brain of the reader without breaking the code.

          #define false true
          #define true false

    • nothings says:

      To be fair (sort of), ActionScript 3 was designed to basically be an implementation of the then-proposed Ecmascript 4. So its int/uint handling can still be blamed on Brendan, I guess?

      • BrendanEich says:

        No, AS3 preceded ES4 and was one of its progenitors. AS3 itself was based on Waldemar Horwat's JS2 or "early ES4" from the Netscape days, but with changes Waldemar was not in on.

        See http://brendaneich.com/2007/11/my-media-ajax-keynote/ where, in a later slide, I get into why C-like storage types without C's evaluation (promotion) rules can make type-annotated code *slower* than untyped code.

        AS3 has this bug. It is the performance side of the same coin as the AS3 problem brought up above (the lack of integer division leading to array indexing by a fractional number, which is equated to its canonical string conversion).

        AS3 tries to be like JS and evaluate intermediates using doubles only, yet adds Java-like int storage type honey traps to JS. This design choice is not my fault :-|.

        /be

        • nothings says:

          Yeah, I realized "early ES4", but I hadn't realized exactly the who and what behind all those things, so fair enough.

          I think I learned more of these details two years ago when I first looked into this (as one of the few implementers of an AS3 VM, I kind of wanted to know who to blame), but I didn't retain the details, and the info I was reading back then doesn't seem to be turning up anymore, so this was the best I could come up with now.

          So, aspersions uncast, blame unassigned, sorry!

  7. I'll get cut down for this, but...

    Ok, so the boxing is just bizarre, but I would argue that the choice of having just doubles is the correct one. They're typically supported on a lot of hardware that doesn't have native 64-bit integers. There, it has the interesting effect that the largest machine int is actually the 53-bit integer range of doubles. So, if you have to pick just one, for simplicity, doubles are a counterintuitive but inspired choice.

    Why would you want to pick just one number type? Well, it avoids a *lot* of problems. In larger environments you can deal with all the (conv|co|av)ersions any (if not all) ways you like, but in the pocket class, you really want one and only *the right* one.
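
    And if you do go doubles-only, the single guard you need is cheap. A sketch (essentially what ES2015 later standardized as Number.isSafeInteger; the function name here is made up):

    function isExactInt(n) {
      return typeof n === "number" &&
             Math.floor(n) === n &&
             Math.abs(n) <= 9007199254740991;  // 2^53 - 1
    }
    isExactInt(Math.pow(2, 53) - 1)  // true
    isExactInt(Math.pow(2, 53) + 2)  // false: past the exact-integer range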

  8. vordark says:

    Any time I see the earnest defense of modern languages, I'm reminded of this quote from the UHG:

    "I liken starting one's computing career with Unix, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. But, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act." -- Ken Pier, Xerox PARC

    Pretty much sums up life after C.

  9. ch says:

    all language designers need to read CLtL at the very least.

    need I mention Python?

    • BrendanEich says:

      (I'm like a Yankees fan on Red Sox turf, but whatever.)

      Scheme was the ideal admired by the JS supporters at Netscape, and the trick used to recruit me. But (see above) the "look like Java" orders and the intense time pressure combined to take out a lot of good ideas from Scheme that would've taken longer (not much longer, sigh) to implement. One example: just the C-like syntax makes tail calls harder (or less likely: consider falling off the end of a function returns undefined, so the last expression-statement before such an implicit return is not in tail position).
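
      (The syntax problem in two lines:)

      function f() { g(); }         // not a tail call: an implicit `return undefined` follows g()
      function h() { return g(); }  // a tail call: nothing remains for h to do after g returns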

      On the upside, JS commands growing academic attention (Web 2.0 trickle-down economics plus JS's browser ubiquity with open source implementations and large user-testing communities), as demonstrated by everything from our practical PLDI 2010 paper on TraceMonkey, to pure semantic modeling such as

      http://www.cs.brown.edu/~sk/Publications/Papers/Published/gsk-essence-javascript/

      The Ecma TC39 standards body benefits from this: Shriram and Arjun came to the July meeting, Cormac Flanagan and Sam Tobin-Hochstadt are regulars, Tom Van Cutsem (AmbientTalk, and of course the new Proxies for JS) is attending, and long-time Scheme (and JS) semanticist Dave Herman finished his PhD at Northeastern and works at Mozilla now.

      This doesn't excuse anything from 15 years ago, but it is a positive development. I'm sure Schemers and Common Lispers would agree.

      /be

  10. edm says:

    FYI, this post was linked as a 'Development Quote of the Week' by this week's Linux Weekly News. It's behind a paywall for one week, but then open to the public.

    Ewen