
I was telling someone about this the other day, and it's time that Google said something other than "Your search - most-positive-bignum - did not match any documents", so here it is.
Background:
In Common Lisp, there are two types of integers: small ones, which are what you expect; and "bignums", which the specification says should have no upper limit on their range. Fixnums are immediate values (not objects) and bignums are compound (allocated) objects. All the math functions work on either, promoting results to bignums as needed.
So, there is a constant called MOST-POSITIVE-FIXNUM, which is the largest representable "immediate" integer. C calls this MAXINT, and in a C world it's generally 2^31-1. Because of the type-tag bits on every word, in Lisp it's typically smaller than that. On TI Explorer Lisp Machines it was 2^24-1 (32-bit word, 7 bits of tag).
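If you've never watched the promotion happen, here's a minimal sketch that should work in any modern Common Lisp (BIGNUM is a standard type, defined as any integer that isn't a fixnum; printed results may vary by implementation):

    (typep most-positive-fixnum 'fixnum)       ; => T
    (typep (1+ most-positive-fixnum) 'fixnum)  ; => NIL
    (typep (1+ most-positive-fixnum) 'bignum)  ; => T
    (1+ most-positive-fixnum)                  ; same arithmetic, now a boxed object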
Here Comes the Dumb:
Well, I was poking around in the system's basement one day, and realized that their implementation of bignums did have an upper limit! A bignum was implemented as an array, with no facility to tack on a second array, so the limit came from the size of the length field in the array header rather than from available memory.
So I went and consed up MOST-POSITIVE-BIGNUM.
I did this by carving out a chunk of memory of the proper size for the underlying array, filling it with the right pattern of ones, and slapping an appropriate type tag on the front so that the system would recognize it.
(These are exactly the sorts of tricks that Java goes out of its way to prevent you from doing, and that's part of why it's impossible to write efficient programs in Java. Whereas the whole Lisp Machine operating system was written in Lisp.)
Fun facts:
- It is 1,223,146 decimal digits long (that's 1.0e1223146). The sketch after this list shows where these numbers come from.
- The object representing it consumes 524 KB of RAM.
- The largest known prime passed it in 1999.
- If you add one to it, as in (1+ MOST-POSITIVE-BIGNUM), you get zero. But it's a zero that consumes 524 KB, and is not = to 0.
- If you ever tried to print it (for example, by having it be the return value in the debugger...), a TI Explorer would lock up in a tight loop in microcode, from which only a warm boot would recover. That microcode loop was trying to grind the number into decimal. That process took more than three days.
- Printing it as a roman numeral took pretty much the same amount of time.
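For the curious, those numbers fall straight out of the header format in the code below. Here's a back-of-the-envelope sketch, runnable in any Common Lisp; the 17-bit length field and the 31 data bits per word are the Explorer's, not anything portable:

    ;; 2^17 - 1 data words, each holding 31 significant bits:
    (let* ((nwords (1- (expt 2 17)))           ; 131,071 data words
           (bits   (* nwords 31)))             ; 4,063,201 bits of magnitude
      (list (ceiling (* bits (log 2.0d0 10)))  ; => 1223146 decimal digits
            (* 4 (+ 2 nwords))                 ; => 524292 bytes, the "524 KB"
            ;; and all-ones plus one wraps to zero in a fixed number of words:
            (mod (1+ (1- (expt 2 bits))) (expt 2 bits))))  ; => 0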
And finally, here's some code that doesn't run on any computer that has been manufactured in two decades.
(Coincidentally, it looks like I wrote this code eighteen years and four days ago. I guess I should have posted this on Friday.)
;;; -*- Mode:Lisp; Syntax: Common-Lisp; Package:USER; Base:10 -*-
;;; 21 Mar 90 Jamie Zawinski Created.

;;; When you load this file, the constants MOST-POSITIVE-BIGNUM and
;;; MOST-NEGATIVE-BIGNUM will be defined.
;;;
;;; These are the absolute largest and smallest numbers which can be
;;; represented in the TI Explorer's memory architecture.
;;;
;;; WARNING: if you try to print these numbers, the microcode will
;;; hang.  They are totally useless quantities, and dangerous to
;;; have around.  You can examine them with
;;; (sys:dump-memory most-positive-bignum :bignum-is-dump-object t)
;;; and perform normal arithmetic operations on them.  But the same
;;; dangers apply to any numbers this large.

(defun make-most-positive-bignum (&optional minusp)
  (let* ((header-word 0)
         (nwords (1- (expt 2 17)))
         new)
    ;; Construct a bignum header word.
    ;;  31      29          24       22                 18     17          0
    ;; +-------------------------------------------------------------------+
    ;; |CDRcode | DTPheader | unused | hdr-type-bignum | sign | datalength |
    ;; +-------------------------------------------------------------------+
    (setq header-word (dpb sys:%header-type-bignum (byte 3 19) 0))
    (setq header-word (dpb (if minusp 1 0) (byte 1 18) header-word))
    (setq header-word (dpb nwords (byte 17 0) header-word))
    (without-interrupts
      ;; Allocate a chunk of memory, with the header word at the front
      ;; and a pointer to NIL in every other location.
      (setq new (sys:%allocate-and-initialize
                  sys:DTP-EXTENDED-NUMBER  ; pointer dtp to return
                  sys:DTP-HEADER           ; object dtp to store
                  header-word              ; "pointer" field contents
                  nil                      ; word n+1
                  sys:DEFAULT-CONS-AREA    ; area
                  (+ 2 nwords)))           ; total data length
      ;; Stomp on the first word after the header to be a lit-up bignum data
      ;; chunk.  All bits are on except bit 31 (the msb).  Since fixnums are
      ;; only 25 bits, we do this in two steps.
      (sys:%p-dpb-offset #b0111111 (byte 7 25) (sys:%pointer new) 1)
      (sys:%p-dpb-offset -1 (byte 25 0) (sys:%pointer new) 1)
      ;; Now replicate the N+1 byte to addresses N+2 to N+length.
      (sys:%blt (sys:%pointer-plus (sys:follow-structure-forwarding new) 1)
                (sys:%pointer-plus (sys:follow-structure-forwarding new) 2)
                nwords 1))
    new))

;; #, to eval at load-time so that the binary isn't monstrously huge...
(defconstant MOST-POSITIVE-BIGNUM '#,(make-most-positive-bignum))
(defconstant MOST-NEGATIVE-BIGNUM '#,(make-most-positive-bignum t))

;;; If you want to play with these, this might help.  Prevents huge bignums
;;; from being printed.
(sys:advise sys:print-bignum :around safe-biggestnums nil
  (cond ((eql (car sys:arglist) most-positive-bignum)
         (write-string "#.MOST-POSITIVE-BIGNUM" (second sys:arglist)))
        ((eql (car sys:arglist) most-negative-bignum)
         (write-string "#.MOST-NEGATIVE-BIGNUM" (second sys:arglist)))
        ((> (integer-length (car sys:arglist)) 10000)
         (write-string (if (minusp (car sys:arglist))
                           "#<massively-negative-bignum>"
                           "#<massively-positive-bignum>")
                       (second sys:arglist)))
        (t :DO-IT)))
I love it when you bust this shit out. Seriously. No one I work with has written in anything earlier than Java, except That One Guy, and *do not* get him started talking about Dylan.
The nostalgia... I vaguely remember doing some Z80-coded bignums (or eqv) for a CP/M Scheme as a project for Hal Abelson. Back in the day, y'know. Later I had a Lisp Machine at Symbolics. I remember so little.
I had a student do an assignment in lisp the other day. I made happy noises! But I wasn't allowed to give them extra marks for it.
I think you are allowed, so long as the extra marks are parentheses.
#< massively-positive-bignum > is funnier still.
Did you ever try this in CMU Common Lisp?
It might have worked when we had the BiBOP scheme.
But probably not after we went to low-end tags in the runtime, as arrays could be 2^29-1 in length. I think that was after your time.
No, I never tried this in CMUCL. I never tried it in Lucid CL either, and I can't imagine why... Lucid had high-bit tags too.
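For whoever wants to check their own implementation, two standard constants tell most of the story (the values noted here are just illustrative; they vary by implementation and word size):

    (integer-length most-positive-fixnum)  ; bits left after the tag, e.g. 62 on 64-bit SBCL
    array-dimension-limit                  ; the modern descendant of that 2^29-1 ceiling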
Way to remind me that I need to get my opengenera environment set up (at least I finally have a 64-bit CPU to use with it).
Your mention of both languages in the same paragraph makes me wonder which would be the more pointless exercise: writing a Lisp in Java, or a JVM in Lisp...
The former has already been done. I'm not so sure about the latter, but CL is a great language to build compilers in. 'Course, that doesn't answer the *why*.
Really? The last time I went Google-diving for Java+Lisp, I got almost nothing. What few things I did find were mostly half-finished student projects (or seemed to be, at any rate).
ABCL is the first hit if you google "java common lisp" :-)
Eh, I was purposefully-vague as to how long ago I'd last checked...
The latter has also been done. It runs (for slow values of "runs") in SBCL, and is good enough to start Eclipse (-:
Check out http://groups.google.at/group/comp.lang.lisp/msg/3ae0cc36d70cfb3c for the lulz.
One minor addition to that: Cloak (the JVM in CL) also runs ABCL, for maximum silly recursiveness: http://paste.lisp.org/display/2096
Hah!
That's awesome/frightening.
I can't imagine it'd be very hard to take any Lisp compiler and retarget its backend to emit Java bytecode...
I was thinking more in terms of a Lisp interpreter written in Java, using their scripting interface introduced in JDK6. Embedding/scripting in Java apps using Lisp, rather than Groovy or Jython.
Armed Bear Common LISP started out as an interpreter, although it now has a LISP-to-JVM bytecode compiler. The interpreter might still be lurking in there.
http://armedbear.org/abcl.html
You can probably find more LISP (or at least scheme) interpreters for Java in this list: http://www.robert-tolksdorf.de/vmlanguages.html
Back when I considered recreational programming "fun", I wrote a small IDE-type environment that had plugin modules for different interpreted languages, so you could open a Groovy listener right next to a Common LISP listener right next to a BeanShell listener, etc., etc., etc. There are far more languages implemented in Java now than there were then.
At the now-dead startup I worked at in 2000, we had an embedded Scheme interpreter in the server, everything written in Java.
Yeah, I know it's not Lisp, but it's got lots of parens.
There's a big ugly list of languages for the JVM that includes a bunch under the Lisp header.
http://www.unlambda.com/nevermore/
Who wants to try it on a modern machine?
I think LISP is what broke me. Trying to deal with that was what made me decide I was not cut out for programming for a living.
neat! I vaguely understand.
How old are you?
MOST-POSITIVE-LIVEJOURNALLER-AGE
I see your Wikipedia is broken...
aren't you the sweet one! Thanks.
Everybody knows Wikipedia is made of lies.
And bees.
Lies and bees.
The rate at which The Google indexes these things is really beginning to frighten me.
24 is the biggest number: http://youtube.com/watch?v=f3ek85X2uOE
No, it's 48,000,000,000: https://www.youtube.com/watch?v=Pj2NOTanzWI
I did "b := 10 raisedTo: 1223146" in the Smalltalk I have on my machine, and it took about three minutes to complete. Then I tried opening an inspector on it to check it was correct - that was several hours ago, and it still hasn't come back to me (I'm pretty sure it's converting it into a decimal display string in there somewhere).
I'm actually really impressed that your Lisp 18 years ago could handle all this, and that converting to a decimal string only took three days!
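The same experiment is easy to rerun today. Here's a hypothetical one-liner for any modern Common Lisp; I haven't benchmarked a particular implementation, but on current hardware the radix conversion should take seconds to minutes rather than days:

    (let ((b (expt 10 1223146)))            ; the Smalltalk number above
      (time (length (write-to-string b))))  ; => 1223147 digits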
Lisp is next on my "important languages to learn" list, which I am processing in reverse chronological order...
Oh man. After all these years of thinking of you as a fixture in the club scene, I sometimes momentarily forget what a pure, shining geek you are at heart. Especially for a former art student. Thing is, the geekitude has really infected me. I think of myself as only barely conversant in the mysterie of Geeque, that I am only able to get a few passing jokes after years and years of osmosis, having married one and having befriended so many.
Yet knowing how engaging you are when you speak about good hacks, I actually read this code, all the way through, just to see what you commented out.
Some artist. I am a code-reading geek. I suck.
Doomed, doomed, doomed.
The idea of DANGEROUSLY-HUGE-VALUE is very funny to me. But.. where did you pull this code from, anyhow? A floppy stuffed in a closet shoebox labelled "1990 - Lisp hax"? No.. you're the kind of guy that takes advantage of exponentially increasing storage sizes to archive anything you've ever produced, aren't you?
I don't know, but I have his old lisp machines. And actually, I need to get rid of them now along with the other ones I have since I need to move. Want one?
Have you tried either of the Explorer emulators? Meroko and Nevermore?
Before you give away the machines, do you think you can find a way to extract disk images of the partitions (file systems as well as load and microcode bands)? I think that if someday any of these emulators work, I'll be wanting those...
Btw, these machines are now in the hands of my friend Devon McCullough, who turned me on to lispms and aaa terminals in the first place. I haven't talked to him in 6 months or so but last time I chatted with him he was in the process of reading in the explorer and symbolics boot tapes I gave him along with the machines. I should check in with him to see how that's going.
Meroko kinda works for a little while before it crashes. The CADR emulator seems to hold up for a little longer. As far as I can tell nobody's done any work on them for the past 3 years, though.
If you are looking to get rid of Lisp Machines, please let me know... I'm in the Bay Area.
The C standard, which was also published about eighteen years ago, actually names this constant INT_MAX, not MAXINT. Usually MAXINT is defined elsewhere, with #define MAXINT INT_MAX.
That would be the new-fangled ANSI C standard, yes? MAXINT dates to at least V7 UNIX (and is the general term in other languages and contexts, too).
I actually enjoyed reading the comments in the code. They're usually so cryptic. You have a great career in store for you as a technical writer.