Do ... do you miss Lisp? I'm looking at a Lisp book right now that thanks you in the preface.
What book?
I do, but not that much because I don't really have to -- as I wrote a little while back,
I do think it's kind of awesome, though, that my team totally won and these days basically all languages have the Lambda Nature. Remember when I used to have to argue with people about how garbage collection was a good idea, bounds checking was a good idea, intrinsic types were a good idea, and everyone disagreed, I mean, you had to actually argue with people about that, but now every language has garbage collection, bounds checking and intrinsic types? Ha ha, screw you guys.
Hey, pop quiz! Which of these languages is the best Lisp dialect?
1. Java
2. Objective C
3. Perl
4. PHP
5. Emacs
The answer, of course, is Perl, because it's the only one of those languages that has both lexical closures and first class anonymous functions. And how sad is that?
And relatedly,
Also I'd like to point out again that nearly every security bug you've experienced in your entire life was Dennis Ritchie's fault, for building the single most catastrophic design bug in the history of computing into the C language: the null-terminated string. Thanks, Dennis. Your gift keeps on giving.
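To make that concrete, here is a minimal sketch of the failure mode (purely illustrative, not anyone's real code): the only record of a C string's length is the position of the next zero byte, so every copy has to trust the data to terminate itself.

```c
#include <stdio.h>
#include <string.h>

/* Purely illustrative: 'name' arrives from somewhere untrusted.
 * strcpy() copies until it happens to find a 0 byte; the size of
 * 'buf' is never consulted, so anything longer than 15 characters
 * plus the terminator tramples whatever sits next to it. */
void greet(const char *name)
{
    char buf[16];
    strcpy(buf, name);              /* no way to bound this copy */
    printf("hello, %s\n", buf);
}

int main(void)
{
    greet("world");                 /* fine */
    /* greet(<a few kilobytes of attacker-supplied data>) -- not fine */
    return 0;
}
```

A length-prefixed representation carries the size along with the data, so the copy can be checked; the null-terminated convention makes every caller reconstruct that information by hand, forever.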
I imagine Dennis is laughing his arse off in the afterlife right now, since this year brought us one of the most catastrophic security flaws in the history of computing that not only directly exploited data types stored as length-value, but made the perfectly well written bounds check a vital part of the exploit.
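Presumably that means the OpenSSL heartbeat bug. A simplified, hypothetical sketch of the length-value failure mode (names and layout are illustrative, not OpenSSL's actual code): the copy is perfectly bounded, just by the length the sender claimed rather than by what actually arrived.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical length-value record. */
struct heartbeat {
    uint16_t claimed_len;           /* length field inside the message   */
    size_t   received_len;          /* payload bytes that really arrived */
    const unsigned char *payload;
};

unsigned char *build_reply(const struct heartbeat *hb)
{
    unsigned char *reply = malloc(hb->claimed_len);
    if (reply == NULL)
        return NULL;
    /* Bounded copy, but bounded by the sender's number: up to ~64 KB
     * of whatever lies beyond 'payload' in memory is echoed back.
     * The fix is to reject the record unless claimed_len <= received_len. */
    memcpy(reply, hb->payload, hb->claimed_len);
    return reply;
}
```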
I still blame BCPL's '*Z' for giving Richie the idea, and I'm still sure he can't have been the first to write a null-terminated string library. But maybe he was.
Ritchie*
I don't feel as bad about misspelling his name as I thought I would.
Just because he didn't come up with the idea doesn't mean it wasn't a terrible idea. He's the one who chose to build C around that model. It is his legacy we live with, not BCPL's.
It's Peter Norvig's Paradigms of Artificial Intelligence Programming (1992). The rights reverted to him and he's sharing it freely; I'm helping convert the OCR'ed text to Markdown. I mentioned the preface, which hasn't been cleaned up yet.
I'm writing an operating system for the Raspberry Pi from scratch, for reasons, and using C, because why not, and it's amazing the amount of breakage you can cause in so few lines of code.
This doesn't even include when I tried to print the value of the reset IRQ vector (located at address 0) and gcc decided I was dereferencing NULL so felt free to unleash the nasal demons.
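For anyone who hasn't been bitten by this: address 0 is C's null pointer, so reading it is undefined behaviour and GCC is allowed to assume it never happens. A sketch of the trap and the usual bare-metal workarounds, assuming a plain load from the vector is what's wanted:

```c
#include <stdint.h>

/* On the Pi the ARM reset vector sits at physical address 0, but to
 * the compiler that address is the null pointer, so this read is UB
 * and GCC may delete it along with anything that follows. */
uint32_t read_reset_vector_naive(void)
{
    uint32_t *vec = (uint32_t *)0;
    return *vec;                    /* nasal demons authorised */
}

/* Common workarounds: make the access volatile so the load cannot be
 * elided, and build with -fno-delete-null-pointer-checks so GCC stops
 * treating address 0 as unreachable. */
uint32_t read_reset_vector(void)
{
    volatile uint32_t *vec = (volatile uint32_t *)0;
    return *vec;
}
```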
It's a step up from assembly, and better than most everything else in its class. Try Bourne sh's uppercase C macros.
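For the uninitiated, those macros were real. An approximation (from memory of Bourne's mac.h; the exact definitions varied between versions) of the preprocessor being used to make C read like ALGOL 68:

```c
#include <stdio.h>

/* Approximate reconstruction of a few of the mac.h macros. */
#define IF      if(
#define THEN    ){
#define ELSE    } else {
#define FI      ;}
#define WHILE   while(
#define DO      ){
#define OD      ;}

int main(void)
{
    int i = 0;
    WHILE i < 3 DO
        IF i == 1 THEN
            printf("one\n");
        ELSE
            printf("%d\n", i);
        FI
        i++;
    OD
    return 0;
}
```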
I don't know. My other current project is writing Apple II demos in 6502 assembly language, and that's giving me a lot less trouble than the bare-metal C programming.
Though the 6502 programming is made easier because I develop in an emulator, so I have decent debugger support.
The most recent C issue wasn't even technically C's fault. On different models of Pi the DRAM apparently initializes to different values (all 0s vs. all 5s), so the code was only accidentally working in some cases.
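If the code was, even indirectly, counting on fresh RAM being zero, the usual defence is to clear it yourself before main() runs; the C guarantee that zero-initialized statics start at zero is something bare-metal startup code has to provide. A sketch, assuming the linker script exports symbols named __bss_start__ and __bss_end__:

```c
#include <stdint.h>

/* Provided by the linker script; these particular names are only a
 * common convention, assumed here. */
extern uint8_t __bss_start__[];
extern uint8_t __bss_end__[];

/* Called from the reset handler before main(): fresh DRAM can come up
 * as 0x00 on one board and 0x55 on another, so anything that needs to
 * start at zero has to be zeroed explicitly. */
void zero_bss(void)
{
    for (uint8_t *p = __bss_start__; p < __bss_end__; p++)
        *p = 0;
}
```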
I wonder when the first non-bare metal C was.
Assuming by 'bare metal' you mean 'not relying on OS facilities' then the first non-bare-metal C compiler was presumably the first C compiler, which would have been developed on the Unix that existed at the time.
I meant C executed by a virtual machine of any kind, but your definition is probably more widely accepted, and probably means 1972.