It's such a strange article, in that it's mostly favorable to my point of view but with such a breathless amazement to it, like he's just discovered an actual unicorn or something. "Look, everybody! Here's a hacker who actually accomplished things and yet he doesn't fetishize the latest fads that I and all of my friends make our living writing about!" There's this tone to the thing like he just can't imagine that someone like me can exist. He's impressed but he doesn't really believe in it, this mythological creature he's discovered. And of course the whole "duct tape" thing is vaguely insulting, and a perfect example of what we call "damning with faint praise".
So I guess to the extent that he puts me up on a pedestal for merely being practical, that's a pretty sad indictment of the state of the industry.
In a lot of the commentary surrounding his article elsewhere, I saw all the usual chestnuts being trotted out by people misunderstanding the context of our discussions: A) the incredible time pressure we were under and B) that it was 1994. People always want to get in fights over the specifics like "what's wrong with templates?" without realizing the historical context. Guess what, you young punks, templates didn't work in 1994. They also like to throw stones at Mozilla, and how much 4.0 sucked and how mozilla.org decided they needed to rewrite it all in 1999, so that jwz code must not have been any good, right? The peanut gallery always fails to understand that I was talking about an entirely different code base that pretty much ceased to exist by early 1996, thanks to the (at the time completely unwarranted) Collabra rewrite, and that has never been seen by the outside world.
Around 1998 I pushed for Netscape to open source both the 3.0 and 4.0 code bases, since the 3.0 code base was the one that included a mail reader that actually worked, but they wouldn't let me do it.
Peter wrote his own response to Joel's article that goes into more detail with some more excerpts from the book.
I really enjoyed reading Peter's book, by the way. (The parts that I'm not in, I mean.) You should buy it.
<3
Yeah, that article wasn't one of Spolsky's better efforts. He goes from "duct tape programmers understand ... that any kind of coding technique that's even slightly complicated is going to doom your project" to the conclusion that only insanely talented programmers can work that way. Durr what? Shitty programmers are better off using multithreaded templated C++? Only sooper geniuses like jwz can work with the simplicity of single-threaded C?
Also, I doubt Spolsky's seen any of your code (he would know that you are now hard at work selling beer instead of hard at work inventing the future), so I'm not sure how he concludes that you're exceptionally talented. It seems to me more like you use foresight and pessimism to avoid getting into situations where you need to demonstrate exceptional programming ability.
Absolutely no offense, even by faint praise, intended.
"Use foresight and pessimism"
This needs to become some sort of meme metric.
The formulation owes something to a pilot's maxim: "a superior pilot uses his superior judgment to avoid having to exercise his superior skill."
I love this.
Oh, amazing.
Maybe if people were killed by bad software more often we'd have a motto like that. Wait, scratch that. More obviously.
But that's just Joel, right? He gets all dramatic.
I love Peter Seibel's writing. I ripped off the word "concision" from him, from Practical Common Lisp.
Beats me. I don't read Joel's blog because I couldn't care less about that kind of software-industry navel-gazing. (Unless it's about meeeeeeeeeeeeee)
Why did you push to open source the code?
-t
I think Joel is generally a pretty smart guy. In this case there seem to be two things that stick out:
A) he has an idea about who his audience is, and he is writing squarely at them. Hence the breathlessness of the style, as that's probably what they would be thinking, even if, personally, Joel would probably have a quite different view.
And B) "Duct-tape programmer" is just an unfortunate choice of words. He could have used something else like "Old-Fashioned Programmer", "Customer-Driven Programmer" or "Result-Oriented Programmer" (or a million others), and gotten exactly the same point across without the slightly insulting tone. Or maybe, somewhere in Joel's past, he really built a respect for duct tape and didn't think it made you sound like you wrote Netscape 1.0 in Perl. The entire article reads quite differently if you simply :s/Duct-Tape/Customer-Driven/g
Hey, and nice work on Netscape, if I haven't mentioned that before - 0.91 was what made me jump over from Mosaic on my trusty little Indigo.
B) They used to have a term/humor for this: real programmers don't eat quiche.
I got the impression that was more about a faux-macho "edit the inodes by hand using magnets" mentality that was the absolute opposite of pragmatism.
But what is "pragmatic"? As JWZ replied, "it was 1994". You have to choose to build with tools that not only work better in theory, they can also work in practical terms with that day's practical capabilities and horsepower. I probably burn more clock cycles animating a mouse cursor today than I used in my whole degree. Coding approaches that are "obvious" today are like templates were back in 1994, GUIs in the '80s, JPA / Hibernate and full-GUI continuous integration acceptance tests last week / next week.
Back when it was written much of the line-by-line humour of the Quiche Programmer stuff was about toggling your own bootloader, but the reason behind it was also the sheer impracticality of doing anything useful with Pascal (cue Borland, moments later) and the grim hack of fitting embedded C into the miserable 37½ bytes of space you had available.
If I were making an argument about programming languages or performance/space hacks, that would be relevant. But stuff like "Real Programmers don't comment their code—if it was hard to write, it should be hard to understand and harder to modify" is just chest-beating dickishness and has nothing to do with where you sit on the scale of something-simple-that-works-now-with-what-tools-we-have vs. something-baroque-which-will-be-beautiful-tomorrow-honest.
Actually, the term was hacker.
It's been so diluted/transmogrified that I've seen web developers coding PHP throw the word around without a second thought. My inner Moon cries...
I finished reading 'The Tipping Point' last night. From the conclusion:
Band-aids are short-term fixes to self-repairing systems.
Not a good idea on bridges, or even web browsers.
Looking at it from the outside, it was pretty obvious that 3.0 was 2.0 but finished being written, and 4.0 was the disastrous project which never shipped. Not knowing anything about Collabra, my impression of how it never shipped was that it was founded on goofy architectural concepts (Brendan Eich's seemingly meaningless statements about architecture greatly contributed to this impression) and that the company decided all of a sudden to be 100% compatible with every 'standard' in existence, and that the w3c was suddenly a legitimate standards body. Microsoft of course then sent a bunch of people to the w3c and bloated up the standard with every conceivable piece of garbage they could imagine, and presto, the browser never shipped. Of course, this story isn't actually incompatible with the Collabra story in any way, but I don't have any clear evidence for it, it's just my general impressions.
I get the impression that my coding style is simultaneously even more simplistic than yours in terms of technique and far more rigorous in terms of engineering discipline than what the navel-gazers engage in. It might make Spolsky's head explode. His writings always come across as deeply flawed to me, to the point where the best response is to make fun of them.
Joel likes homely analogies... it's his shtick. Half the article isn't so much about you as against what he calls architecture astronauts.
The last paragraph doesn't fit with the rest... I think what he's trying to say is that not doing unit tests is something you can't get away with unless you're very good.
Joel's regular indictment of architecture astronauts is why I read (and share) his articles. I've been subjected to more than my fair share of astronautics, adding unnecessary complexity, slowing down work on the project, making debugging more difficult and, worst of all, making the app significantly slower than necessary, so I greatly appreciate this sort of thing. Even if it is a bit over the top.
He's also against needless rewrites—say, of Netscape.
Notably, though, both of these articles are from eight years ago. His good ones from more recent years have been on more managerial topics.
There is no such thing as over-engineering. Optimal engineering is as simple and elegant as possible.
I hear a lot of things that are overbuilt being called over-engineered when it appears they have not been engineered at all. Sure, someone may not have trusted their calculations and put in a massive factor of safety, but more likely they used excess material because they didn't know how to engineer the project mathematically.
The same goes for complication. If someone gave you directions to a destination where the route was 10 miles and had twelve turns, and you could get there with a direct route of three turns and 8 miles, would you say the former was "over-navigated"?
But the difference is the over-engineer would then brag about all the sites his route can take you to, even if you just wanted to get to the end as quickly as possible. I think over-engineering tends to mean things like solving problems that don't exist.
I can over-engineer faster than most of our code bricklayers can over-build. There's nothing wrong with an "over-engineered" solution that means someone with A Clue spent a day locked behind a door with a whiteboard and didn't "write any productive code", when it then saves the grunt coders a week of doing it in a slower way with more bricks and QA a month of not having to kick so many bugs out of it.
Over-engineering, I think, refers to the act of coming up with solutions that are far more complex than the task requires or the situation requires. The argument (which I agree with) is that as a general rule, unnecessary complexity = bad.
I don't think anyone is making the argument that one should just sit down and bang out code all the time without ever doing some planning. That would be just plain silly, too.
Many people are making exactly that kind of argument.
Stupidity is one culprit, pointy-hair another and Scrum-done-wrong is flavour of the week for causing it.
Ah, you're right of course, there are many idiots in the world, and some of them happen to work in our field.
What I meant by nobody was nobody in this particular debate, which may still be wrong. Spolsky and the comments I've seen have not been making that argument. But I'm sure plenty of other people are. :)
Yes, I'm just bitter(er) and tired(er) of defending "design" and "architecture" from a seemingly increasing pressure of "I want everything done now and perfectly". A lot of this comes from the world apart from the "duct-tape coders" of either the Spolsky or JWZ schools. The trouble is that it's not commercially practical to use coders of the calibre who can expand a tip calculator until it reads email - there aren't enough of them, and we aren't allowed to pay them enough to tempt expert coders away from their bar jobs (that isn't a joke, locally).
So "software development" mostly becomes the cat-herding problem of getting acceptable (if not astounding) work out of a roomful of average coders with marginal motivation (a meagre wage that's just enough to make them want to code for you, not enough to make them want to deathmarch for you). The feature list is too long to allow any one übercoder to simply type it all in, let alone code it, and that means learning to delegate to joe-curlybrace and his chums, who are on average, average.
This town used to have a HP research lab in it, now we have to work with minimum wage grunts who not only can't run the damned unit tests before shipping, they don't even check that it compiles. I'm considering a move to dentistry.
Yeah, I'd agree. For most of us, "over-engineered" means something like a near-rewrite of a full HTML-rendering web browser, when all the spec called for was a simple way to display a .txt-format "readme" file.
When I hear that term I think of Rube Goldberg solutions, or endless debates about MVC vs. MVP patterns when the project doesn't call for anything nearly that complicated, and probably never will even if it grows up into something far more than it currently is. It's time wasted planning for something you will never need, and worse, insisting that the coders write code for it.
Ample or adequate engineering I'm all for. Saves us all a lot of trouble. The problem is when you go too far and start heading into the land of YAGNI, and work that ends up amounting to so much yak shaving (or worse yet, forcing others to shave the yak for you because YOU are the vaunted engineer).
That is the common usage, but solving problems that don't exist (and adding costs) is worse than the optimal case, and means that the engineering was sub-par. I would call it under-engineered, because the lack of knowledge created the situation.
Reading JWZ's and Eich's account, via Peter and JWZ: this doesn't sound like a duct-tape v. over-engineering problem at all. That seems like Joel trying to retrofit his issues onto the situation.
The stories seem to add up to the original Netscape team buying into the "ship in 6 months" business case, and skillfully Getting It Done, followed by the Collabra folks rejecting any business case that contradicted what they wanted to do, and then doing some relatively incompetent shit.
The Collabra folk didn't, per these accounts, "over-engineer" anything. I would describe it as *under*-engineering. They were soft on requirements and budget and soft on software architecture. They had (again, per these accounts) the vague idea of a cleaned-up, hecka-slick, code-base-for-the-ages but they didn't have the skills to recognize one when they saw it and didn't have the business sense to know how to manage long-term development.
The reason it irks me to call the Collabra effort "over-engineering" is that it gives development with a longer term perspective a bad name. There is a time for "get it out in 6 months or die trying" but if that's all you ever do, you're going to either die anyway or cause big problems for the world by releasing an unreliable system that becomes massively and critically deployed.
Good things do come from competent examples of patient, thoughtful, longer-term-thinking engineering. But there are enough cases of things like the Collabra disaster that the business side is highly suspicious.
That gives rise to a lot of what I guess we could dub "masking tape and bubble gum programmers" who get mistaken for gaffers ("duct-tape programmers") and who all too often manage to win the confidence of management by pointing at someone taking a more prudent approach and saying "Eh, see boss? They're *over-engineering*. Me? I already got something working!"
-t
Joel's article prominently features a link to a book at his Amazon store. I'm sure JWZ's referer is showing up in someone's logs. I've enjoyed both of their writings for many years ... in this case though I'd say these threads were Dvorak'd. :(
Thanks... "it was 1994" puts Spolsky's no-unit-test fetish in a new light... to the point that it is absolutely dishonest. In 1994 Debian was 1 year old, a kernel build took a LONG FUCKING TIME (my 486 lasted a few years), and I didn't know shit about any of this because I was playing Doom 2 all day on a 4-line BBS on my 486.
When it comes to Coders At Work, I also liked the parts that you weren't in!
Yeah yeah, I'll set 'em up, you knock 'em down.
I was going to say something earlier about Netscape 4 inexplicably seeming perfectly fine on OS/2 (was it the similarity to Windows that helped?), but instead I will note that I made the mistake of impulse-buying an "Object-Oriented Analysis and Design" book [don't ask] and the 'big lesson' of its first chapter is to write to the point of usefulness first, then go back and attempt to mush your code into something-resembling-an-object-model that isn't complete spaghetti later.
Also, it's all in Java. And Java didn't have enums until last year, apparently.