geek check

I just read "Understanding Object Oriented Programming" and it's a hoot --

Every if and every switch should be viewed as a lost opportunity for dynamic polymorphism.

Until I got to the last page, there was no doubt in my mind that this was a hilariously deadpan satire. I'm going to continue to believe that because I have to believe that.

Tags:

41 Responses:

  1. torgo_x says:

    Until I got to the last page, there was no doubt in my mind that this was a hilariously deadpan satire.

    I have read many many books on OOP
    and I have seen exactly one that was any good:
    Damian's.

    All the others are just kookoo.

  2. geektron says:

    it does seem to be obviously baked examples to 'prove the point', but their end result, though obviously java-centric, doesn't seem all that bad.

    with a grain or two of salt it's not that bad of an article. am i missing something obvious?

    • hafnir says:

      I think the criticism is "how come the 'right' solution is completely cryptic (at least at first glance, or at the very least in comparison to the simplest solution)?!" Similar to say the acronym joke Robin Williams makes in "Good Morning Vietnam", in that if you know what all the acronyms mean his statement makes perfect sense, but otherwise it sounds like a random letter generator. There's humor in it man, I don't get that stuff. :)

      PS: Say hi to Texas for me. :)

      • geektron says:

        since i have to deal with the unnecessary complexity every day i don't see the humour in it.

        and i agree with some of the other posters about the 'extra classes', but i just lump that intuitively into my 'i hate java' rants.

        texas didn't say much back. sorry man.

  3. Really, the filename says it all.

  4. transiit says:

    It hurts my brain.

    Maybe I've not done enough "real" programming (that hacker solution looked mighty reasonable), but their other examples are awful.

    Yes, I know they are examples. They still blew it, in my impression.

    Procedural solution:
    Yeah. Call a function every time you need a string variable. The overhead won't matter.

    Naive Object Oriented solution:
    Take the overhead of the function calls the procedural solution made, and add the memory overhead of a bunch of brainless classes to wrap them.

    Sophisticated Object Oriented solution:
    Same as all the other problems. I think this sums it up: System.out.println(OSDiscriminator.getBoxSpecifier().getStatement())
    Three lookups, four functions. Just to print a freaking string. Plus shameless pandering to Mac fans.

    It's bullshit. We went from a few lines of easy logic to an infrastructure to register variables (which were clearly already contained in osName), and they call it better.

    I suspect that OOP is good iff you believe SLOC is a meaningful yardstick.
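    For reference, a hypothetical reconstruction of what that chain has to look like -- the three names come straight from the quoted call, but the bodies, registry keys, and helper classes are all guesses for illustration:

```java
import java.util.Hashtable;

// Hypothetical sketch of the "sophisticated" chain; only the names
// OSDiscriminator, getBoxSpecifier, and getStatement come from the
// quoted call -- everything else is guessed.
interface BoxSpecifier {
    String getStatement();
}

class UnixBox implements BoxSpecifier {
    public String getStatement() { return "This is a UNIX box and therefore good."; }
}

class WindowsBox implements BoxSpecifier {
    public String getStatement() { return "This is a Windows box and therefore bad."; }
}

class DefaultBox implements BoxSpecifier {
    public String getStatement() { return "This is not a box."; }
}

public class OSDiscriminator {
    // lookup #1: the registry itself
    private static final Hashtable registry = new Hashtable();
    static {
        registry.put("SunOS", new UnixBox());
        registry.put("Linux", new UnixBox());
        registry.put("Windows NT", new WindowsBox());
    }

    // lookup #2: the system property; lookup #3: the registry entry
    public static BoxSpecifier getBoxSpecifier() {
        return getBoxSpecifier(System.getProperty("os.name"));
    }

    static BoxSpecifier getBoxSpecifier(String osName) {
        BoxSpecifier spec = (BoxSpecifier) registry.get(osName);
        return (spec == null) ? new DefaultBox() : spec;
    }

    public static void main(String[] args) {
        // call #4: the virtual getStatement() at the end of the chain
        System.out.println(OSDiscriminator.getBoxSpecifier().getStatement());
    }
}
```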

    -transiit

    • hafnir says:

      I think it's much more helpful when you're trying to solve complicated, involved real-world problems. In practice, if you're just writing a string I think most people wouldn't use anything more complicated than the procedural solution (there are always exceptions, etc). But when you know you're going to be dealing with crazy dynamic systems with tons of cases that are going to grow in somewhat arbitrary ways, you start to think about starting out with a generalized approach. When you're in between, I've seen it's common practice to try the simplest approach and get more complicated when the simple approach starts getting unwieldy. On my current project, there are a few areas where I wish I'd just started with the abstract solution instead of trying to massage it in later.

    • quercus says:

      The overhead doesn't matter. What does this thing do? How often are you going to call it?

      I have a buttload of code, and I care about the performance of it. And the piece where overhead really matters is about 3% of the total. For the rest of it, I'm driven by maintainability, re-use on my next contract, flexibility and all sorts of non-execution performance stuff like that.

  5. Mr. Bunny's Guide to ActiveX by Carlton Egremont II
    "The computer book for the inner child"

    'ActiveX has been around for literally dozens of months, but until now it has remained an inscrutable mystery to all but the most overpaid contract engineers, three Sufi priests, and Don from Nevada. Finally, here is a book that dumbs down the topic so it is understandable even to a piece of shoe leather.'

    I'm pretty sure that somewhere in this book is actual useful programming knowledge....

    The scary thing is that it does have an ISBN: 0-9661296-0-1

    • dagbrown says:

      Aww, that sucks. I only have "Mr. Bunny's Big Cup O' Java." Only Java book I've ever read. And that was plenty for me!

      And as far as I can tell, "design patterns" are a spectacular way of pointing out just how awful Java is, inasmuch as it takes dozens and dozens of lines of code to do things that would be simple in more advanced languages like C.

    • hafnir says:

      That book came highly recommended at the Guerilla COM class I took 4 or so years ago, taught by a bunch of COM gurus or something like that, and they gave it away to some people there. Can't remember if I got a copy - I never really got past Essential COM, though.

  6. xenogram says:

    I bet they're right into object factories. I love object factories, all those anonymous objects. Used for everything, you'll never know what you're doing. Just like using variable variable names instead of arrays, only with the power of OOP!

  7. recursive says:

    I think I need to put an extra hour tomorrow into Project Wrap My Head Around Forth, just to counteract the psychic trauma that quote has caused me.

  8. jwz says:

    Mal points out the striking similarity to Hello World.

    • hafnir says:

      Oh no, I guess I'm a master programmer - I looked at that COM code and was like, "ok, makes sense, right." I'm only somewhat joking. :(

      • exiledbear says:

        Yeah, I know exactly what you mean. Sort of the same feeling I get when I listen to Tom Lehrer's _New Math_ song, and I wonder why it's so funny.

        Especially when he does it in base 8.

    • ronbar says:

      I'm definitely Apprentice Hacker, except my scripts only contain the part under "else". With time, they become more complicated in stupid, unmaintainable ways.

  9. hafnir says:

    I'd never read this article before, but even so, every time I write a switch statement I briefly check whether I should use polymorphism instead. Most of the time I just write the switch statement, though. :) On the other hand, I don't need switch statements very often.
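    That check, sketched both ways on a toy decision (all names here are hypothetical, not anything from the article):

```java
// The same decision written both ways.

// As a switch: one function, every case visible in one place.
class SwitchShipping {
    static double cost(String carrier, double kg) {
        switch (carrier) {
            case "ground": return 1.0 * kg;
            case "air":    return 4.5 * kg;
            default:       throw new IllegalArgumentException(carrier);
        }
    }
}

// As polymorphism: one class per case; adding a case means adding
// a class instead of editing an existing function.
interface Carrier {
    double cost(double kg);
}

class Ground implements Carrier {
    public double cost(double kg) { return 1.0 * kg; }
}

class Air implements Carrier {
    public double cost(double kg) { return 4.5 * kg; }
}

public class SwitchVsPoly {
    public static void main(String[] args) {
        System.out.println(SwitchShipping.cost("air", 2.0));  // 9.0
        System.out.println(new Air().cost(2.0));              // 9.0
    }
}
```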

  10. confuseme says:

    Jesus H. The Christ. It's like that evolution of a programmer joke, only, you know. Sort of looks not jokey.

    Speaking of insanity, have you seen this rant about RSS?

  11. xach says:

    Now, if he had written "Every lengthy etypecase is a lost opportunity for a generic function..."

  12. bitpuddle says:

    s/Object Oriented Programming/Obfuscation and Job Security/

  13. naturalborn says:

    That essay wouldn't be nearly so funny if it were trying to be funny.

  14. hafnir says:

    Incidentally I knew there were a few things bothering me about the so-called "sophisticated solution".

    1) He has the master class register the worker classes. Therefore if you add a new worker class, the master class code has to be modified. The worker classes should register themselves with the master class through a public member.

    2) The worker classes should be in separate executables from the master class. Therefore you can add a new worker class without having to recompile the master class. In fact, you can add it while the program is running! Something like COM can take care of this behind the scenes.

    What a bunch of beginners! ;)

    On the other hand, this is basically just turning into the master solution from the other thing you linked...you should forward that to the author.... :)

    (I should mention that at least for this example if I really had to do it I'd just do the dumb implementation and be done with it!)
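    A sketch of point 1, with hypothetical names. One catch: in Java a worker's static initializer only runs when its class is actually loaded, so something still has to force the load (here, Class.forName):

```java
import java.util.Hashtable;

// Hypothetical sketch: each worker registers itself with the master,
// so the master's source never has to list the workers.
class Master {
    private static final Hashtable workers = new Hashtable();

    // Public registration hook the workers call.
    public static void register(String key, Object worker) {
        workers.put(key, worker);
    }

    public static Object lookup(String key) {
        return workers.get(key);
    }
}

class LinuxWorker {
    static {
        Master.register("Linux", new LinuxWorker());
    }
    public String toString() { return "LinuxWorker"; }
}

public class SelfRegistration {
    public static void main(String[] args) throws Exception {
        // Nothing above references LinuxWorker, so its static block has
        // not run yet -- loading must be forced explicitly.
        Class.forName("LinuxWorker");
        System.out.println(Master.lookup("Linux"));
    }
}
```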

    • darius says:

      Your suggestion won't work because Java classes are loaded dynamically. Since there are no references to the worker classes in the main one, they won't get loaded before it terminates. Which is kind of a pity since your approach would actually have some point to it -- these jokers have replaced one way of writing it that makes you edit the central class when there's a change with another, much more complicated, way of writing it that makes you edit the central class when there's a change.

      • hafnir says:

        You caught me, I'm a C++ coder, not a Java coder. I understand Java automagically takes care of a lot of the BS us C++ coders have to deal with, but I don't know a lot about it.

  15. rantzilla says:

    You know, I'll be honest and say I have not read the GoF design patterns book; but I have done enough procedural and OOP programming, mostly debugging and maintenance, to say with certainty that no matter how fucking clever you think you are, you are still a human being and therefore completely and utterly flawed. In fact, the more clever you think you are, the more flawed your efforts will be. Ever hear of hubris? The ancient Greeks didn't write plays about it just because it made for a nice story.

    I have worked with the results of such "sophisticated" solutions and that is just what they are: sophisticated, as in the opposite of simple and elegant. I don't think I have ever heard anyone say "By golly, that program is great! Its design is sophisticated AND elegant!" No, I am pretty sure the two are mutually exclusive.

    JWZ, I have to agree with you. I am gonna believe the article is a satire because it cannot be otherwise in a sane world. Did I say "sane" world? I'm sorry, I forgot. My bad.

    Let me provide an example. I have recently been working with a "sophisticated" application that makes use of these "design patterns". Oh yes, this gem has it all: factory methods, singleton instances, dynamic polymorphism up the yin-yang. I tried reading one of the developer manuals for this application, because it is not just an application, no, it is a whole *framework* for developing and extending applications just like it. The mantra is put forth: "always code to an interface and never to an implementation".

    Yeah, that all sounds nice and peachy. But when you have to debug that code, I think you will find yourself in a world of shit. I have spent I don't know how much time just trying to trace through this code to see how shit gets done, because I am tracing all manner of factory and class method calls passing around what? Objects cast as interfaces in the method signatures. Yeah, using the debugger can help, along with good old fashioned println(); but if you just want to *look* at the code and understand it in a timely fashion, no fucking way.

    UNLESS, you are one of the primary designers and you have the whole thing in your head anyway.

    And therein lies the problem: cocky, clever, "they call me Mr. Knowitall", "Sophisticated Object Oriented + Patterns Solution" fucking programmers.

    Is their code more maintainable? Maybe, maybe not. Chances are, the guys who wrote it aren't gonna be maintaining it. And the guys who wrote it will have written the documentation, if any, in the same way they wrote the program: sophisticated.

    People say I tend to over-simplify things. Well, yeah. Because MY mantra is KISS:

    Keep It Simple, Stupid

    Maybe I am wrong. Maybe simple is too simple and not maintainable either. That means that no code is truly maintainable. It's all flawed. It's all a loss. In the next 10-20 years we are gonna reach an upper limit to how much "sophistication" all these intertwined systems can deal with and there will be a total cataclysmic failure. Boom.

    So I leave you with this to ponder:

    "You know, when you have a program that does something really cool, and you wrote it from scratch, and it took a significant part of your life, you grow fond of it. When it's finished, it feels like some kind of amorphous sculpture that you've created. It has an abstract shape in your head that's completely independent of its actual purpose. Elegant, simple, beautiful.

    Then, only a year later, after making dozens of pragmatic alterations to suit the people who use it, not only has your Venus-de-Milo lost both arms, she also has a giraffe's head sticking out of her chest and a cherubic penis that squirts colored water into a plastic bucket. The romance has become so painful that each day you struggle with an overwhelming urge to smash the fucking thing to pieces with a hammer."

    - Nick Foster ("Life as a programmer")

    • quercus says:

      I don't think I have ever heard anyone say "By golly, that program is great! Its design is sophisticated AND elegant!"

      I once had an ex-client phone me up and say that - a year after I'd left. For the last year they'd happily been modding my code for small stuff, but now they had to make A Really Big Change and they wanted to make sure that it was as easy as they thought it would be.

      Yes, I was rather pleased by that 8-)

  16. tfofurn says:

    By gum, some of us have to program embedded systems, and we're happy to have a frickin' C compiler! Pedantic overimplementors! But then, that was your point.

  17. stonemonkey says:

    I have already drunk the kool-aid on OO, polymorphism, and design patterns. However, what are these guys smoking? What a horrible advocacy paper. Take the procedural "hacker" solution. What person in their right mind uses if statements for a DATA-DRIVEN problem? Something like the following would have been a better starting point:


    import java.util.Hashtable;

    public class PrintOS {
        public static void main(String[] args) {
            Hashtable table = new Hashtable() {{
                put("SunOS", "This is a UNIX box and therefore good.");
                put("Linux", "This is a UNIX box and therefore good.");
                put("Windows NT", "This is a Windows box and therefore bad.");
                put("Windows 95", "This is a Windows box and therefore bad.");
            }};

            String lookup = (String) table.get(System.getProperty("os.name"));

            if (lookup == null) {
                System.out.println("This is not a box");
            } else {
                System.out.println(lookup);
            }
        }
    }

    Replacing a Hashtable with a Properties object would have moved the data out of the code altogether (which seems like it solves a larger problem in their fake problem). Though I am moving beyond the point of their article....It is fairly clear that their example sucks because it does not need polymorphism to be solved maintainably or elegantly.
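    The Properties variant, sketched. The "os.properties" file name is hypothetical; a StringReader stands in for the file here so the sketch runs on its own (note that a literal space in a properties key, as in "Windows NT", must be escaped):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class PrintOSFromProperties {
    static String message(Properties table, String osName) {
        // getProperty's default value replaces the explicit null check
        return table.getProperty(osName, "This is not a box");
    }

    public static void main(String[] args) throws IOException {
        Properties table = new Properties();
        // In real use this would be table.load(new FileInputStream("os.properties")),
        // with the messages living in that (hypothetical) file rather than the code.
        table.load(new StringReader(
            "SunOS=This is a UNIX box and therefore good.\n" +
            "Linux=This is a UNIX box and therefore good.\n" +
            "Windows\\ NT=This is a Windows box and therefore bad.\n"));
        System.out.println(message(table, System.getProperty("os.name")));
    }
}
```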

    So why write an article on polymorphism and use a problem that does not illustrate any benefits of polymorphism? And why show design patterns when there are NO forces that require them? (I found it extra weird that the container would be responsible for registering its containees -- talk about an inverted responsibility.)

    I smell a conspiracy theory in here somewhere....

    Academics in CSci with a supposed OO background could not possibly write something this bad without an ulterior motive....Are they really prolog or functional guys trying to discredit OO? Psychologists trying to study scathing reviews? Just a joke?

    • I've noticed a trend for a long time in these "advocacy" articles -- they all seem sophomoric to me. Like the guy who wrote this probably got "religion" last year and has been busily applying his new hammer to every screw he sees.

      You're absolutely right that it's a stupid example. The problem is, real world situations where polymorphism is truly a major win are neither simple, nor easy to write about in a trivial article like that one.

      People with the necessary experience to do justice to the topic are also often too overworked/burned out/cynical (and/or drinking heavily to forget that they are overworked/burned out/cynical) to give a shit.

    • exiledbear says:

      And if it's null or "", reset it to "unknown". Then add another entry in your table, and shorten that if statement to just print().
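      That tweak, applied as a sketch to the Hashtable version (names hypothetical): one "unknown" entry in the table, one normalizing if, and the print becomes unconditional.

```java
import java.util.Hashtable;

public class PrintOSNoIf {
    static final Hashtable table = new Hashtable();
    static {
        table.put("SunOS", "This is a UNIX box and therefore good.");
        table.put("Linux", "This is a UNIX box and therefore good.");
        table.put("unknown", "This is not a box");  // the extra entry
    }

    static String message(String os) {
        // the one remaining if: normalize anything unrecognized to "unknown"
        if (os == null || !table.containsKey(os)) os = "unknown";
        return (String) table.get(os);
    }

    public static void main(String[] args) {
        System.out.println(message(System.getProperty("os.name")));
    }
}
```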

      But then people might understand what you're doing, and that wouldn't impress anyone. God forbid that ever happening.

      Nah, I think what you're seeing is a guy who is a true believer in Patterns and all that shit. The fact that he's also using Java should ring some warning bells too.

      When the Chaos comes, this guy is just going to be so much Meat for the Monster.