OpenGL questions

I was again somewhat-halfheartedly trying to wrap my brain around the slow-motion clusterfuck that is the OpenGL specification, before (again) throwing my hands up in disgust.

You'd think that somewhere, there would be a table: a grid of checkmarks, where one axis is "versions of the OpenGL spec", and the other axis is "features that are supported in that version". If this exists, I haven't found it. Have you?

The problem I am hoping to solve is this: I want to write an OpenGL program, and have it able to compile and run on both a 5- or 10-year-old Linux system, as well as on an iPhone. I need to know what APIs are available to me. It's possible that the intersection of the sets of APIs that run on each of those platforms is in fact the empty set, but I can't even tell.

I'd like to modernize the code in xscreensaver to make it easier to port to newer systems, but not if it means each file has to have 3 ifdefs with 3 completely disjoint implementations (meaning all bugs have to be fixed in 3 different places).

Converting all the glBegin and glNewList stuff to glVertexPointer would be a pain in the ass, but it might be possible to do that with preprocessor macros, or otherwise mechanically. But then I click around and find out that random shit like glInterleavedArrays doesn't exist in OpenGLES. Or does it? The fact that I am finding it nontrivial to answer this question is crazy.

And that's even before dealing with the flying circus they've transformed the lighting model into. "Oh, you just wanted to place a light in the scene, and specify your object's color and specular properties? Well instead of that, wouldn't it be easier to learn a new language in which you can write hundreds of lines of code for a SIMD vector supercomputer? ...No?"

Given the proliferation of incompatible specifications, you'd think that the man pages for these functions would indicate which versions of the spec they conform to, like all the POSIX man pages do, but no, because the OpenGL specifications appear to be curated by drooling pinheads. A wiki? Seriously?


43 Responses:

  1. gen_witt says:

    Stick to OpenGL-ES 1.0 and you should be fine; it's basically the parts of OpenGL 1.3 you're supposed to use. You're not supposed to use glInterleavedArrays; instead use multiple calls to glVertexPointer et al. with big strides.

    • jwz says:

      So, you believe that an OpenGL ES 1.0 program will compile and run under, say, a 2001-vintage Mesa3D?

      • gen_witt says:

        OpenGL 1.3 was released in 2001, which is close enough to OpenGL-ES 1.0 if you don't use the fixed-point stuff. Chances are it will work with earlier stuff, but you'll be in extension hell. There is also the GLX/EGL/AGL difference, but I think you've already abstracted that out. And come to think of it, EnableClient isn't in GL-ES, so you need to do a little voodoo there (#define glEnableClient glEnable might be enough; I'm not 100% sure).

        GL sucks, no question.

      • mcc_baka says:

        My understanding is the answer's yes. OpenGL ES is intended to be a subset of OpenGL.

    • gen_witt says:

      Also some resources,

      the OpenGL-ES 1.1 man pages (a marginal web page).

      GLEW: fish around in the sources for a clean list of the differences between big OpenGL versions. In particular, look at files named GL_VERSION_1_1, etc.

      the OpenGL-ES 1.1 difference spec: if you're very familiar with OpenGL 1.5, this is a big list of differences.

      I'm sorry this is all 1.1 stuff; the 1.0 specs are no longer on the front page, but they're in the Khronos members area. (I can't put them up; maybe you can find them somewhere, though I couldn't in a cursory search.)

      • jwz says:

        What is the relationship between "OpenGL N" and "OpenGL ES M"? E.g., how do I answer the question "does function X exist in N but not M?" Are OpenGL 4.0 and OpenGL ES 2.0 the same thing?

        • gen_witt says:

          OpenGL-ES is defined as a diff against OpenGL: OpenGL-ES 1.0 -> OpenGL 1.3, OpenGL-ES 1.1 -> OpenGL 1.5, OpenGL-ES 2.0 -> OpenGL 2.0. But of course they made both additions and subtractions. The most straightforward way to answer those questions is to consult the difference specs; however, they're about 100 pages each. OpenGL 4.2 added the last few things in OpenGL-ES 2.0 that weren't in OpenGL (i.e. OpenGL-ES 2.0 is a subset of OpenGL 4.2), but that's not likely to help you for another 10+ years.

          Maybe some history will make the differences more palatable. Several companies independently took OpenGL and slimmed it down to run on their embedded platforms; unfortunately, they also added stuff they thought would be useful (especially on processors without floating-point units). Then they all got together and "standardized" OpenGL-ES, which means it became worse than all of the fragmented "little" GLs. One of the guiding principles was to remove features that either have higher-performance alternatives or that frequently have to be emulated in software (picking comes to mind). Unfortunately, from there OpenGL-ES went through some divergent evolution. OpenGL-ES 2.0 is basically the same thing applied to OpenGL 2.0, minus fixed function, plus a bunch of very core extensions.

          It's not prettier on the other side of the fence; every version of DX is a completely different API. DX10 is better than OpenGL, but it took Microsoft ten tries.

          • jwz says:

            Wow, and I thought the history of RSS was ridiculous...

          • jwz says:

            I managed to generate this: No GLES stuff in there because I can't find the list. (This is of dubious utility so far, because I don't think "does this function exist in OpenGL 3.3" is very close to the question I am trying to answer, but that's the version for which man pages exist.)

            I assume all the GLU stuff is just gone?

            • gen_witt says:

              Nice. Yeah, there's a bunch of differences that are difficult to convey in a simple list. For example, TexImage2D may take only power-of-two sizes, non-power-of-two sizes if and only if level == 0 (OpenGL-ES 2.0), or any size at all.

              Should be able to get a list of enumerants somewhere, that will be most of the rest of the differences.

              GLU should be implementable as a standalone library on top of GL. You might try ripping the one out of Mesa. A web search comes up with a couple of projects that have done just that, but I've never used any of them.

            • mcc_baka says:

              The GLU stuff is all just gone; what I do when I need a GLU function is just copy the code for that function out of the open source release of Mesa.

              (On the more general "how do I tell if this function exists in OES" question see my post below...)

            • gen_witt says:

              I pulled the ES functions from the headers. Merging should be a simple matter of Perl.

      • edouardp says:

        Is GLEW in general what Jamie needs to discover what is supported at compile time on the platform he's compiling against? One of my programmers at my previous job used GLEW to manage the insanity of OpenGL versions and extensions across Nvidia, ATI and Intel cards on Mac OS and Windows... Or does it not work on OpenGL ES?

        From experience all this is manageable if you are willing to put in the time in a commercial or larger open source project, but it just puts up a huge barrier to anyone who cannot put a full time position onto it. That seems like a bit of a barrier for bringing new developers into the fold...

        • jwz says:

          xscreensaver has almost no extension use, and it doesn't look to me like GLEW abstracts out stuff like glVertex. Looks like GLEW is mostly shorthand for checking glGetString(GL_EXTENSIONS) all over the place?

          • gen_witt says:

            GLEW's main function is to deal with the total nightmare that is linking against OpenGL. Did you know that sometimes you use dlopen to get functions, but sometimes you use glXGetProcAddress? Did you know that linking against specific versions of the NVIDIA driver breaks dlopen?

        • gen_witt says:

          GLEW is OpenGL only. It mainly does function import and extension detection at runtime. I'm not sure what it can do at compile time.

          What it does have is an amazing set of scripts to download and parse the OpenGL extensions, most of the code is autogenerated.

      • ahruman says:

        All OpenGL specs since 1.1 are public here. All ES specs are here. My usual approach to finding minimum versions is to manually binary search the PDFs. Oh, what fun.

  2. h4lcyon says:

    It's "Nope, no easy way", pretty much.

    The iPhone does OpenGL ES 2.0, which doesn't support the fixed-function pipeline at all: it's shaders and vertex buffer objects all the way. (Which also means: no arrays anymore, just VBOs. You can do interleaved with those, you just have to call things in a different way again; look for glVertexAttribPointer.) OpenGL only supports shaders properly from OpenGL 2.0 upwards, which is ca. 2005, so if you want to have something that works all the way through, it is "multiple versions and ifdefs" for you.

    If you want to know something about some specific version of OpenGL, the quick reference cards are usually okay. This is the one for ES 2.0.

    (The new version isn't that bad, really, it just takes a lot of getting used to, and there really aren't any good manuals or tutorials, except maybe one or two.)

  3. I know it's not what you want to hear, but the game industry is moving to a model where you use a framework one level up from OpenGL, so the middleware developer deals with this problem for you.

  4. crandall1 says:

    To the best of my knowledge, the following things are true:

    1. The original (EDGE) iPhone and the iPhone 3G use OpenGL ES 1.0; thus, you have to use glVertexPointer only, and no shaders.

    2. iPhone 3GS and iPhone 4 use OpenGL ES 2.0; thus, you can use shaders.

    3. It's a fucking pain in the ass to convert glBegin()/glEnd() to glVertexPointer, as you've discovered, but a fairly simple conversion class can be made so you only have to change the glShit() to, say, myGLshit().

    The only solution I'm aware of is to use middleware, as the previous poster mentioned, such as OpenFrameworks, LibNUI, or the like, for iShit, which will allow you to use your older OpenGL jank with the new iShit. This carries its own raft of problems, of course.

    Either way, a fucking pain in the ass. Nothin' to it but to do it, really.

    • krick says:

      Most iPhone owners obsessively upgrade to the latest hardware anyway, so you're probably safe just targeting the latest and greatest.

      • jwz says:

        Yes, it's probably safe to ignore versions of the spec that only ran on embedded hardware that is no longer manufactured. That stuff doesn't last. However, it's much harder to ignore old versions of the spec that ran on desktops, because those tend to not go away for around a decade.

    • ahruman says:

      ES 2 doesn't say you can use shaders, it says you must use shaders. Fortunately, though, you can still get an ES 1 context on ES 2-capable iPhones, which is the reasonable way to go if you want parity with desktop 1.x.

  5. mcc_baka says:

    Hi, a few things:

    - glInterleavedArrays doesn't exist in es BUT you don't need it, because you can just call glVertexPointer, glNormalPointer, glColorPointer etc one after the other and set stride as appropriate. A good rule of thumb is that if there are two ways to do a thing in OpenGL, then OpenGL ES removed one of them, and the one they left behind is always the one which is simpler for hardware implementors to implement. This usually means that if you could describe it using the term "convenience function", OpenGL ES removed it.

    - Most of the rest of your questions aren't possible to answer unless you specify whether you're using OpenGL ES 1.x or OpenGL ES 2.x. Annoyingly, the two aren't compatible-- ES basically says You Will Use Shaders Or You Will Use The Fixed Function Pipeline But Never Both While A Single Program Is Running. (Note, OpenGL 3.0 makes this same demand in its not-ES form also, they're trying to phase the fixed function stuff out.) Your complaint about having to write a very long SIMD supercomputer program instead of just being able to use a lighting model makes it sound like you're using OpenGL ES 2.x. If you are trying to update older OpenGL code to run on ES systems, however, you'd almost certainly rather just use ES 1.x. If you're not using shaders I'm pretty sure there is basically no downside to going with 1.x-- I am not aware of any systems in the wild which don't support ES 1.x, whereas there are systems that don't support ES 2.x (although on some systems, like the iPad, 1.x can be slower). I think the lighting model in ES 1.x is basically like that of Ordinary OpenGL 1.x.

    - The thing you said about not being able to figure out what the specification is or what's supported or what-- the trick to this is, ignore the Wiki, look at this page AND ONLY THIS PAGE. This is the authoritative source for ES and also basically the only source I'm aware of that will not just wind up misleading you if you're an ES user. If you're using ES 1.x, look at the "Online Manual Pages" link, the "Extension Pack Specification" (several functions you'd think of as critical are kept in the extensions), and if you have to the spec. For ES 2.x there is also a very nice "quick reference card" PDF that simply lists which functions and shader-language thingies are supported. In either case if something's not in the man page list at that URL for that version of ES, it's not in that version of ES.

    - This aside, if you're unfortunate enough to be using OpenGL ES 2.x, my experience with that is that you're going to basically be miserable until you give in and buy this book (written by members of the ES standards board-- apparently writing books to sell instead of releasing proper documentation is something of an OpenGL tradition?).

    Does any of this help? Now that I'm about to post I see that gen_witt has said some of this already...

    • mcc_baka says:

      This was originally part of the above post, but it's less important and also apparently LiveJournal has a per-post character limit? Meh.

      - I said that ES 1.x and 2.x "aren't compatible". Note, I'm pretty sure this doesn't mean you need ifdefs. I'm not 100% certain on this, but at least on the systems I've tested (iOS), you can freely include headers for both ES1 and ES2 in a single program, and what will happen is that if you call a function for a version of ES that is not loaded the program will just crash. So when I had to write a program that could be either ES 2.0 or ES 1.0 depending on what system you're on I wound up writing a bunch of functions like:

      void jcColor4f(GLfloat r, GLfloat g, GLfloat b, GLfloat a) {
          if (es2) { glVertexAttrib4f(p->s_color, r, g, b, a); }
          else { glColor4f(r, g, b, a); }
      }

      Where "es2" is a previously set flag recording whether I was successful in loading ES 2.x at startup (and "p" points to a structure holding the shader attribute ids for the currently loaded shader). Once you have this set up you basically just have to write some shaders re-implementing the 1.x fixed function pipeline in GLSL (I think the people who wrote the Orange Book made some of these, though I'm not sure what the licensing is). If self-plugging is okay, I'm planning to release my compatibility-aid code (and in general some sample code for coding OpenGL Ordinary, OpenGL ES 1.x and OpenGL ES 2.x with a single codebase) when I post the new version of the sample code package based on my game, but I'm not sure when I'll have that up and that won't include any lighting code...

    • jwz says:

      So wait, you seem to be saying that I get to pick whether I want to use ES1 or ES2. This seems... contrary to the apparent philosophy of the people currently driving the OpenGL specs. I assumed that the mere existence of ES2 meant that ES1 was already "considered obsolete" and that writing new code aimed at ES1 was foolhardy. How could this not be the case?

      Sadly, I already bought and read that book, on someone else's earlier advice.

      I think the last time I felt so much like throwing a book against the wall, it was written by Anne Rice. It is so amazingly badly written! Not only was it poorly structured, full of incomprehensible conceptual forward-references that made no sense until you've read the whole thing twice, but it was even amazingly poorly designed -- shit like, the copy talks about the "red square" and the "blue square" when the GRAYSCALE ILLUSTRATION shows TWO GRAY SQUARES. That kind of inexcusable crap was all over the place. Gaaaaaah.

      • mcc_baka says:

        "assumed that the mere existence of ES2 meant that ES1 was already "considered obsolete" and that writing new code aimed at ES1 was foolhardy. How could this not be the case?"

        Yeah... while this is the way every other API in the world works, I don't think this is a useful way to think about OpenGL ES. You'll probably have better luck if you think of it like, OES 1.x and 2.x are just two fairly different APIs for a mobile GPU, 1.x is an API for a fixed-function GPU similar to old OpenGL, 2.x is an API for a shader-based GPU. A device might support one or the other or both, and although it's a sound assumption devices in the future are going to be supporting 2.x I wouldn't be *totally* surprised if we continued seeing new 1.x-only devices occasionally popping up for some time in the future just because some vendor decided they weren't interested in shaders.

        Basically it seems like a lot of things about OpenGL ES make more sense if you assume the people who designed it did so thinking at all times about the needs of the people who design mobile GPUs, and not at all about programmers...

        Anyway on iPhone/iPad, you actually do have a choice of ES 1.x or ES 2.x-- when you create a drawing surface you specify which one you want, if you ask for a 2.x surface on a pre-3GS phone this will fail, if you ask for a 1.x surface you will always get it. I unfortunately do not have experience with Android devices, but I understand most Android phones are using the same PowerVR GPUs Apple is so I would expect they use the same setup with both 1.x and 2.x available as options. (Oh, and code targeting either ES 1.x or 2.x, if you dump it on a desktop with Ordinary OpenGL 2.0 headers and try to compile the code as if it were ordinary OpenGL, should compile and work just fine-- both ES 1.x and 2.x are subsets of ordinary-OpenGL 2.0, they just happen to be different subsets.)

        I think it's actually probably safer, if you're only going to include a code path for one of the two, to support ES 1.x than 2.x. The people who designed ES clearly were envisioning a future where someday an ES 2.x-only device could exist, but I'm not aware of any such device existing in the wild right now, and since both Apple and Android have guarantees of 1.x availability in their APIs (and it's probably not that difficult for an OS vendor to write a software compatibility layer where a GPU doesn't support 1.x natively but the OS is faking it using 1.x-like shaders-- the iPhone may be doing this already for all I know) 1.x support is probably going to be around basically forever.

        • jwz says:

            Basically it seems like a lot of things about OpenGL ES make more sense if you assume the people who designed it did so thinking at all times about the needs of the people who design mobile GPUs, and not at all about programmers...

          I had definitely, definitely picked up on that. Yes.

            1.x support is probably going to be around basically forever.

          Ok... but I'm still reeling from the notion of something called "GL" that doesn't have Vertex3f in it, so my trust in the word "forever" is somewhat shaken!

          • gen_witt says:

            OpenGL-ES 1.1 is almost always implemented as a wrapper on top of OpenGL-ES 2.0, which makes it roughly free to implement on new hardware. So you should be good until at least ES 3 (I wouldn't bank on forever in the cell phone space).

  6. bifrosty2k says:

    Wow, I am amazed that people even know about OpenGL in-depth enough to even talk about it to this level.

    Frankly, I'm amazed that programs built for Linux systems even a year old work currently given the state of how screwed a lot of distributions are. I had to build a "new" system using a 2 year old install CD (for compliance reasons), 90% of the stuff didn't work with anything relatively recent.

    • giantlaser says:

      Are you referring to the problem of running older distributions on newer hardware? That's nothing to do with distributions. That's a problem of drivers not existing in older kernels. The problem there is the ever-changing kernel APIs and the monolithic kernel design, which makes third parties not even consider writing drivers. If you could write a driver for 2.6.(pick any) and have it still work on 2.6.(pick another), it wouldn't be this bad.

      • bifrosty2k says:

        Nah, wish it was that simple; this is even installing old stuff on old hardware. I've spent a lot of time building stuff by hand. C'est la vie; this is the nature of Linux.

      • fnivramd says:

        Two words for you. Class drivers.

        Vendor drivers are written by the EE nobody wanted working on the hardware, in the same way that flat pack instructions are written by the guy most easily spared from the production line. So you do not want to run vendor drivers. If you buy devices that implement a published device class, you can run drivers written by an actual programmer, and because next month's devices are the same class the driver won't be abandoned as soon as the next batch of hardware gets put on a ship from China.

        This isn't a Linux thing. Microsoft's logo program _requires_ that whole swathes of devices be class compliant or you can't put a logo on them. Vendor drivers suck, and everybody wants rid of them.

  7. baconmonkey says:

    How many angels can dance on the head of a pin?

    • jwz says:

      You have to write a custom shader for that, and the specification only guarantees between 4 and 8, though 65535 is common.

    • lionsphil says:

      Whoa whoa whoa, that's the nasty old, obsolete way of talking about it—it's all much better now. Being able to simply use a pin was strictly redundant, so it was removed; now you need to write a small program, compiled at runtime, which defines a sphere on a cylinder tapering to a point, and pass the context for this into your dancing routine. Getting each angel-dancing program to include its own minor variation of pin code is clearly the most efficient and sensible thing to do on limited platforms, rather than having common functionality be part of a core library where it can be jointly maintained and shared on disk and in memory.

      • Right, but if you optimize the dancing routine for the specific shape of the pinhead you can usually get a few more angels on it, and that's the most important thing.

      • jabberwokky says:

        I have been assured that I can get infinite angels on the head of my simulated pin. I have my Turing machine right here, but I can't seem to find that box of infinite tape...

  8. ext_244831 says:

    Thank god. I was starting to half-believe that it was just me.