help me debug some X11+OpenGL stuff

Hey, if you have access to X11 running on non-x86 platforms, help me out by running a test program and telling me what it did.

    test-texture.c
    gcc -o test-texture test-texture.c -I/usr/X11R6/include -L/usr/X11R6/lib -lX11 -lGL -lGLU


Help me fill in the question marks:

                               8      16      24      32
    Linux x86:                ok      ok       ?      ok
    OSX PPC:                   ?       ?       ?      ok
    Sparc:                     ?       ?       ?       ?
    Linux x86 -> OSX PPC:      ?       ?       ?      ok
    Linux x86 -> Sparc:        ?       ?       ?       ?
    Sparc -> Linux x86:        ?       ?       ?       ?
    OSX PPC -> Linux x86:      ?       ?       ?     BAD

    Tell me what it printed to stdout, and whether the colors in the window were right. It should look like this: [screenshot of the expected window].

    "Linux x86 -> OSX PPC" means "program is running on Linux x86, with $DISPLAY pointing at an X server running on OSX PPC".

    "Depth 32" means 4 bytes per pixel.

    "Depth 24" means 3 bytes per pixel (this is less common.) If your video driver supports it at all, you might need to turn it on by adding DefaultFbBpp 24 and/or Option "Pixmap" "24" to xorg.conf or XF86Config.

    "Depth 8" means TrueColor, not PseudoColor / colormapped. Visual "TrueColor" will probably be needed.

What's the big idea?

    I've got this image data that came from the X server (via XGetImage), and it's in some arbitrary format. It might be in any bit order, byte order, and number of bits, depending on the whims of the architecture and video driver. I want to feed it to OpenGL for use as a texture.

    Currently, I do this by iterating over the X data and constructing 32-bit RGBA native-endian data to hand to glTexImage2D or gluBuild2DMipmaps. But, copying/reformatting all that data is slow. (Slow enough to matter, it turns out.) So instead, I'd like to just hand it to OpenGL directly, and say, "hey GL, this data is 16 bit BGR bigendian, deal."
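
    To make that concrete, here's a rough sketch of the direct handoff I mean (this is not the test program, and the format/type picks below are exactly the open question, so treat GL_BGRA and the 8_8_8_8 types as guesses):

        XImage *img = XGetImage (dpy, window, 0, 0, w, h, ~0L, ZPixmap);

        /* Describe X's pixel layout to GL instead of repacking the data. */
        GLenum format = GL_BGRA;                  /* guess: masks FF0000/FF00/FF */
        GLenum type   = (img->byte_order == LSBFirst
                         ? GL_UNSIGNED_INT_8_8_8_8_REV
                         : GL_UNSIGNED_INT_8_8_8_8);

        glPixelStorei (GL_UNPACK_ALIGNMENT, 4);   /* rows padded to bitmap_pad */
        glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                      format, type, img->data);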

    I'm trying to figure out how to express that to OpenGL. I'm having a hard time, because it's very poorly documented. Thus, this test program.

Update: Please try the new version here.


38 Responses:

  1. cvisors says:

    Damn, I went to compile this on my Alpha (compiles fine, btw) and then remembered that the video card doesn't work in X at the moment; we are still trying to sort out some bugs here and there (using a friend's port of Fedora Core 2).

    Sorry about that..

    Benjamin

  2. duskwuff says:

    Apple X11 on OS X.

    Root: 1024 x 768 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 00FF0000 0000FF00 000000FF

    Endian: Big
    Visual: 0x36 TrueColor
    Texture: BGRA UNSIGNED_INT_8_8_8_8
    • duskwuff says:

      And in 16-(well, 15-)bit mode:

      Root: 1024 x 768 x 15
      Image: 640 x 480 x 15
      format = ZPixmap
      bytes = MSBFirst
      bits = MSBFirst
      pad = 32
      bpl = 1280
      bpp = 16
      rgb = 00007C00 000003E0 0000001F

      Endian: Big
      Visual: 0x36 TrueColor
      Texture: RGB UNSIGNED_SHORT_5_6_5
    • jwz says:

      But I ran it on OSX X11 in 32 bit and it looked fine here! Same text output as yours.

      The Mac I tried it on gets "illegal instruction" from any GL program when X is started after switching to "thousands of colors", and X won't start at all in "256 colors".

      • amcmillan says:

        I get the same results as zetawoof here, with "X11 1.0 - XFree86 4.3.0" on "Mac OS X 10.3.7". Runs fine in millions and thousands, although the colors are messed up.

        In 256 colors X11 seems to hang, but going into X11 Preferences and setting "256 colors" under Output gets X11 to run. However, I just get this output from test-texture:

        Root: 1280 x 854 x 8
        Image: 640 x 480 x 8
        format = ZPixmap
        bytes = MSBFirst
        bits = MSBFirst
        pad = 32
        bpl = 640
        bpp = 8
        rgb = 00000000 00000000 00000000
        error: couldn't find a visual

      • phs says:

        I get the same yellow result as everyone else here (Powerbook w/Rage 128, Mac OS 10.3.7, XFree86 4.3.0).

        Changing line 434 to GL_UNSIGNED_INT_8_8_8_8_REV fixes it.

        stdout from working version:

        :; ./test-texture

        Root: 1152 x 768 x 24
        Image: 640 x 480 x 24
        format = ZPixmap
        bytes = MSBFirst
        bits = MSBFirst
        pad = 32
        bpl = 2560
        bpp = 32
        rgb = 00FF0000 0000FF00 000000FF

        Endian: Big
        Visual: 0x36 TrueColor
        Texture: BGRA UNSIGNED_INT_8_8_8_8_REV
    • partylemon says:

      I get EXACTLY the same on Mac OS X 10.3.7 (7S215) running Apple's X11 1.0 - XFree86 4.3.0 (always nice to include those) with my oh-so-powerful Rage 128.

    • hub_ says:

      LinuxPPC (client & server), ATI Rage 128 card, X.org

      Same colors as above.

      Root: 1024 x 768 x 24
      Image: 640 x 480 x 24
      format = ZPixmap
      bytes = MSBFirst
      bits = MSBFirst
      pad = 32
      bpl = 2560
      bpp = 32
      rgb = 00FF0000 0000FF00 000000FF

      Endian: Big
      Visual: 0x23 TrueColor
      Texture: BGRA UNSIGNED_INT_8_8_8_8

  3. cvisors says:

    Okay, it compiles fine under IRIX, with the following output:

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 000000FF 0000FF00 00FF0000

    Endian: Big
    Visual: 0x39 TrueColor
    Texture: BGRA UNSIGNED_INT_8_8_8_8

    Unfortunately the colours of the window are quite wrong:

    The white is yellow, the red text is black, the green text is blue.

    If need be I could take a screen shot for you.

    Benjamin

    • jwz says:

      Can you try other values for "type" and "format" (around line 434) until you find one that works? "man glDrawPixels" lists the possible values.

      • uon says:

        (Sorry, mispasted the first time.) Compiled on IRIX, displayed on Linux x86-64 at 32bpp (per xdpyinfo); it screwed up the colours (screenshot) and printed this:


        Root: 1280 x 1024 x 24
        Image: 640 x 480 x 24
        format = ZPixmap
        bytes = LSBFirst
        bits = LSBFirst
        pad = 32
        bpl = 2560
        bpp = 32
        rgb = 00FF0000 0000FF00 000000FF

        Endian: Big
        Visual: 0x21 TrueColor
        Texture: BGRA UNSIGNED_INT_8_8_8_8_REV

        Changed to use UNSIGNED_INT_8_8_8_8 and it displayed correctly (screenshot) and printed this:


        Root: 1280 x 1024 x 24
        Image: 640 x 480 x 24
        format = ZPixmap
        bytes = LSBFirst
        bits = LSBFirst
        pad = 32
        bpl = 2560
        bpp = 32
        rgb = 00FF0000 0000FF00 000000FF

        Endian: Big
        Visual: 0x21 TrueColor
        Texture: BGRA UNSIGNED_INT_8_8_8_8

      • cvisors says:

        jwz, sorry, I just saw your new post; I will run that on IRIX when I get home this evening. The wonders of time zones, unfortunately.

        B

  4. nothings says:

    Regarding documentation, maybe you already worked this out, but I believe it's:

    OpenGL 1.2 and later: all the packed pixel formats are supported (or it's a bug)
    OpenGL 1.1 and earlier: if the GL_EXT_packed_pixels extension is defined, then only the packed pixel formats that don't end in "_REV" are supported.
    OpenGL 1.1 and earlier, without EXT_packed_pixels: no packed pixel formats

    Of course, due to driver bugs etc., maybe you still need to collect the data. But you could try checking this stuff via glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS), unless you know everybody really just always has at least 1.2.
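
    Something like this rough sketch, say (needs <stdio.h> and <string.h>, assumes a current GL context, and carries the usual caveat that client headers may not match the server):

        const char *version = (const char *) glGetString (GL_VERSION);
        const char *exts    = (const char *) glGetString (GL_EXTENSIONS);
        int major = 1, minor = 0;
        if (version) sscanf (version, "%d.%d", &major, &minor);
        int gl12 = (major > 1 || (major == 1 && minor >= 2));
        /* packed formats: GL 1.2+, or the EXT (which lacks the _REV variants) */
        int have_packed     = gl12 || (exts && strstr (exts, "GL_EXT_packed_pixels"));
        int have_packed_rev = gl12;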

    • jwz says:

      I guess folks who don't have 1.2 won't be able to compile it at all, since the _REV symbols won't be defined?

      • nothings says:

        Hey, good point. Although it's technically possible to compile it with different headers from what the server actually supports, I guess that doesn't seem likely in practice.

        While I'm here...

        But, copying/reformatting all that data is slow. (Slow enough to matter, it turns out.) So instead, I'd like to just hand it to OpenGL directly, and say, "hey GL, this data is 16 bit BGR bigendian, deal."

        It seems like with < 24bits, the OpenGL driver is just going to convert in software as well, especially if it has to generate mipmaps. Although since you're asking for any-old format (see below) the driver might be able to skip converting and use it directly if it happened to be internally compatible.

        I guess it's also possible the driver's converter is significantly more optimal than your own.

        GLint internal_format = 3;   /* suggest that GL use 3 bytes internally */

        As I understand it, this comment is wrong; '3' is just a deprecated way of saying "GL_RGB", meaning "I want 3 components", and it doesn't imply anything about the size. "use 3 bytes internally" would be GL_RGB8.
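
        In other words (w, h, format, type and data are placeholders):

          glTexImage2D (GL_TEXTURE_2D, 0, 3,       w, h, 0, format, type, data);  /* "3 components", size unspecified */
          glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0, format, type, data);  /* explicitly 8 bits per component */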

        • jwz says:

          Although since you're asking for any-old format (see below) the driver might be able to skip converting and use it directly if it happened to be internally compatible.

          Right, my thinking was that the format that X gives me back by default is likely to be the native format of the video driver, and therefore, is going to be the format that GL would be converting everything to internally anyway.

      • hatter says:

        Stock Solaris with X doesn't seem to know about glx.h/glu.h, so it doesn't go anywhere, either. Tried on vaguely sane Sol8 and Sol9 on SPARC. After installing OpenGL 1.3 on Sol8/SPARC, it's all kinds of confused:

        24 bit display

        "this text is black" is ok
        "this text is red" is green
        "this text is green" is red
        "this text is blue" is black

        • hatter says:

          First time lucky fixing it, though: I removed all the conditionals, and:


          type = GL_UNSIGNED_INT_8_8_8_8_REV;
          format = GL_BGRA;

          gives the correct colours for everything. For completeness, the original code had "this is white" in yellow.

  5. jsbowden says:

    7:28am drake /home/jamie/test %gcc -o test-texture test-texture.c -I/usr/X11R6/include -L/usr/X11R6/lib -lX11 -lGL -lGLU

    7:28am drake /home/jamie/test %./test-texture

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 000000FF 0000FF00 00FF0000

    Endian: Big
    Visual: 0x47 TrueColor
    X Error of failed request: BadMatch (invalid parameter attributes)
    Major opcode of failed request: 1 (X_CreateWindow)
    Serial number of failed request: 6187
    Current serial number in output stream: 6191

    7:28am drake /home/jamie/test %uname -aR
    IRIX64 drake 6.5 6.5.25m 07080049 IP35

    7:29am drake /home/jamie/test %hinv
    2 800 MHZ IP35 Processors
    CPU: MIPS R16000 Processor Chip Revision: 2.2
    FPU: MIPS R16010 Floating Point Chip Revision: 2.2
    Main memory size: 2048 Mbytes
    Instruction cache size: 32 Kbytes
    Data cache size: 32 Kbytes
    Secondary unified instruction/data cache size: 4 Mbytes
    Integral SCSI controller 2: Version IDE (ATA/ATAPI) IOC4
    CDROM: unit 0 on SCSI controller 2
    Integral SCSI controller 0: Version QL12160, low voltage differential
    Disk drive: unit 1 on SCSI controller 0
    Integral SCSI controller 1: Version QL12160, low voltage differential
    IOC3/IOC4 serial port: tty3
    IOC3/IOC4 serial port: tty4
    Graphics board: V12
    Integral Gigabit Ethernet: tg0, module 001c01, PCI bus 1 slot 4
    Iris Audio Processor: version MAD revision 1, number 1
    IOC3/IOC4 external interrupts: 1

    I can display it to a remote Exceed session on my laptop but it looks like what zetawoof posted above.

  6. ralesk says:

    Hmm, so far the only form of depth 24 I've encountered was 4 bytes per pixel, RGB, no alpha. For x86, the Intel740 AGP 2x video card would be an excellent alpha-incapable test unit.

  7. gen_witt says:

    There is a (small) chance some of your BGR and BGRA problems come from not checking the server's supported OpenGL version and extensions. GL_BGRA was promoted from the GL_EXT_bgra extension in version 1.2.

    It's quite common for the headers available on the system not to match the server's implementation, one way or the other. For example, Linux ships with GL 1.3 headers regardless of host capabilities.

  8. feren says:

    Solaris 9 (aka SunOS 5.9, aka Solaris 2.9, aka whatever the hell else their marketing droids decide to name it on a whim) on an UltraSPARC-IIIi workstation...

    Last return status: 0
    jolsen@mcp [/tmp]
    $ uname -a
    SunOS mcp 5.9 Generic_112233-12 sun4u sparc SUNW,Sun-Blade-1500

    Last return status: 0
    jolsen@mcp [/tmp]
    $ ./test-texture

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 000000FF 0000FF00 00FF0000

    Endian: Big
    Visual: 0x2d TrueColor
    Texture: BGRA UNSIGNED_INT_8_8_8_8

    The color is all gimped up, with green being red, blue being green, red being black, and white being yellow. For your amusement, I have also supplied a screenshot.

  9. ultranurd says:

    I'm running Apple X11 on OS 10.3.7.

    Worked fine in "Millions of colors" and "Thousands of colors":


    Root: 1152 x 768 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 00FF0000 0000FF00 000000FF

    Endian: Little
    Visual: 0x36 TrueColor
    Xlib: connection to ":0.0" refused by server
    Xlib: No protocol specified

    Texture: BGRA UNSIGNED_INT_8_8_8_8

    Not so much with the working in "256 colors"...

    The terminal output is the same, but the X-forwarded image is wacky. Unfortunately I can't give you a screenshot, because the colors come out differently in the screen capture!

    The upper-left and lower-left backgrounds are white. The upper-right and lower-right backgrounds are blue.
    The first stripe fades from blue to cyan. The second stripe fades from blue to magenta. The third stripe is constant blue. The fourth stripe fades from blue to white.
    On the left, the text is blue, cyan, magenta, and blue. On the right, the text is white, cyan, magenta, blue (which is invisible on the background).

  10. ultranurd says:

    I think that OS X wants BGRA if it's coming from a Linux box. I've worked on a robot teleoperation interface the last two summers, where the robots are P3s running Linux, and I'm developing on my PBG4/400 running OS X. The robots have cameras and capture cards, and broadcast frames as arrays of unsigned chars.

    I've used both GTK+ and SDL as the basis of the interface in two different versions, and in both cases I had to reverse the ordering of the color bytes but leave the alpha byte where it is.

  11. mstyne says:
    feynman:~/src mstyne$ uname -a
    Darwin feynman.hq.voxel.net 7.7.0
    Darwin Kernel Version 7.7.0: Sun Nov 7 16:06:51 PST 2004;
    root:xnu/xnu-517.9.5.obj~1/RELEASE_PPC
    Power Macintosh powerpc

    feynman:~/src mstyne$ ./test-texture

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 00FF0000 0000FF00 000000FF

    Endian: Big
    Visual: 0x36 TrueColor
    Texture: BGRA UNSIGNED_INT_8_8_8_8

    [screenshot: "jwz OpenGL Test GFX" window]

  12. yakko says:

    Here's the screenshot on Solaris 8. The output from test-texture is:

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 000000FF 0000FF00 00FF0000

    Endian: Big
    Visual: 0x2c TrueColor
    Texture: BGRA UNSIGNED_INT_8_8_8_8

    I had to change line 434 to GL_UNSIGNED_INT_8_8_8_8_REV and line 435 to GL_RGBA, to come up with this screen, and this output:

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 000000FF 0000FF00 00FF0000

    Endian: Big
    Visual: 0x2c TrueColor
    Texture: RGBA UNSIGNED_INT_8_8_8_8_REV

  13. colonwq says:

    Here is the text output:

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = LSBFirst
    bits = LSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 00FF0000 0000FF00 000000FF

    Endian: Little
    Visual: 0x27 TrueColor
    disabling TCL support
    Texture: BGRA UNSIGNED_INT_8_8_8_8_REV

    The colors in the graphic looked correct.

  14. colonwq says:

    Displaying from a Solaris 2.9 04/04 box to a Debian sarge box:

    Root: 1280 x 1024 x 24
    Image: 640 x 480 x 24
    format = ZPixmap
    bytes = LSBFirst
    bits = LSBFirst
    pad = 32
    bpl = 2560
    bpp = 32
    rgb = 00FF0000 0000FF00 000000FF

    Endian: Big
    Visual: 0x27 TrueColor
    Texture: BGRA UNSIGNED_INT_8_8_8_8_REV

    The colors of the image are mixed:
    The white background is yellow.
    The red text is green.
    The green text is red.
    The blue text is black.
    The same goes for the four color bars.

    Rotated colors from an X app displayed to my desktop are not unusual; Veritas 3.4 has scrambled colors.

    :wq

  15. sunsetdriver says:

    I tried Linux/ARM, but there are no GLX extensions on that X server (and I assume no libs).


    client: linux/ppc; server: linux/ppc PASS
    Root: 1024 x 768 x 16
    Image: 640 x 480 x 16
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 1280
    bpp = 16
    rgb = 0000F800 000007E0 0000001F

    Endian: Big
    Visual: 0x23 TrueColor
    Texture: RGB UNSIGNED_SHORT_5_6_5

    client: linux/x86; server: linux/ppc FAIL
    Root: 1024 x 768 x 16
    Image: 640 x 480 x 16
    format = ZPixmap
    bytes = MSBFirst
    bits = MSBFirst
    pad = 32
    bpl = 1280
    bpp = 16
    rgb = 0000F800 000007E0 0000001F

    Endian: Little
    Visual: 0x23 TrueColor
    Texture: RGB UNSIGNED_SHORT_5_6_5


    client: linux/ppc; server: linux/x86 FAIL
    Root: 1280 x 1024 x 16
    Image: 640 x 480 x 16
    format = ZPixmap
    bytes = LSBFirst
    bits = LSBFirst
    pad = 32
    bpl = 1280
    bpp = 16
    rgb = 0000F800 000007E0 0000001F

    Endian: Big
    Visual: 0x23 TrueColor
    Texture: RGB UNSIGNED_SHORT_5_6_5

  16. strangehours says:

    [caveat: it's been a long time since I've looked at this stuff]

    I don't think this will work for any 8 or 16 bit formats, at the very least. *_REV means a format-defined bit shuffle in these cases, so it won't undo the endian difference.

    For 24 bit, will the X server ever add a pad byte? (I thought not.) OGL only understands 1-, 2- and 4-byte formats, so I can't see how that can ever work. I also think you need to interrogate the 32-bit X server pixel format to determine the R,G,B,A ordering. Assuming BGRA or ARGB may be incorrect, as previous posts have implied. ABGR is also supported by OGL, and, I would imagine, is thus a possibility for the X server too.

    see: http://www.opengl.org/documentation/specs/version1.2/1.2specs/packed_pixels.txt

    Maybe the best bet is to just do the byteswap by hand in cases where the endianness is wrong, but the pixel format is otherwise native, and fall back to old code for all the really hard cases (24 bit, really odd pixel formats)? AFAICT, the mismatch always implies that the screensaver is being run remotely, so the speed will be dominated by the network overhead rather than the byteswap.
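
    For the 32-bit case the by-hand swap is tiny anyway; a sketch (a 16-bit version would swap byte pairs instead):

        /* Undo a pure endianness mismatch by reversing each 4-byte pixel. */
        static void
        swap32 (unsigned char *p, unsigned long nbytes)
        {
          unsigned long i;
          for (i = 0; i + 3 < nbytes; i += 4) {
            unsigned char a = p[i], b = p[i+1];
            p[i]   = p[i+3];
            p[i+1] = p[i+2];
            p[i+2] = b;
            p[i+3] = a;
          }
        }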

    • jwz says:

      I'm happy doing it the "slow" way in weirdo cases like remote displays. My goal here is just to avoid having to reformat a lot of data in the normal, default case. It seems likely that X's default packing and OpenGL's default packing are going to be the same when they're both running on the same box, right? So I need to A) handle all such "normal" cases, and B) detect when we're in a "weird" case. So far, I don't fully understand how to do either.

      • strangehours says:

        I think the key here is checking the color masks from the XImage:

        This works for me under OSX, for 32 and 16 bit formats. It doesn't really change your code all that much, but it *does* mean that you can catch RGB555 format pixels. Also, GL_UNPACK_SWAP_BYTES does work for byte swapping any 16 or 32 bit packed pixel formats.


        struct {
          int depth;
          int byte_order;
          unsigned red_mask, green_mask, blue_mask;
          GLint type, format;
          int swap;
        } supported[] = {
          { 32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, GL_UNSIGNED_BYTE,              GL_BGRA, 0 },
          { 32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, GL_UNSIGNED_INT_8_8_8_8,       GL_BGRA, 1 },
          { 16, MSBFirst, 0x007c00, 0x0003e0, 0x00001f, GL_UNSIGNED_SHORT_1_5_5_5_REV, GL_RGBA, 0 },
          { 16, LSBFirst, 0x007c00, 0x0003e0, 0x00001f, GL_UNSIGNED_SHORT_1_5_5_5_REV, GL_RGBA, 1 },
          { 0 }
        };
        int i;
        int swap = 0;
        for (i = 0; i < sizeof (supported) / sizeof (supported[0]); ++i) {
          if (!supported[i].depth) abort ();  /* fell off the end: unhandled format */
          if (image->bits_per_pixel == supported[i].depth &&
              image->byte_order     == supported[i].byte_order &&
              image->red_mask       == supported[i].red_mask &&
              image->green_mask     == supported[i].green_mask &&
              image->blue_mask      == supported[i].blue_mask) {
            type   = supported[i].type;
            format = supported[i].format;
            swap   = supported[i].swap;
            break;
          }
        }
        if (swap) glPixelStorei (GL_UNPACK_SWAP_BYTES, 1);

        • strangehours says:

          However, it doesn't work for an OSX server, Linux client.

        • jwz says:

          Converted to a table like yours, the results so far are... anomalous:

          16, MSBFirst, 0x007c00, 0x0003e0, 0x00001f, BGRA, SHORT_1_5_5_5_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.5 NVIDIA-1.3.36, No, Big */
          16, MSBFirst, 0x007c00, 0x0003e0, 0x00001f, BGRA, SHORT_4_4_4_4_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.3 NVIDIA-1.3.36, No, Big */

          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8 /* IRIX IP32 6.5, Gentoo (X.Org 6.8.1.901 r0-0.3.3) 11.0 60801901, 1.5.2, No, Big */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8 /* SunOS sun4u 5.9, Gentoo (X.Org 6.8.1.901 r0-0.3.3) 11.0 60801901, 1.5.2, No, Big */

          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Linux i686 2.6.10-1-k7, XFree 11.0 40300001, 1.2 Mesa 4.0.4, Yes, Little */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Linux i686 2.6.6, XFree 11.0 40300001, 1.3 Mesa 4.0.4, No, Little */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Linux i686 2.6.9-2-686, XFree 11.0 40300001, 1.3 Mesa 4.0.4, No, Little */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Linux i686 2.6.10-1.737_FC3, X.Org 11.0 60801000, Yes, Little */

          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, BYTE /* Linux parisc 2.4.18-pa13, XFree 11.0 40300001, 1.3 Mesa 4.0.4, No, Big */

          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8 /* IRIX64 IP27 6.5, X.Org 11.0 60700000, 1.5.2 NVIDIA 66.29, No, Big */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8 /* SunOS sun4u 5.8, X.Org 11.0 60700000, 1.5.2 NVIDIA 66.29, No, Big */


          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8_REV /* Linux i686 2.4.20-18.9smp, XFree 11.0 40300000, 1.4.1, Yes, Little */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8_REV /* Linux i686 2.4.22-gg9, XFree 11.0 40300000, 1.5.1, Yes, Little */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8_REV /* Linux i686 2.6.9-1-686, XFree 11.0 40300001, 1.2 Mesa 4.0.4, Yes, Little */
          32, LSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8_REV /* Linux x86_64 2.6.10-1.9_FC2smp, X.Org 11.0 60700000, 1.5.2, Yes, Little */

          32, MSBFirst, 0x0000ff, 0x00ff00, 0xff0000, RGBA, INT_8_8_8_8_REV /* SunOS sun4u 5.8, Sun Microsystems, Inc. 11.0 6410, 1.3 Sun OpenGL 1.3, No, Big */

          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Linux ppc 2.6.10-2-powerpc, X.Org 11.0 60801099, 1.2 (1.5 Mesa 6.2.1), Yes, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.1 ATI-1.3.26, No, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.3 ATI-1.3.36, No, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.3 NVIDIA-1.3.36, No, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.5 ATI-1.3.36, No, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.5 NVIDIA-1.3.36, No, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, BGRA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.5 NVIDIA-1.3.36, No, Big */
          32, MSBFirst, 0xff0000, 0x00ff00, 0x0000ff, RGBA, INT_8_8_8_8_REV /* Mac 7.7.0, XFree 11.0 40300000, 1.1 ATI-1.3.26, No, Big */
          • strangehours says:

            I wonder if there's a bit of a problem here:

            If the R,G,B components in your image are swapped, all that happens is that the 'This is red' text is blue, and vice versa. The image still appears superficially to be correct. This fooled me a couple of times.

            The second 16 bit entry is almost surely wrong, as the sizes of the bitfields don't match the sizes of the masks.

  17. strangehours says:

    This may or may not be slow (and, depending upon where the problems you're having lie, might also not work), but you could try:

    GLXPixmap glXCreateGLXPixmap( Display *dpy, XVisualInfo *vis, Pixmap pixmap);

    to create a pixmap with the same depth/visual as your source image, which you can blit to via X and read from using glReadPixels. That way you might be able to convince OpenGL to make all the choices for you (you just specify the visual of the source data, and the format/type of the data to read back and then blat to a texture).
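
    Roughly like this (an untested sketch; dpy, vis, ctx, gc, image, w, h and the w*h*4-byte buf are all assumed to exist, and error handling is omitted):

        Pixmap xpix = XCreatePixmap (dpy, RootWindow (dpy, screen),
                                     w, h, vis->depth);
        GLXPixmap gpix = glXCreateGLXPixmap (dpy, vis, xpix);
        glXMakeCurrent (dpy, gpix, ctx);              /* read off-screen */
        XPutImage (dpy, xpix, gc, image, 0, 0, 0, 0, w, h);
        XSync (dpy, False);
        glReadPixels (0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buf);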

    • jwz says:

      I can't make that work; I get a BadMatch when I try to glXMakeCurrent (dpy, glx_pixmap, glxc). I'm using the same XVisualInfo that I used with glXCreateContext.