Please compile and run this program on any interesting architectures you have access to:

gcc -o test-texture test-texture.c -I/usr/X11R6/include -L/usr/X11R6/lib -lX11 -lGL -lGLU
Help me fill in the question marks:
|                        | Depth 32 | Depth 24 | Depth 16 | Depth 8 |
| Linux x86 -> OSX PPC:  |    ?     |    ?     |    ?     |   ok    |
| Linux x86 -> Sparc:    |    ?     |    ?     |    ?     |    ?    |
| Sparc -> Linux x86:    |    ?     |    ?     |    ?     |    ?    |
| OSX PPC -> Linux x86:  |    ?     |    ?     |    ?     |   BAD   |
Tell me what it printed to stdout, and whether the colors in the window were right. It should look like this: [screenshot of the test window]
"Linux x86 -> OSX PPC" means "program is running on Linux x86, with $DISPLAY pointing at an X server running on OSX PPC".
"Depth 32" means 4 bytes per pixel.
"Depth 24" means 3 bytes per pixel (this is less common.) If your video driver supports it at all, you might need to turn it on by adding DefaultFbBpp 24 and/or Option "Pixmap" "24" to xorg.conf or XF86Config.
"Depth 8" means TrueColor, not PseudoColor / colormapped. Visual "TrueColor" will probably be needed.
What's the big idea?
I've got this image data that came from the X server (via XGetImage), and it's in some arbitrary format: it might be in any bit order, byte order, and number of bits per pixel, depending on the whims of the architecture and the video driver. I want to feed it to OpenGL for use as a texture.
Currently, I do this by iterating over the X data and constructing 32-bit RGBA native-endian data to hand to glTexImage2D or gluBuild2DMipmaps. But copying and reformatting all that data is slow. (Slow enough to matter, it turns out.) So instead, I'd like to just hand it to OpenGL directly and say, "hey GL, this data is 16-bit BGR big-endian, deal."
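For concreteness, here's a minimal sketch of that slow path. This is not the actual test program: the mask_shift helper is mine, and it punts on PseudoColor by assuming a TrueColor visual with channel masks.

    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>
    #include <GL/gl.h>
    #include <GL/glu.h>

    /* How far to shift a channel mask to right-justify it. */
    static int
    mask_shift (unsigned long mask)
    {
      int i = 0;
      while (mask && !(mask & 1))
        mask >>= 1, i++;
      return i;
    }

    /* Convert an XImage (from a TrueColor visual) to 8/8/8/8 RGBA.
       XGetPixel() hides the image's depth, bit order and byte order
       from us -- which is exactly why this is slow: a function call
       plus arithmetic for every single pixel. */
    static unsigned char *
    ximage_to_rgba (XImage *xim, Visual *v)
    {
      int rs = mask_shift (v->red_mask);
      int gs = mask_shift (v->green_mask);
      int bs = mask_shift (v->blue_mask);
      unsigned long rm = v->red_mask   >> rs;
      unsigned long gm = v->green_mask >> gs;
      unsigned long bm = v->blue_mask  >> bs;
      unsigned char *rgba = malloc (xim->width * xim->height * 4);
      unsigned char *o = rgba;
      int x, y;
      for (y = 0; y < xim->height; y++)
        for (x = 0; x < xim->width; x++)
          {
            unsigned long p = XGetPixel (xim, x, y);
            *o++ = 255 * ((p & v->red_mask)   >> rs) / rm;   /* R */
            *o++ = 255 * ((p & v->green_mask) >> gs) / gm;   /* G */
            *o++ = 255 * ((p & v->blue_mask)  >> bs) / bm;   /* B */
            *o++ = 255;                                      /* A */
          }
      return rgba;
    }

    /* ...and then:
       gluBuild2DMipmaps (GL_TEXTURE_2D, GL_RGBA, xim->width, xim->height,
                          GL_RGBA, GL_UNSIGNED_BYTE, rgba);
     */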
I'm trying to figure out how to express that to OpenGL. I'm having a hard time, because it's very poorly documented. Thus, this test program.
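The kind of incantation I'm hoping exists looks roughly like this, using GL's pixel-store parameters and the OpenGL 1.2 packed-pixel types. upload_directly is a hypothetical sketch for the 16-bit BGR case above, not confirmed working code:

    #include <X11/Xlib.h>
    #include <GL/gl.h>

    /* Hand a 16-bpp BGR XImage straight to GL, describing its layout
       instead of rewriting it.  GL_UNSIGNED_SHORT_5_6_5_REV means blue
       in the high 5 bits and red in the low 5; GL_UNPACK_SWAP_BYTES
       makes GL byte-swap each 16-bit pixel when the server's
       endianness differs from the client's. */
    static void
    upload_directly (XImage *xim)
    {
      union { int i; char c[sizeof (int)]; } u;
      int client_lsb_first;
      u.i = 1;
      client_lsb_first = u.c[0];

      glPixelStorei (GL_UNPACK_ALIGNMENT, 1);
      /* Row stride, in pixels, in case lines are padded. */
      glPixelStorei (GL_UNPACK_ROW_LENGTH,
                     xim->bytes_per_line / (xim->bits_per_pixel / 8));
      glPixelStorei (GL_UNPACK_SWAP_BYTES,
                     (xim->byte_order == LSBFirst) != client_lsb_first);

      glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB,
                    xim->width, xim->height, 0,
                    GL_RGB, GL_UNSIGNED_SHORT_5_6_5_REV, xim->data);
    }

A 32-bit image would presumably want GL_BGRA plus GL_UNSIGNED_INT_8_8_8_8 or GL_UNSIGNED_INT_8_8_8_8_REV instead. Part of what the test program is for is finding out which of these combinations actually work where.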
Update: Please try the new version here.