- test-texture3.c
gcc -o test-texture3 test-texture3.c -I/usr/X11R6/include -L/usr/X11R6/lib -lX11 -lGL -lGLU -lm
It should look like this:

Please tell me whether it looks right, and what it printed to stdout.
If at all possible, try it on a variety of machines and in a variety of bit depths. Also try running on one machine and displaying on another.
I'm reasonably certain that some of you lied to me last time, so please be careful! If the text that says "RED" is not red, that's wrong! Perhaps it would be easier to look at the rainbow at the bottom (ROYGBIV = red, orange, yellow, green, blue, indigo, violet, for those of you who have been out of elementary school for a while).
I haven't gotten any responses from folks running in packed-24 or TrueColor-8, so if you can manage that, it would be helpful. I'm pretty sure one can coax SGIs to do those, at least.
Don't worry if the font size is off; it's just the colors that matter.
Think an O2 will do it? I hope I've got a compiler on mine, else I'll pass. I'm not ready to face the hell that is inst anytime soon.
What, you don't like inst? I mean, have you /tried/ other package managers? As far as I'm concerned, inst just works, and it works great.
My eyelid is twitching savagely. I hope you're happy.
Gotta say I agree. Inst still sucks, but it mostly sucks less. I've got a couple hundred machines running via RoboInst like gangbusters, and I can bring the freaking things all the way to the PROM to do updates from the comfort of my office if I want to. Contrast that with the amount of time and effort involved in running our RedHat machines using RPM and up2date. Linux can eat a bowl of dick.
Inst is one of the things I was hoping SGI might open source. I should send somebody email about that. Not even apt has got RoboInst-like functionality.
Okay, at work at the moment, but I will definitely check this when I get home... in about 8 hours...
B
Works flawlessly on OS X (in 16- and 24-bit modes). I'm going to test x86 Linux -> OS X in a minute.
wtf does "x86 Linux -> OS X" refer to?
x86 Linux host machine displaying via network on an OS X client machine.
Or, in X notation, the x86 Linux machine is running the X client displaying on OS X's X11 server.
Solaris 8 on Sparc, displayed on Solaris 8 on Sparc:
System: SunOS sun4u 5.8
Server: Sun Microsystems, Inc. 11.0 6410
OpenGL: 1.2 Sun OpenGL 1.2 patch 108131-03 for Solaris
Packed: No
EXT_A: Yes, No
Endian: Big
Root: 1152 x 900 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = MSBFirst
bits = MSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x22 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
The colors look correct, but the text is too large (http://wiki.loopysoft.com/moin.cgi/TestForJwz?action=show). Otherwise, all is well.
I get the same output and text size issue
-cb
What are you working on? More screen saver stuff?
Basically, I'm trying to speed up glslideshow. See "What's the big idea?" in the first post.
It would be interesting to see how this correlates with the format specified by OES_read_format, which newish Mesa now supports. On the hardware I have it's an exact match (sample output from a Matrox G400 on Linux):
I tweaked your code slightly to get that last line. Unfortunately I don't think OES_read_format is available in anything except Mesa CVS yet.
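In case it's useful, the tweak amounts to asking the driver directly once a context is current. Here's a rough sketch rather than my exact diff: the function name and the fallback #defines (for gl.h headers that predate the extension) are made up, and it assumes the stdio/string/GL includes already in test-texture3.c.

#ifndef GL_IMPLEMENTATION_COLOR_READ_TYPE_OES
# define GL_IMPLEMENTATION_COLOR_READ_TYPE_OES   0x8B9A
# define GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES 0x8B9B
#endif

/* Print the glReadPixels format/type pair the implementation prefers,
   if the GL_OES_read_format extension is advertised. */
static void
print_oes_read_format (void)
{
  const char *exts = (const char *) glGetString (GL_EXTENSIONS);
  GLint format = 0, type = 0;
  if (!exts || !strstr (exts, "GL_OES_read_format"))
    {
      printf ("OES_read_format: not supported\n");
      return;
    }
  glGetIntegerv (GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES, &format);
  glGetIntegerv (GL_IMPLEMENTATION_COLOR_READ_TYPE_OES,   &type);
  printf ("OES_read_format: format = 0x%04X  type = 0x%04X\n",
          (unsigned) format, (unsigned) type);
}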
Fine (modulo a slight twiddle to compile on non-GNU cc) on Linux, IRIX, Solaris, and AIX.
System: Linux x86_64 2.6.10-1.9_FC2smp
Server: The X.Org Foundation 11.0 60700000
OpenGL: 1.5.2 NVIDIA 66.29
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
--
System: IRIX64 IP27 6.5
Server: The X.Org Foundation 11.0 60700000
OpenGL: 1.5.2 NVIDIA 66.29
Packed: No
EXT_A: No, No
Endian: Big
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 1
--
System: SunOS sun4u 5.8
Server: The X.Org Foundation 11.0 60700000
OpenGL: 1.5.2 NVIDIA 66.29
Packed: No
EXT_A: No, No
Endian: Big
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 1
--
System: AIX 0031D60B4C00 2
Server: The X.Org Foundation 11.0 60700000
OpenGL: 1.5.2 NVIDIA 66.29
Packed: No
EXT_A: No, No
Endian: Big
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 1
Oh, if this is any good, I cheated and used vncserverGL -depth N to create a software X server, and tested it displaying to that as well. All looked OK (if a bit on the dithertastic side); results posted here.
Looks fine with an AIX 4.3 client and XFree86 on Cygwin server:
The _REV texture type didn't work with the previous one, but now this swap bit is set...
I have a Matrox G400. Redhat FC3 with no updates. DISPLAY=":0.0"
The default install puts DefaultDepth 24 in xorg.conf. I get:
and a white window with nothing in it
If I make it DefaultDepth 16 in the xorg.conf then the texture gets displayed perfectly (reds are actually red) and the following is shown on stdout:
In the "white window" case, do other GL programs work at all?
Some do and some don't. All print out errors:
From xscreensaver 4.18, which is in Fedora Core 3:
glsnake, glblur and glforestfire all work. glmatrix creates a window, kills the window, and fails with this error message:
in addition to the error messages above
Other gl programs like glxgears and glxinfo work.
feynman:~/src mstyne$ ./test-texture3
System: Darwin Power Macintosh 7.7.0
Server: The XFree86 Project, Inc 11.0 40300000
OpenGL: 1.5 NVIDIA-1.3.36
Packed: No
EXT_A: Yes, Yes
Endian: Big
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = MSBFirst
bits = MSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x36 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
This in 32-bit mode:
This in 16-bit mode:
X11 still refuses to play nice in 256 color mode:
Worked for me.
System: CYGWIN_NT-5.1 i686 1.5.12(0.116/4/2)
Server: The Cygwin/X Project 11.0 60801000
OpenGL: 1.2 (1.5 Mesa 6.1)
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1600 x 1200 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x22 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
Works fine on my system.
System: Linux i686 2.6.8.1-10mdk
Server: Mandrakelinux (X.Org X11 6.7.0, patch level 2mdk) 11.0 60700000
OpenGL: 1.3 Mesa 5.0.2
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1024 x 768 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x29 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
Works fine on Mac OS X 10.3.8.
Doesn't compile on Digital Unix. If you're interested I can give more details.
Why didn't it compile? missing math.h and -lm, or something else?
You made me think about Xlib. Ow.
Compiled and run on Xsgi, looks great, outputs the following:
System: IRIX64 IP30 6.5
Server: Silicon Graphics 11.0 6600
OpenGL: 1.1 Irix 6.5
Packed: Yes
EXT_A: Yes, No
Endian: Big
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = MSBFirst
bits = MSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 000000FF 0000FF00 00FF0000
Visual: 0x2b TrueColor
Texture: ??? UNSIGNED_INT_8_8_8_8 0
Needs a math.h include and -lm to compile on IRIX. Also, I was getting a BadMatch exception when trying to run it. Specifying a value of '0' for depth in the call to XCreateWindow() doesn't work on Xsgi. I changed that arg to vi->depth and it worked. If I get a minute (or even better if you get a minute) we could whip up some glXChooseVisual() hoodoo that'll cycle through every available visual (18 on this Octane!) and ask Yes or No.
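Something along these lines is what I'm picturing for the visual-cycling part (an untested sketch with a made-up function name; it only lists the GL-capable visuals rather than actually creating a context and texture on each, and assumes the Xlib/GLX headers the program already pulls in):

/* Walk every visual on the screen and report the ones GLX can render to. */
static void
list_gl_visuals (Display *dpy, int screen)
{
  XVisualInfo tmpl, *visuals;
  int i, nvis, use_gl;
  tmpl.screen = screen;
  visuals = XGetVisualInfo (dpy, VisualScreenMask, &tmpl, &nvis);
  for (i = 0; i < nvis; i++)
    {
      if (glXGetConfig (dpy, &visuals[i], GLX_USE_GL, &use_gl) == 0 && use_gl)
        printf ("visual 0x%02lx  depth %2d: GL-capable\n",
                (unsigned long) visuals[i].visualid, visuals[i].depth);
    }
  if (visuals) XFree (visuals);
}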
Also, hooray for real X servers.
Just figured out why 0 doesn't work in the XCreateWindow() call: xdpyinfo(1) reports visual 0x2b:
visual:
visual id: 0x2b
class: TrueColor
depth: 15 planes
available colormap entries: 32 per subfield
red, green, blue masks: 0x1f, 0x3e0, 0x7c00
significant bits in color specification: 8 bits
The root window is running in visual 0x2d:
visual:
visual id: 0x2d
class: TrueColor
depth: 24 planes
available colormap entries: 256 per subfield
red, green, blue masks: 0xff, 0xff00, 0xff0000
significant bits in color specification: 8 bits
depth 0 to XCreateWindow is CopyFromParent. The hoodoo in your program that chooses a visual ends up with 0x2b from Xsgi, and then asks it to CopyFromParent for the depth, which is 24 bits and unsupported by the 0x2b visual. If I were running the root window at Xsgi's default depth of 8 I'd never have noticed. Probably why SGIs still ship with a root window depth of 8, so CopyFromParent always succeeds no matter what the application does. Ow. X.
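For what it's worth, the change amounts to roughly this. It's a sketch, not the exact patch: "vi" is the XVisualInfo* that the visual-choosing hoodoo returns, dpy/screen/win come from the surrounding code, and the colormap/border-pixel lines may already be there in some form; they're included so the fragment stands on its own.

XSetWindowAttributes attrs;
unsigned long attrmask = CWColormap | CWBorderPixel;
attrs.colormap = XCreateColormap (dpy, RootWindow (dpy, screen),
                                  vi->visual, AllocNone);
attrs.border_pixel = 0;    /* needed to avoid BadMatch with a non-root visual */
win = XCreateWindow (dpy, RootWindow (dpy, screen),
                     0, 0, 420, 300, 0,   /* window size: whatever the program uses */
                     vi->depth,           /* instead of 0 / CopyFromParent */
                     InputOutput, vi->visual,
                     attrmask, &attrs);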
Thanks, patched...
It's pretty close on my P4m laptop with a Radeon Mobility 7500. I do have my X dpi set to 125; my 14" screen runs at 1400x1050. The text doesn't quite fit where it's supposed to be. Here's the output:
burner@phoenix:~$ ./test-texture3
System: Linux i686 2.6.10-1-686
Server: The XFree86 Project, Inc 11.0 40300001
OpenGL: 1.2 Mesa 4.0.4
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1400 x 1050 x 16
Image: 420 x 300 x 16
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 840
bpp = 16
rgb = 0000F800 000007E0 0000001F
Visual: 0x27 TrueColor
Texture: RGB UNSIGNED_SHORT_5_6_5 0
Screenshot here: http://www.core.binghamton.edu/~burner/no_cigar.png
Results from an athlon/ubuntu/xorg client to my laptop's Xserver: (picture looks the same as parent message)
burner@firestorm:~$ ./test-texture3
System: Linux i686 2.6.10-3-k7
Server: The XFree86 Project, Inc 11.0 40300001
OpenGL: 1.3 Mesa 4.0.4
Packed: No
EXT_A: No, No
Endian: Little
Root: 1400 x 1050 x 16
Image: 420 x 300 x 16
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 840
bpp = 16
rgb = 0000F800 000007E0 0000001F
Visual: 0x27 TrueColor
Texture: RGB UNSIGNED_SHORT_5_6_5 0
From my p4m/debian/xfree86 laptop to my athlon/nvidia/xorg client (image looks the same as in parent message):
burner@firestorm:~$ ssh phoenix
Linux phoenix 2.6.10-1-686 #1 Tue Jan 18 04:34:19 EST 2005 i686 GNU/Linux
You have mail.
Last login: Thu Feb 10 18:59:25 2005 from firestorm.burner
burner@phoenix:~$ ./test-texture3
System: Linux i686 2.6.10-1-686
Server: The X.Org Foundation 11.0 60801902
OpenGL: 1.5.2 NVIDIA 66.29
Packed: No
EXT_A: No, No
Endian: Little
Root: 1600 x 1200 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
And running locally on my athlon/nvidia/xorg/ubuntu box. Since the picture still looks the same as above, perhaps it has nothing to do with the 125 dpi setting on my laptop...
burner@firestorm:~$ ./test-texture3
System: Linux i686 2.6.10-3-k7
Server: The X.Org Foundation 11.0 60801902
OpenGL: 1.5.2 NVIDIA 66.29
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1600 x 1200 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
On my desktop box I get a solid black window:
System: Linux i686 2.6.3-15mdk
Server: Mandrake Linux (XFree86 4.3, patch level 30mdk) 11.0 40300001
OpenGL: 1.2 Mesa 4.0.4
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1600 x 1200 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x25 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
=====
On the Linux box next to me it works as advertised at 24bpp:
Xlib: extension "XFree86-DRI" missing on display ":0.0".
System: Linux i686 2.6.3-7mdk
Server: Mandrake Linux (XFree86 4.3, patch level 32.3.100mdk) 11.0 40300001
OpenGL: 1.3 Mesa 4.0.4
Packed: No
EXT_A: Yes, No
Endian: Little
Root: 1600 x 1200 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x23 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
==
and at 16bpp:
Xlib: extension "XFree86-DRI" missing on display ":0.0".
System: Linux i686 2.6.3-7mdk
Server: Mandrake Linux (XFree86 4.3, patch level 32.3.100mdk) 11.0 40300001
OpenGL: 1.3 Mesa 4.0.4
Packed: No
EXT_A: Yes, No
Endian: Little
Root: 1600 x 1200 x 16
Image: 420 x 300 x 16
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 840
bpp = 16
rgb = 0000F800 000007E0 0000001F
Visual: 0x23 TrueColor
Texture: RGB UNSIGNED_SHORT_5_6_5 0
System: Linux i686 2.4.20-18.9smp
Server: The XFree86 Project, Inc 11.0 40300000
OpenGL: 1.4.1 NVIDIA 53.36
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1600 x 1200 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
I had the same kinda bloated text, but the colours were fine.
System: Linux i686 2.4.20-19.9.lair.1smp
Server: The XFree86 Project, Inc 11.0 40300000
OpenGL: 1.5.1 NVIDIA 61.11
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 2560 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x21 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
System: Linux i686 2.6.7
Server: The X.Org Foundation 11.0 60801000
OpenGL: 1.2 Mesa 6.1
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 1024 x 768 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x27 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
Works correctly with Linux on a Powerbook G4.
Works great on my laptop. Here is the info:
System: Linux ppc 2.6.10-2-powerpc
Server: The X.Org Foundation 11.0 60801099
OpenGL: 1.2 (1.5 Mesa 6.2.1)
Packed: Yes
EXT_A: Yes, Yes
Endian: Big
Root: 1024 x 768 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = MSBFirst
bits = MSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 00FF0000 0000FF00 000000FF
Visual: 0x23 TrueColor
Texture: BGRA UNSIGNED_INT_8_8_8_8_REV 0
And on my workstation, uglier:
System: Linux ppc 2.2.20-pmac
Server: The XFree86 Project, Inc 11.0 40201000
OpenGL: 1.4 Mesa 5.0
Packed: Yes
EXT_A: Yes, Yes
Endian: Big
Root: 1152 x 864 x 15
Image: 420 x 300 x 15
format = ZPixmap
bytes = MSBFirst
bits = MSBFirst
pad = 32
bpl = 840
bpp = 16
rgb = 00007C00 000003E0 0000001F
Visual: 0x23 TrueColor
Texture: BGRA UNSIGNED_SHORT_1_5_5_5_REV 0
Heck if I know. Here is the long output; sorry it's so long, feel free to delete:
test-texture3.c: In function `test_window':
test-texture3.c:322: parse error before `glarea'
test-texture3.c:323: `glarea' undeclared (first use in this function)
test-texture3.c:323: (Each undeclared identifier is reported only once
test-texture3.c:323: for each function it appears in.)
test-texture3.c:344: parse error before `struct'
test-texture3.c:83: warning: unused variable `swap'
test-texture3.c:82: warning: unused variable `type'
test-texture3.c:81: warning: unused variable `format'
test-texture3.c:66: warning: unused variable `i'
test-texture3.c:66: warning: unused variable `status'
test-texture3.c: At top level:
test-texture3.c:348: warning: type defaults to `int' in declaration of `supported'
test-texture3.c:349: warning: braces around scalar initializer
test-texture3.c:349: warning: (near initialization for `supported[0]')
test-texture3.c:349: warning: excess elements in scalar initializer
test-texture3.c:349: warning: (near initialization for `supported[0]')
test-texture3.c:349: warning: excess elements in scalar initializer
test-texture3.c:349: warning: (near initialization for `supported[0]')
test-texture3.c:349: warning: excess elements in scalar initializer
test-texture3.c:349: warning: (near initialization for `supported[0]')
test-texture3.c:350: `GL_UNSIGNED_BYTE_3_3_2' undeclared here (not in a function)
test-texture3.c:350: warning: excess elements in scalar initializer
test-texture3.c:350: warning: (near initialization for `supported[0]')
test-texture3.c:350: warning: excess elements in scalar initializer
test-texture3.c:350: warning: (near initialization for `supported[0]')
It goes on a lot more.
I switched to 800x600 so I could read the text. The window matched your sample image just fine. The red was red and not magenta. :P Here's my output:
user@host:~/bin> gcc -o test-texture3 test-texture3.c -I/usr/X11R6/include -L/usr/X11R6/lib -lX11 -lGL -lGLU -lm
user@host:~/bin> ./test-texture3
System: Linux i686 2.6.5-7.145-default
Server: The XFree86 Project, Inc 11.0 40399902
OpenGL: 1.2 (1.4 Mesa 5.0.2)
Packed: Yes
EXT_A: Yes, Yes
Endian: Little
Root: 800 x 600 x 16
Image: 420 x 300 x 16
format = ZPixmap
bytes = LSBFirst
bits = LSBFirst
pad = 32
bpl = 840
bpp = 16
rgb = 0000F800 000007E0 0000001F
Visual: 0x24 TrueColor
Texture: RGB UNSIGNED_SHORT_5_6_5 0
X connection to :0.0 broken (explicit kill or server shutdown).
11:53am drake /home/jamie/test %./test-texture3
System: IRIX64 IP35 6.5
Server: Silicon Graphics 11.0 6600
OpenGL: 1.2 Irix 6.5
Packed: Yes
EXT_A: Yes, No
Endian: Big
Root: 1280 x 1024 x 24
Image: 420 x 300 x 24
format = ZPixmap
bytes = MSBFirst
bits = MSBFirst
pad = 32
bpl = 1680
bpp = 32
rgb = 000000FF 0000FF00 00FF0000
Visual: 0x47 TrueColor
Texture: ??? UNSIGNED_INT_8_8_8_8 0
11:53am drake /home/jamie %uname -aR
IRIX64 drake 6.5 6.5.25m 07080049 IP35
11:54am drake /home/jamie %hinv
2 800 MHZ IP35 Processors
CPU: MIPS R16000 Processor Chip Revision: 2.2
FPU: MIPS R16010 Floating Point Chip Revision: 2.2
Main memory size: 2048 Mbytes
Instruction cache size: 32 Kbytes
Data cache size: 32 Kbytes
Secondary unified instruction/data cache size: 4 Mbytes
Integral SCSI controller 2: Version IDE (ATA/ATAPI) IOC4
CDROM: unit 0 on SCSI controller 2
Integral SCSI controller 0: Version QL12160, low voltage differential
Disk drive: unit 1 on SCSI controller 0
Integral SCSI controller 1: Version QL12160, low voltage differential
IOC3/IOC4 serial port: tty3
IOC3/IOC4 serial port: tty4
Graphics board: V12
Integral Gigabit Ethernet: tg0, module 001c01, PCI bus 1 slot 4
Iris Audio Processor: version MAD revision 1, number 1
IOC3/IOC4 external interrupts: 1
Looks like it should, displays natively with no problems, and now crashes Exceed like clockwork if I try to display back to my laptop.