We've got the hardware mostly built, and I believed the software to be done, until we actually tried it out in the club and discovered that, yes, the iSight camera is a piece of shit. It's terrible in low light where "low" means "less bright than the surface of the sun". It's so bad that even using a spotlight in the booth won't help, because it would have to be so bright that it would illuminate the whole room. And possibly cause skin cancer.
So one option is to get a Firewire DV camcorder that is good in low light and use that. But I don't know which, and I don't have one, and I don't know if that'd be good enough in low light either. (I'm guessing "probably not": even those Sony Nightshots we use for the webcast cameras aren't exactly "photo quality" in the dark.)
The other option is to use a digital still camera with a flash, which is what Photoboof uses (running on Windows). The trick there is that you want the photo camera to behave like a video camera by giving you frames continuously, and only fire the flash when requested.
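Just so we're all picturing the same thing, the loop we need is something like this. Every function here is a made-up placeholder; it's only the shape of the thing that matters:

    /* Toy sketch of the control loop the booth needs -- every function here
       is a made-up stand-in; the point is just the shape: preview frames all
       the time, flash only when asked for. */
    #include <stdio.h>
    #include <unistd.h>

    static int  grab_preview_frame(void)     { return 0; }       /* stand-in: small live JPEG from the camera */
    static void show_frame(int frame)        { (void) frame; }   /* stand-in: blit it to the kiosk screen */
    static int  button_pressed(void)         { return 0; }       /* stand-in: the big red booth button */
    static void flash_capture_and_save(void) { puts("click: flash fires, full-res shot saved"); }

    int main(void)
    {
        for (;;) {
            show_frame(grab_preview_frame());   /* live view, maybe 10fps, no flash */
            if (button_pressed())
                flash_capture_and_save();       /* only now does the flash go off */
            usleep(100 * 1000);                 /* roughly ten frames a second */
        }
    }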
We've got this groundscore Canon PowerShot S30 that we were thinking of using, and when you use Canon's RemoteCapture software, it does exactly what we need: it shows live video (at maybe 10FPS), and when you click, it saves a flash picture, all without a CF card being involved.
But, we can't just use RemoteCapture, because it's a hairy UI (not "kiosk-y" at all), and it's not AppleScriptable. Still, it shows that this camera hardware is at least capable of doing what we need...
netik fought with gphoto2 for a while, and found that it can fire the camera and get the picture out, but it can't do video. Me, I can't even get gphoto2 to build on my iMac. DarwinPorts has libgphoto2, but it's two years old, and apparently the oldest gphoto2 that is still available doesn't work with that version of the library. Or something. I have an instinctual aversion to this software anyway; it has the Linux Stink on it in a big way.
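For reference, the "fire the camera and get the picture out" part that netik got working boils down to roughly this through the libgphoto2 C API -- a sketch, assuming a library new enough to actually talk to the S30, which is exactly what DarwinPorts doesn't have:

    /* Rough sketch of the libgphoto2 capture-and-download path.
       Error checking mostly omitted; link against libgphoto2. */
    #include <stdio.h>
    #include <gphoto2/gphoto2-camera.h>

    int main(void)
    {
        GPContext *ctx = gp_context_new();
        Camera *cam;
        gp_camera_new(&cam);

        /* Autodetects the first USB camera (the S30, with luck). */
        if (gp_camera_init(cam, ctx) < GP_OK) {
            fprintf(stderr, "no camera found\n");
            return 1;
        }

        /* Trip the shutter; the camera reports back where it put the frame. */
        CameraFilePath path;
        gp_camera_capture(cam, GP_CAPTURE_IMAGE, &path, ctx);

        /* Pull that frame over USB and write it locally -- no CF card dance. */
        CameraFile *file;
        gp_file_new(&file);
        gp_camera_file_get(cam, path.folder, path.name,
                           GP_FILE_TYPE_NORMAL, file, ctx);
        gp_file_save(file, "booth-shot.jpg");

        gp_file_unref(file);
        gp_camera_exit(cam, ctx);
        return 0;
    }

Nothing in there is flash-specific, by the way: presumably the flash fires or doesn't according to whatever flash mode the camera itself is set to, same as pressing the shutter button. And none of it gets you live video.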
So then I thought I'd try doing it by hand with libptp, and just hack out the raw commands to the camera that way. But, on MacOS, that just dumps core at startup. I patched around that, but now I can't make any sense out of the data coming out of libusb. Like, you get this list of USB busses and those have a list of devices on them (sensible enough). But the numbers in these device structures -- vendor ID, product ID, etc. -- have no correlation to the vendor and product IDs that are printed by System Profiler! The camera shows up in System Profiler, but when I'm looking at the data structures in ptpcam.c, I can't even figure out which of the device structs represents the camera. Let alone why ptpcam passes it by as if it's not a camera at all.
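The bus walk itself isn't rocket science; boiled down, what ptpcam is doing with the old libusb 0.1 API is something like this (Canon's vendor ID is supposedly 0x04A9, so that's what I'd expect to see somewhere in the list):

    /* The libusb 0.1 bus/device walk, boiled down. Build against the old
       libusb (-lusb). Prints IDs in hex, since that's how System Profiler
       shows them -- a naive printf in decimal would look like gibberish. */
    #include <stdio.h>
    #include <usb.h>

    int main(void)
    {
        usb_init();
        usb_find_busses();
        usb_find_devices();

        for (struct usb_bus *bus = usb_get_busses(); bus; bus = bus->next) {
            for (struct usb_device *dev = bus->devices; dev; dev = dev->next) {
                printf("bus %s dev %s: vendor 0x%04x product 0x%04x class %d\n",
                       bus->dirname, dev->filename,
                       dev->descriptor.idVendor,
                       dev->descriptor.idProduct,
                       dev->descriptor.bDeviceClass);
            }
        }
        return 0;
    }

As far as I can tell from the ptpcam source, it decides whether something is a camera by looking at the interface class (6, "still image"), not the device-level vendor/product IDs, which might be part of why it skates right past it. Or might not.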
This is BS. There's got to be an easier way.
(Before you suggest it: using two cameras, one for video and one for stills, is a stupid idea that would work terribly.)