Debregeas and friends say it looks as if the ridges and whorls in fingerprints filter mechanical vibrations in a way that best allows nerve endings to sense them. The mechanoreceptors that do this job are called Pacinian corpuscles. They sit at the ends of nerves and are responsible for sensing pressure and vibration. These receptors can sense vibrations over a wide area of skin but are sensitive only to a limited range of frequencies. In fact, biologists have known for some time that Pacinian corpuscles are most sensitive to vibrations at around 250 Hz.
Debregeas and co have investigated this problem using a "cyber finger" that they built in their lab, complete with synthetic fingerprints on the same scale as human ones and a microelectromechanical sensor that measures force with millimetre-scale spatial resolution. They say that fingerprints resonate at certain frequencies and so tend to filter mechanical vibrations. It turns out that their resonant frequency is around 250 Hz.
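To see why a figure like 250 Hz falls out of sliding a ridged fingertip over a texture, here is a minimal back-of-envelope sketch. The ridge spacing (~0.5 mm) and sliding speed (~0.125 m/s) are illustrative values typical of the literature, not figures taken from this paper:

```python
# When a ridged surface slides over fine texture, spatial structure is
# converted into temporal vibration: f = v / wavelength.
def vibration_frequency(scan_speed_m_s, ridge_spacing_m):
    """Vibration frequency produced when ridges of a given spacing
    slide over a surface at a given speed."""
    return scan_speed_m_s / ridge_spacing_m

# Assumed typical values: ridge spacing ~0.5 mm, sliding speed ~0.125 m/s.
f = vibration_frequency(0.125, 0.5e-3)
print(f)  # prints 250.0 -- right in the Pacinian sweet spot
```

The point of the sketch is that plausible everyday values for ridge spacing and exploratory sliding speed land squarely on the frequency the corpuscles are tuned to.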
That means that fingerprints act like signal processors, conditioning the mechanical vibrations so that the Pacinian corpuscles can best interpret them. It's this optimisation process that allows us to sense textures with a spatial resolution far smaller than the distance between Pacinian corpuscles in the skin.
There is a growing awareness that the processing power of the nervous system, including the brain, simply cannot handle the volume of number crunching that has to be done to keep a living body on the road. Instead, it looks increasingly clear that the brain outsources much of this work to the body itself: to the joints, ligaments, muscles, skin etc. Understanding how these materials do all this processing is turning materials science into a branch of computer science. It's even got a name: morphological computing.
The researchers had subjects listen to spoken syllables while hooked up to a device that would simultaneously blow a tiny puff of air onto the skin of their hand or neck. The syllables included "pa" and "ta," which produce a brief puff from the mouth when spoken, and "da" and "ba," which do not produce puffs. They found that when listeners heard "da" or "ba" while a puff of air was blown onto their skin, they perceived the sound as "ta" or "pa."
Dr. Gick said the findings were similar to those from the 1976 study, in which visual cues trumped auditory ones: subjects listened to one syllable but perceived another because they were watching video of mouth movements corresponding to the second syllable. In his study, he said, cues from sensory receptors on the skin trumped the ears as well. "Our skin is doing the hearing for us," he said.
"What's so persuasive about this particular effect," he added, "is that people are picking up on this information that they don't know they are using." That supports the idea that integrating different sensory cues is innate.