“Extracting information out of those 1,000 signals in an analytical way: it’s very, very hard to do,” says Ciocarlie, who developed the system. “I would venture to say that it’s impossible without modern machine learning.”

[GIF: Courtesy of Columbia University]

Machine learning comes into play when they’re calibrating the system. They can stick the finger on a table, point it upward, and use a separate robotic arm to prod the finger at precise spots with a specific amount of pressure. Because they know exactly where the robotic arm is jabbing the finger, they can see how the photodiodes detect light differently at each location. (If you take a look at the GIF above, you can see the system localizing both the touch and its intensity as the red dot swells with more pressure.) Each jab produces a large amount of data, but with machine learning, the system can crunch it all.
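A minimal sketch of what that calibration-and-decoding step might look like, using synthetic stand-in data. The array shapes, the choice of regressor, and every name below are illustrative assumptions, not the Columbia team’s actual pipeline:

```python
# Toy stand-in for the calibration step described above: a probing arm
# touches the finger at known spots with known force, and a model learns to
# map the ~1,000 photodiode signals back to contact location and pressure.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_probes, n_photodiodes = 2000, 1000

# Each calibration probe yields one vector of photodiode readings plus the
# ground-truth (x, y, force) supplied by the probing arm.
signals = rng.normal(size=(n_probes, n_photodiodes))  # raw optical readings
targets = rng.uniform(size=(n_probes, 3))             # (x, y, force) labels

# A small neural network learns the signals -> (location, pressure) mapping.
model = MLPRegressor(hidden_layer_sizes=(256, 64), max_iter=50)
model.fit(signals, targets)

# At run time, a new touch is decoded from its optical signature alone.
x, y, force = model.predict(signals[:1])[0]
print(f"contact at ({x:.2f}, {y:.2f}) with pressure {force:.2f}")
```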

“So that’s the missing piece, the thing that’s really become available to the field really in the last maybe five years or so,” says Ciocarlie. “We now have the machine-learning methods that we can add on top of these many, many optical signals, so that we can decipher the information that’s in there.”

This mimics how we humans learn to wield our own sense of touch. As children, we grab everything we can, banking our memories of how objects feel. Even as adults, our brains continue to catalog the feel of things: how much resistance to expect from a steering wheel when you’re turning left, for example, or how hard to bang a hammer against a nail. “If we were to put you into the body of another person somehow, you would have to relearn all the motor skills,” says Columbia electrical engineer Ioannis Kymissis, who developed the system with Ciocarlie. “And that’s one of the nice things about the plasticity of the brain, right? You can have a stroke, you can knock out half of the brain and still relearn and then function.”

This new robotic finger, though, has its limits. While it can gauge the pressure it’s placing on an object, it misses a lot of other information that we sense through our own hands but often take for granted, like temperature and texture. But interestingly enough, the researchers think they could listen for the robotic finger’s slip, its motion as it slides across a surface.

“When you have slip, there’s a little bit of a singing, if you ever put your ear against the table and run your finger on the table,” says Kymissis. If you’re holding on to, say, a wet glass, the slip might start on a small scale, then “spread” to your hand’s entire contact area as the glass slides out of your grasp. By listening for the characteristic noise of an object slipping out of a robot hand equipped with these new fingers, the machine could correct its grip before the slip spreads across the whole hand.
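Here is one hedged way that kind of slip detection could work in code: watch a vibration signal from the finger for energy in a high-frequency “singing” band, and flag it before the slip spreads. The sampling rate, frequency band, and threshold below are all illustrative assumptions, not values from the research:

```python
# Sketch of slip detection via band power in a vibration signal.
import numpy as np

SAMPLE_RATE = 10_000        # Hz, assumed vibration sampling rate
SLIP_BAND = (500, 2_000)    # Hz, assumed band where slip "sings"
THRESHOLD = 10.0            # detection threshold, tuned empirically in practice

def slip_detected(window: np.ndarray) -> bool:
    """True when mean power in the slip band exceeds the threshold."""
    power = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    in_band = (freqs >= SLIP_BAND[0]) & (freqs <= SLIP_BAND[1])
    return power[in_band].mean() > THRESHOLD

# Synthetic demo: a quiet, stable hold versus a window with a 1 kHz slip "song".
t = np.arange(1024) / SAMPLE_RATE
quiet = 0.01 * np.random.default_rng(0).normal(size=t.size)
slipping = quiet + np.sin(2 * np.pi * 1_000 * t)

print(slip_detected(quiet))     # False: grip is stable
print(slip_detected(slipping))  # True: tighten the grip before slip spreads
```

In a real controller, a loop like this would feed a grip adjustment, nudging the fingers’ force upward the moment the singing starts, well before the object leaves the hand.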