In their experiments with Hamilton and three other subjects, the team found that after the grafts were in place, the nerves that control the thumb interacted with the new muscle just as they would if the person still had their thumb. “We know the intent – in that case, to flex the thumb – just like the nerve and muscle interacted when there was a thumb,” says Cederna.

[Image: Courtesy of University of Michigan Engineering]

Next, the team had the subjects simply imagine a bunch of different hand movements. As they did so, electrodes picked up the signals of their nerves activating, just as the nerves would have done before the person lost their limb. The researchers tracked these recordings to pair particular nerve signals with particular movements. “The anatomy is making these signals very different from one another, and is very finger-specific,” says University of Michigan biomedical engineer Cindy Chestek, who codeveloped the system. One nerve might be highly active for controlling the thumb, for instance, but remain silent when another finger is moving.

All of this information is fed to algorithms, which learn to detect the nerve signals involved in making a fist, for example. The system then translates that collection of signals into commands telling the robotic hand how to scrunch all five fingers together.
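The article doesn’t say which algorithm the team uses, but the mapping it describes, from a pattern of per-nerve activity to a hand command, can be sketched as a toy nearest-template classifier. Everything below is hypothetical and illustrative: the channel count, the gesture names, and the template values are invented for the example, not drawn from the actual system.

```python
import math

# Hypothetical templates: mean activation per recorded nerve channel,
# one template per gesture (values are illustrative, not real data).
TEMPLATES = {
    "fist":        [0.9, 0.8, 0.7, 0.8, 0.9],   # every finger channel firing
    "thumb_flex":  [0.9, 0.1, 0.1, 0.1, 0.1],   # thumb channel dominant
    "index_point": [0.1, 0.9, 0.1, 0.1, 0.1],   # index channel dominant
}

def decode(window):
    """Return the gesture whose template is closest (Euclidean) to this window."""
    return min(TEMPLATES, key=lambda g: math.dist(window, TEMPLATES[g]))

# A window where every channel fires strongly looks like a fist...
print(decode([0.85, 0.75, 0.70, 0.80, 0.88]))  # fist
# ...while a thumb-dominant window decodes to thumb flexion.
print(decode([0.88, 0.15, 0.05, 0.12, 0.10]))  # thumb_flex
```

The “very finger-specific” anatomy Chestek describes is what makes this kind of classification tractable: if each nerve channel is quiet except for its own finger, the activity patterns for different movements sit far apart and are easy to tell apart.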

“With about 15 minutes of training data, we train our algorithm and we start running online,” says Chestek. Then the subjects can try controlling the robotic hand. “And they can do it on the first try,” she says. That’s a big difference from what they may have experienced before with other prosthetics, which require more practice and are less intuitive to use. “The learning is in the algorithms, not in the people,” she says.
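That train-briefly-then-run-online flow can also be sketched in code. Again, this is a hedged stand-in, a nearest-centroid decoder with simulated data, since the article doesn’t describe the team’s actual algorithms: fit one template per imagined movement from a short labeled session, then classify each incoming signal window against those templates.

```python
import math
import random

def fit_templates(windows, labels):
    """Average the labeled training windows into one centroid per gesture."""
    sums, counts = {}, {}
    for w, g in zip(windows, labels):
        acc = sums.setdefault(g, [0.0] * len(w))
        for i, v in enumerate(w):
            acc[i] += v
        counts[g] = counts.get(g, 0) + 1
    return {g: [v / counts[g] for v in acc] for g, acc in sums.items()}

def decode(window, templates):
    """Classify one window as the nearest gesture centroid."""
    return min(templates, key=lambda g: math.dist(window, templates[g]))

# Simulated training session: noisy repetitions of two imagined movements
# (three hypothetical channels; all numbers are made up for illustration).
rng = random.Random(0)
def noisy(base):
    return [v + rng.gauss(0, 0.05) for v in base]

windows = [noisy([0.9, 0.8, 0.8]) for _ in range(50)] + \
          [noisy([0.2, 0.9, 0.1]) for _ in range(50)]
labels = ["fist"] * 50 + ["pinch"] * 50

templates = fit_templates(windows, labels)
print(decode([0.88, 0.79, 0.81], templates))  # fist
```

The design point matches Chestek’s remark that “the learning is in the algorithms, not in the people”: the subject just imagines movements naturally during the short session, and it’s the decoder that adapts to them rather than the other way around.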

The hardware closely mimics the natural movements of a human limb, allowing subjects like Hamilton to pull off fine manipulations like closing zippers. “It was pretty much a sense of having a real hand back, almost, as far as usefulness,” says Hamilton. “It worked very well, very seamlessly.”

The hardware is technically called the DEKA, after DEKA Research and Development, which invented the robotic hand. But the team also fondly calls it Luke’s hand, after the Skywalker. It’s made of a semitranslucent white sheath over a robotic skeleton, and was attached to the participants’ residual limbs using a specially designed socket.

“The very exciting part about this work is that it’s a biological interface,” says biomedical engineer Paul Marasco of the Cleveland Clinic, who wasn’t involved in the research. “They do the amplification biologically, and so once they’ve done that surgery, the interface itself is actually really pretty solid.” That means strong, clear signals that translate into complex manipulations of the robotic arm.