Scientists use AI to help make bionic hands work better for people who need them

SCOTT DETROW, HOST:

Scientists are using artificial intelligence to help bionic limbs act more like natural ones. NPR's Jon Hamilton reports on an experimental hand that shares control with the user to carry out tricky tasks like holding up a Styrofoam coffee cup.

JON HAMILTON, BYLINE: The latest bionic hands can swivel, move individual fingers and manipulate objects. They can also detect electrical signals coming from the muscles that used to control those actions. But Marshall Trout, a researcher at the University of Utah, says most prosthetic hands still aren't very smart.

MARSHALL TROUT: The person has to sit there and really focus on what they're doing. They have to maintain, like, line of sight with whatever it is they're trying to handle, which is really not at all how an intact hand behaves.

HAMILTON: Which is one reason that many people who get a fancy prosthetic hand stop using it. So Trout and a team of scientists set out to make a smarter prosthetic that would act more like a person's own hand.

TROUT: I just know where my coffee cup is and I can reach without having to pay too much attention. And as my hand gets closer, I can kind of feel where it is, and my hand will just naturally squeeze and make contact with it. And that's kind of what we wanted to try to recreate with the system.

HAMILTON: The team turned to artificial intelligence to take on some of these subconscious functions. This meant detecting not just the signal coming from a muscle, but the intention behind it. For example, Trout says, the AI control system learned to detect the tiniest twitch in a muscle that flexes the hand.

TROUT: The moment we detect that small amount of flexion, that's when the machine controller kicks on, saying, oh, I'm trying to grasp something. I'm not just sitting still.
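The triggering idea Trout describes can be sketched roughly as a threshold rule: once a small flexion signal crosses a cutoff, control hands off to a machine grasp controller. This is an illustrative sketch only; the function name, threshold value, and units are invented, not taken from the Utah team's system.

```python
# Hypothetical sketch of the intent detection described above: a tiny
# muscle flexion, once it crosses a threshold, switches the hand from
# idle to machine-assisted grasping. Threshold and names are assumed.

FLEXION_THRESHOLD = 0.05  # normalized EMG activation (invented value)

def control_mode(flexion_level: float) -> str:
    """Return which controller is active for a given flexion signal."""
    if flexion_level >= FLEXION_THRESHOLD:
        return "machine-assisted grasp"  # AI infers the user wants to grab
    return "idle"                        # no grasp intent detected

print(control_mode(0.01))  # below threshold -> idle
print(control_mode(0.08))  # small twitch -> machine assistance kicks in
```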

HAMILTON: To make the approach work, the scientists modified a bionic hand by adding sensors that see and feel. Those allowed the AI system to gauge the distance to an object and assess its shape. Meanwhile, pressure sensors on the fingertips told the user how firmly their prosthetic hand was holding the object. To test the system, Trout's team asked four people missing a natural hand to use the bionic version to drink from a cup with and without help from AI.

TROUT: With machine assistance, they could very reliably pick up these cups and mime drinking a sip of water out of the cup. And then without the machine assistance, the person just crushed it or dropped it every single time.

HAMILTON: The results appear in the journal Nature Communications. And Jacob George, who oversaw the research at the Utah NeuroRobotics Lab, says they address a problem that people often encounter when they use superhuman technology.

JACOB GEORGE: You can make a robotic hand that can do tasks better than a human user. But when you actually give that to someone, they don't like it. You know, they actually hate this.

HAMILTON: Because they feel like it's not a part of them and it's out of their control. George says this artificial intelligence system shares control with the user.

GEORGE: The machine is doing something and the human is doing something, and we're combining those two together.
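One simple way to picture the shared control George describes is as a weighted blend of the user's command and the machine's command. This is a hedged illustration, not the lab's actual controller; the blending weight and function names are assumptions.

```python
# Hypothetical sketch of shared control: the final grip command is a
# weighted combination of the human's EMG-derived command and the
# machine controller's command. alpha is an invented parameter setting
# how much authority the machine has (0 = all human, 1 = all machine).

def blended_command(user_cmd: float, machine_cmd: float,
                    alpha: float = 0.5) -> float:
    """Combine human and machine grip commands (0 = open, 1 = closed)."""
    return alpha * machine_cmd + (1.0 - alpha) * user_cmd

# The user squeezes gently (0.2) while the machine, sensing contact,
# recommends a firmer grip (0.6); equal weighting splits the difference.
print(round(blended_command(0.2, 0.6), 2))  # -> 0.4
```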

HAMILTON: George says that's a critical step toward prosthetic limbs that feel like an extension of the person's own body.

GEORGE: Ultimately, when you create an embodied robotic hand, it becomes a part of that user's experience. It becomes a part of themselves and not just a tool.

HAMILTON: One reason we feel that way about our own hands is that they're controlled in part by reflexes in the brain stem and spinal cord. John Downey of the University of Chicago, who was not involved in the study, says that means the thinking part of our brain doesn't have to worry about the details of every motion.

JOHN DOWNEY: All of our motor control involves reflexes that are subconscious. And so providing robotic imitations of those reflex loops is going to be important.

HAMILTON: But Downey says sometimes a human brain is needed to do things artificial intelligence still can't.

DOWNEY: We can very finely manipulate things like putting a thread through a needle and things like that, which take almost no force at all, all the way up to, you know, grabbing a child and lifting them. The dynamic range on that is far beyond what robots typically handle.

HAMILTON: For now, anyway.

Jon Hamilton, NPR News.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Jon Hamilton
Jon Hamilton is a correspondent for NPR's Science Desk. Currently he focuses on neuroscience and health risks.