Reporting by Matt Roush
Engineering researchers at four U.S. universities are embarking on a four-year project to design a prosthetic arm that amputees can control directly with their brains and that will allow them to feel what they touch.
The research at the University of Michigan, Rice University, Drexel University and the University of Maryland is made possible by a $1.2 million grant from the National Science Foundation’s Human-Centered Computing program.
The team plans to incorporate technology that feeds both tactile information from prosthetic fingertips and grasping-force information from a prosthetic hand to the brain through a robotic exoskeleton and touchpads that vibrate, stretch and squeeze the skin where the prosthesis attaches to the body.
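To make the feedback pipeline concrete, here is a minimal sketch of how grasping-force readings might be split across vibration, skin-stretch and squeeze channels. The function name, the force scale and the channel thresholds are illustrative assumptions, not the team's actual design.

```python
# Hypothetical sketch: mapping a prosthetic grasp-force reading to drive
# levels for three skin-feedback actuators (vibration, stretch, squeeze).
# All names and constants are illustrative assumptions.

def force_to_feedback(grip_force_n, max_force_n=20.0):
    """Map a grasp-force reading (newtons) to normalized drive levels.

    Returns a dict of levels in [0, 1], one per feedback channel.
    """
    # Clamp and normalize the force reading.
    level = max(0.0, min(grip_force_n, max_force_n)) / max_force_n

    # Split the signal across channels: light contact is conveyed by
    # vibration; firmer grasps progressively engage stretch and squeeze.
    return {
        "vibration": min(1.0, level * 2.0),          # saturates at half force
        "stretch": max(0.0, (level - 0.25) / 0.75),  # engages above 25%
        "squeeze": max(0.0, (level - 0.5) / 0.5),    # engages above 50%
    }
```

The staggered thresholds reflect the article's point that a single vibrotactile cue carries little information: spreading the force range across several skin sensations gives the wearer more to integrate.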
A video about the project was produced by Rice University.
The team's approach is non-invasive. In experiments, subjects wear a cap of electrodes to test the technology. Down the road, patients could don a neural decoder on the scalp as easily as a pair of glasses, envisions Brent Gillespie, associate professor of mechanical engineering at the University of Michigan.
“What we’re doing is in some ways rather low-tech, but it’s expected to have a high pay-off because we are integrating control and sensory feedback. We aim to close control loops,” Gillespie said. “Other research groups are working on drawing signals more directly from the nerves of the brain, but they are finding that this approach is fraught with technical hurdles. What we’re doing is a lot closer to being realized and commercialized.”
Gillespie’s co-investigators on the project are Marcia O’Malley at Rice, Patricia Shewokis at Drexel, and José Contreras-Vidal at Maryland. The team has previously demonstrated technology that allowed amputees to correctly perceive and manipulate objects with a prosthetic gripper based on sensory feedback that was provided to their residual limbs.
“The investigators on this grant have already demonstrated that much of this is possible,” O’Malley said. “What remains is to bring all of it — non-invasive neural decoding, direct brain control and haptic sensory feedback — together into one device.”
The new technology is expected to be a big leap over what’s used in existing prosthetic devices, which don’t allow amputees to feel what they touch. Some state-of-the-art prostheses today use force-feedback systems that vibrate like a mobile phone to provide limited information about objects a prosthetic hand is gripping.
“Often, these vibrotactile cues aren’t very helpful,” O’Malley said. “Many times, individuals simply rely on visual feedback — watching their prosthesis grasp an object — to infer whether the object is soft or hard, how tightly they are grasping it and the like. There’s a lot of room for improvement.”
Gillespie will focus on the haptic and sensory feedback aspects of the project. Haptics is the science of applying touch sensation and control to interactions with computers.
“Sensory feedback, especially haptic feedback, is often overlooked, but we think it’s the key to closing the loop between the brain and motorized prosthetic devices,” Gillespie said.
Contreras-Vidal has previously demonstrated technology that allowed test subjects to move a cursor on a computer screen simply by thinking about it. That technology non-invasively reads the user's brain activity through a cap of electrodes that measures electrical signals on the scalp via electroencephalography (EEG). The team plans to combine this EEG information with real-time data about blood-oxygen levels in the user's frontal lobe, gathered with functional near-infrared (fNIR) technology developed by the Optical Brain Imaging Team at Drexel.
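The fusion step described above can be sketched as combining the two sensor streams into a single feature vector for a decoder. This is a toy illustration under stated assumptions: the feature choices (per-channel EEG power, raw fNIR oxygenation values), the shapes, and the linear decoder are all hypothetical, not the project's published method.

```python
# Hypothetical sketch of multimodal fusion: EEG band-power features and
# fNIR blood-oxygenation features concatenated into one feature vector
# for a simple linear decoder. Shapes and names are assumptions.
import numpy as np

def fuse_features(eeg_window, fnir_window):
    """Build one feature vector from simultaneous EEG and fNIR windows.

    eeg_window:  (n_eeg_channels, n_samples) scalp-potential samples
    fnir_window: (n_fnir_channels,) blood-oxygen level estimates
    """
    # Crude EEG feature: per-channel signal power over the window.
    eeg_power = np.mean(eeg_window ** 2, axis=1)

    # Standardize each modality so neither dominates the decoder.
    def z(x):
        return (x - x.mean()) / (x.std() + 1e-9)

    return np.concatenate([z(eeg_power), z(fnir_window)])

def decode(features, weights, bias=0.0):
    """Toy linear decoder producing one continuous control output
    (e.g. a grip-aperture command for the prosthesis)."""
    return float(features @ weights + bias)
```

The point of the sketch is the "closing the loop" idea from the article: the decoder output drives the prosthesis, and the haptic feedback described earlier returns information to the user, who adjusts the brain signals the decoder reads.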
“The idea is to provide a range of sensory feedback that can be integrated by the user, much like able-bodied individuals integrate a variety of tactile, kinesthetic and force information from nerves in their skin and muscles,” Contreras-Vidal said.
Shewokis said, “We want to provide intuitive control over contact tasks, and we’re also interested in strengthening the motor imagery the patients are using as they think about what they want their arm to do. Ideally, this tactile, or haptic, feedback will improve the signal from the EEG and fNIR decoder and make it easier for patients to get their prosthetic arms to do exactly what they want them to do. We are moving toward incorporating the ‘brain-in-the-loop’ for prosthetic use and control.”