National Science Foundation

Discovery
Technology May Soon Turn Thoughts Into Action

NSF funds research to develop potentially life-changing technology for millions of prosthetic-dependent people

University of Michigan engineering researcher Brent Gillespie uses a prototype of a device.


July 27, 2011

"If you can imagine it, you can achieve it." An inspirational saying, yes. But prosthetic limbs that amputees may directly control with their brains and that will allow them to feel what they touch? Science fiction?

"There's nothing fictional about this," said Marcia O'Malley of Rice University.

O'Malley is one of four principal investigators from four U.S. universities funded by the National Science Foundation embarking on a four-year program to design such prosthetics. Brent Gillespie from the University of Michigan, Jose Contreras-Vidal from the University of Maryland and Patricia Shewokis from Drexel University round out the collaborative research team.

"These researchers have developed a unique approach," said NSF Human-centered Computing program manager Ephraim P. Glinert. "The team will build upon their prior work to design and validate non-invasive neural decoders that generate agile control in upper limb prosthetics. This is exciting, and will have broad implications for potentially millions, including many recent war veterans who face daily challenges associated with prosthetic limbs."

O'Malley agrees. "Researchers have already demonstrated that much of this is possible. What remains is to bring all of it--non-invasive neural decoding, direct brain control and haptic sensory feedback--together in one device."

Contreras-Vidal, who is an associate professor of kinesiology, and his team have created a non-invasive, sensor-lined cap that forms a "brain-computer interface" capable of controlling computers, robotic prosthetic limbs, motorized wheelchairs and even digital avatars. The team has published three major papers on the technology over the past 18 months, the latest a just-released study in the Journal of Neurophysiology in which they successfully used EEG brain signals to reconstruct the complex 3-D movements of the ankle, knee and hip joints during human treadmill walking.

In two earlier studies they showed similar results for 3-D hand movement, and they also showed that subjects wearing a brain cap could control a computer cursor with their thoughts.

That cursor-control demonstration was made possible by non-invasively tapping into the user's neural activity with a brain cap of electrodes that reads electrical signals on the scalp via electroencephalography (EEG).
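The decoding step described above, mapping scalp EEG to limb kinematics, can be sketched in a few lines. The ridge-regression decoder below is a generic illustration on synthetic data, not the team's actual model; the channel count, lag window and regularization value are arbitrary choices made for the sketch.

```python
import numpy as np

# Illustrative linear EEG decoder: reconstruct a joint-angle trajectory
# from time-lagged "EEG" features via ridge regression. All dimensions
# and data here are synthetic and hypothetical.
rng = np.random.default_rng(0)

n_samples, n_channels, n_lags = 500, 8, 10
eeg = rng.standard_normal((n_samples, n_channels))

# Design matrix of lagged EEG samples (each row: recent signal history).
rows = n_samples - n_lags
X = np.stack([eeg[t - n_lags:t].ravel() for t in range(n_lags, n_samples)])

# Synthetic "joint angle" driven by a fixed linear mix of the lagged EEG.
true_w = rng.standard_normal(X.shape[1])
y = X @ true_w + 0.1 * rng.standard_normal(rows)

# Ridge regression: w = (X^T X + lambda * I)^-1 X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
y_hat = X @ w

# Correlation between decoded and actual trajectory, a common
# figure of merit in neural decoding studies.
r = np.corrcoef(y, y_hat)[0, 1]
print(f"decoding correlation r = {r:.3f}")
```

With clean synthetic data the fit is nearly perfect; real EEG is far noisier, which is part of what makes non-invasive decoding hard.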

The team plans to combine this EEG information with real-time data about blood-oxygen levels in the user's frontal lobe using functional near-infrared (fNIR) technology developed by Shewokis at Drexel.
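Combining EEG with fNIR blood-oxygen measurements amounts to multimodal feature fusion. The sketch below shows one simple, hypothetical way to do it: sample-and-hold upsampling of the slower fNIR stream followed by feature concatenation. The dimensions and the 10:1 rate ratio are invented for illustration and are not the team's design.

```python
import numpy as np

# Hypothetical multimodal fusion: fast EEG features plus slower fNIR
# blood-oxygenation features, aligned onto one time base and concatenated.
rng = np.random.default_rng(1)

n_samples = 200
eeg_feats = rng.standard_normal((n_samples, 16))   # e.g. band-power features
fnir = rng.standard_normal((n_samples // 10, 4))   # fNIR updates ~10x slower

# Upsample fNIR by sample-and-hold so both streams share a time base.
fnir_held = np.repeat(fnir, 10, axis=0)

# Fused feature vector at each time step: [EEG features | fNIR features],
# ready to feed a single downstream decoder.
fused = np.concatenate([eeg_feats, fnir_held], axis=1)
print(fused.shape)  # (200, 20)
```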

Shewokis said, "We want to provide intuitive control over contact tasks, and we're also interested in strengthening the motor imagery the patients are using as they think about what they want the arm to do. Ideally, this tactile, or haptic, feedback will improve the signal from the EEG and fNIR decoder and make it easier for patients to get their prosthetic arms to do exactly what they want them to do. We are moving toward incorporating the 'brain-in-the loop' for prosthetic use and control."

O'Malley said the new technology is a big leap over what's used in existing prosthetic devices, which don't allow amputees to feel what they touch. Some state-of-the-art prostheses today use force-feedback systems that vibrate--much like the 'vibrate' mode on a mobile phone--to provide limited information about objects a prosthetic hand is gripping.

"Often, these vibrotactile cues aren't very helpful," O'Malley said. "Many times, individuals simply rely on visual feedback--watching their prosthesis grasp an object--to infer whether the object is soft or hard, how tightly they are grasping it and the like. There's a lot of room for improvement."

Gillespie said, "This truly unique team has been given the opportunity to help solve the challenging problem of brain-to-machine interface. I'm excited about our breakthroughs and the promise for future results. We are approaching the dilemma with big respect for the brain/body connection and hope to discover methods to harness the body in new ways.

"Sensory feedback, especially haptic feedback, is often overlooked, but we think it's the key to closing the loop between the brain and motorized prosthetic devices," he said. "These results indicate that we stand a very good chance to help amputees and also help others who may be suffering from motor impairments."

Glinert is hopeful for even broader impacts: "This research will revolutionize the control and interface of upper limb prosthetics. The work will lead to a better understanding of the role of sensory feedback in brain-computer interfaces and will lay the foundation for restoration of motor and sensory function for amputees and individuals with neurological disease."

--  Lisa-Joy Zgorski, National Science Foundation (703) 292-8311 lisajoy@nsf.gov
--  Lee Tune, University of Maryland (301) 405-4679 ltune@umd.edu
--  Jade Boyd, Rice University (713) 348-6778 jadeboyd@rice.edu

Investigators
Jose Contreras-Vidal
Brent Gillespie
Marcia O'Malley
Patricia Shewokis

Related Institutions/Organizations
Drexel University
University of Maryland College Park
University of Michigan Ann Arbor
William Marsh Rice University

Related Awards
#0812569 RI-Small: Cognitive Modeling of Human Motor Skill Acquisition
#1064703 HCC: Medium: Collaborative Research: Improved Control and Sensory Feedback for Neuroprosthetics
#1065027 HCC: Medium: Collaborative Research: Improved Control and Sensory Feedback for Neuroprosthetics
#1065497 HCC: Medium: Collaborative Research: Improved Control and Sensory Feedback for Neuroprosthetics

Years Research Conducted
2009 - 2016

Total Grants
$12,693,153

Related Websites
Video interviews with University of Maryland researchers: http://dl.dropbox.com/u/3305200/EEG%20NAT%20Sound%20PKG%20%28Ext.%20Edition%20with%20CSlide%29.mov
Rice University video of prosthetic arm: http://www.youtube.com/watch?v=z1oIsXqc0U4
AUDIO of Drexel's Patricia Shewokis: http://drexel.edu/univrel/audio/Patricia-Shewokis/
University of Maryland press release: http://www.newsdesk.umd.edu/scitech/release.cfm?ArticleID=2475
Think, Move, Live, University of Maryland, TERP: http://www.terp.umd.edu/4.8/think/
Neural decoding of treadmill walking from non-invasive, electroencephalographic (EEG) signals, Journal of Neurophysiology: http://www.nsf.gov/news/longurl.cfm?id=240

University of Maryland Professor José 'Pepe' Contreras-Vidal wears his non-invasive Brain Cap.

Signals can be used to study the cortical dynamics of walking and develop brain-machine interfaces.

University of Maryland's Brain Cap headset being adjusted for testing.


