Serving Arizona State University Online Since 1995  Current Issue: Thursday, January 25, 2007



Researchers work to develop prosthetic hand controlled by brain

Process involves using virtual reality to make a virtual hand

 by Stephanie Naufel
 published on Thursday, January 25, 2007

ROBOTIC PROSTHETICS: David Meller, a bioengineering Ph.D. student, models the glove used to map the movement of the hand. Next to him is the robotic arm that will eventually be used to test software for a robotic prosthetic hand controlled by the brain.
Lee Kauftheil / THE STATE PRESS

Need a hand? ASU can help.

Researchers with various specialties have teamed up this semester to develop a prosthetic hand controlled entirely by brain signals.

The project, called "Cortical Control of a Dexterous Prosthetic Hand," is funded by a $5 million grant awarded in September.

Now Steve Helms Tillery, an assistant professor of engineering, and his team of researchers are trying to understand sensation and how the brain uses sensory information in hand movement.

The research involves using virtual reality to build a virtual hand. A test subject would wear a glove that monitors the shape and position of his or her hand as the subject reaches out to touch animated objects in the virtual environment.
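The idea of translating glove readings into a virtual hand that can "touch" animated objects can be illustrated with a toy sketch. This is not the team's actual software; the forward-kinematics model, names, and contact threshold below are all hypothetical simplifications for illustration.

```python
# Hypothetical sketch: mapping data-glove joint angles onto a virtual
# finger and detecting contact with an animated object. Illustrative
# only -- not the ASU project's code.
import math

def fingertip_position(palm, joint_angles, segment_lengths):
    """Planar forward kinematics for one finger: accumulate joint
    angles along the finger's segments to place the fingertip."""
    x, y = palm
    total_angle = 0.0
    for angle, length in zip(joint_angles, segment_lengths):
        total_angle += angle
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
    return (x, y)

def touches(fingertip, obj_center, obj_radius):
    """Report contact when the fingertip enters the object's sphere."""
    dx = fingertip[0] - obj_center[0]
    dy = fingertip[1] - obj_center[1]
    return math.hypot(dx, dy) <= obj_radius

# A straight finger (all joints at 0 rad) with segments of length 1
# reaches x = 3, so it contacts an object centered there.
tip = fingertip_position((0.0, 0.0), [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
print(touches(tip, (3.0, 0.0), 0.5))  # True
```

In a real system each contact event like this would trigger the processing of sensory signals the article describes.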

"We'll know when you use this hand to touch the animated object, and we can process the sensory signals, as if you were actually touching it," Helms Tillery said. "Because the hand is so complicated, what we really want to understand is how these sensory signals work when you touch things."

In the next phase of the research, a robot will present the subject with real objects for him or her to touch.

Researchers then hope to compare the signals associated with touching real objects with those involved in touching virtual objects.

"We can look at all the signals that are interrelated and then try to tease out the signals that are related to touching objects," Helms Tillery said.
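One way to picture "teasing out" touch-related signals is to compare each recorded channel's activity during touch against its activity at rest. The sketch below is a hypothetical simplification: a real analysis would use proper statistics, and the function name and threshold are invented for illustration.

```python
# Hypothetical sketch: flag channels whose mean activity differs
# between "touch" and "rest" trials. Illustrative only.

def touch_related_channels(touch_trials, rest_trials, threshold=1.0):
    """Return indices of channels whose mean activity during touch
    differs from rest by more than `threshold`."""
    n_channels = len(touch_trials[0])
    flagged = []
    for ch in range(n_channels):
        touch_mean = sum(t[ch] for t in touch_trials) / len(touch_trials)
        rest_mean = sum(t[ch] for t in rest_trials) / len(rest_trials)
        if abs(touch_mean - rest_mean) > threshold:
            flagged.append(ch)
    return flagged

# Channel 1 responds strongly to touch; channels 0 and 2 do not.
touch = [[0.1, 5.0, 0.2], [0.0, 4.5, 0.1]]
rest  = [[0.1, 1.0, 0.2], [0.2, 1.2, 0.3]]
print(touch_related_channels(touch, rest))  # [1]
```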

Ultimately, the researchers would like to incorporate these signals in building a state-of-the-art prosthetic hand.

Bioengineering junior Kimia Seyedmadani will be working on designing the real objects that the robot will present to the test subject.

"We are seeing what [the subject] feels about the surface," Seyedmadani said. "What do you feel when something is hot? I know what it looks like when you burn your hand, but what do you feel?"

Seyedmadani is doing the research through the Fulton Undergraduate Research Initiative, a program that allows students to do undergraduate research under the guidance of a mentor.

Researchers from many disciplines are working on the project, Helms Tillery said.

"Because of the nature of the project, it requires a lot of areas of expertise, and so we have everything from roboticists and mathematicians, people who do what's called psychophysics ... and people who study brain signals," he said.

The project is a nationwide collaboration through the National Institutes of Health Bioengineering Partnership. The University of Pittsburgh and Carnegie Mellon University are also involved, among others.

Reach the reporter at: stephanie.naufel@asu.edu.





Copyright 2001-06, ASU Web Devil. All rights reserved. No reprints without permission.

Online Editor In Chief: Jolie McCullough | Online Adviser: Jason Manning | Technical Contact: Jason Wulf

Contact Info | Privacy Policy