Jul 27, 2011 14:00 GMT  ·  By
University of Michigan expert Brent Gillespie uses a prototype of a device that provides feedback to the wearer's arm while objects are moved with a prosthetic “hand”

Scientists in the United States have just kicked off a four-year research project that aims to develop and refine a working brain-computer interface (BCI). The device would enable those who are paralyzed, or otherwise incapacitated, to operate computers and wheelchairs using only their brains.

A BCI device has a very simple operating principle that is nevertheless extremely difficult to put into practice. A cap on the patient's head records the neural impulses flowing through the brain and relays them to a computer.

Specialized software then detects which area of the brain was activated and which muscle or movement that region controls. It translates this activity into commands for robotic arms, wheelchairs, or cursors on a computer screen, carrying out the action it interprets the user to intend.
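The decoding step described above can be illustrated with a toy sketch. Everything here is hypothetical and greatly simplified: the channel-to-command map, the sampling rate, and the use of band power in the 8–12 Hz range (a rhythm commonly used in motor-imagery BCIs) are all assumptions for illustration, not the project's actual method.

```python
import numpy as np

# Hypothetical map from electrode channels to commands (assumed for illustration).
COMMANDS = {0: "move_cursor_left", 1: "move_cursor_right", 2: "wheelchair_forward"}

def band_power(signal, fs, low=8.0, high=12.0):
    """Mean spectral power in a frequency band of one channel's recording."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def decode(samples, fs=256):
    """Pick the command whose channel shows the strongest band power."""
    powers = [band_power(ch, fs) for ch in samples]
    return COMMANDS[int(np.argmax(powers))]

# Simulated one-second recording from three electrodes: channel 1 carries
# a strong 10 Hz oscillation on top of noise, so its command is selected.
rng = np.random.default_rng(0)
t = np.arange(256) / 256.0
samples = rng.normal(0.0, 0.1, (3, 256))
samples[1] += np.sin(2 * np.pi * 10 * t)
print(decode(samples))  # → move_cursor_right
```

Real systems replace the single band-power comparison with trained classifiers and artifact rejection, but the pipeline shape (record, extract features, map to a command) is the same.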

Scientists from Rice University, the University of Michigan, Drexel University and the University of Maryland will conduct the investigation with a $1.2 million grant provided by the Human-Centered Computing program of the US National Science Foundation (NSF).

Several small-scale demonstrations have already proven that the technology is feasible. What remains is to reduce uncertainty and false readings, lower the error rate, and improve the BCI device's ability to capture and interpret the electrical impulses flowing through neurons.

“There's nothing fictional about this. The investigators on this grant have already demonstrated that much of this is possible,” explains Marcia O'Malley, a research scientist at Rice University.

“What remains is to bring all of it – noninvasive neural decoding, direct brain control and tactile sensory feedback – together into one device,” adds the expert, who is also a co-principal investigator of the new research.

The other three co-investigators are Brent Gillespie from the University of Michigan, Patricia Shewokis from Drexel University, and José Contreras-Vidal from the University of Maryland. All of them have extensive experience working with this technology and with amputees.

“Neuroprosthetic control is an important part of our project, but an equally important challenge is providing sensory feedback for contact tasks that are performed with the prosthesis,” Gillespie reveals.

In addition to controlling objects, patients could soon be able to feel them as well. The group plans to incorporate feedback technologies into its BCI that would convey tactile information to wearers by tugging, stretching or pressing on the patch of skin where the robotic prosthetic is attached.
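One way to picture this feedback path is a simple mapping from the force sensed at a prosthetic fingertip to the displacement of a skin-stretch actuator on the residual limb. The sensor range, actuator travel, and linear scaling below are all assumptions made for this sketch; the researchers' actual mapping is not described in the article.

```python
# Hypothetical sketch: convert a prosthetic fingertip's sensed contact force
# into the displacement of a skin-stretch actuator worn on the limb.

MAX_FORCE_N = 20.0    # assumed force-sensor range, in newtons
MAX_STRETCH_MM = 4.0  # assumed actuator travel, in millimetres

def force_to_stretch(force_n: float) -> float:
    """Linearly scale contact force into actuator displacement, clamped to range."""
    force_n = max(0.0, min(force_n, MAX_FORCE_N))
    return force_n / MAX_FORCE_N * MAX_STRETCH_MM

# Light, medium, and out-of-range grips map to proportional skin stretch.
for f in (0.0, 5.0, 25.0):
    print(f"{f:.1f} N -> {force_to_stretch(f):.2f} mm")
```

A linear, clamped mapping is only the simplest choice; the quotes below suggest the team is after a richer mix of tactile, kinesthetic and force cues integrated by the user.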

“The idea is to provide a range of sensory feedback that can be integrated by the user, much like able-bodied individuals integrate a variety of tactile, kinesthetic and force information from nerves in their skin and muscles,” Contreras-Vidal concludes.