Feb 5, 2011 00:01 GMT  ·  By
Using gestures to give commands to robots and computers will help surgeons perform shorter, safer operations on their patients

In a bid to shorten surgeries and reduce current post-op infection rates, experts at Purdue University have developed a new integrated system that allows surgeons to control robots in the OR with a simple gesture of the hand or arm.

The gestures will be analyzed by a complex system, featuring both a hand-gesture recognition program and a robotic nurse, which will handle the instruments. This will reduce the risk of contamination.

A surgeon could, for example, use a predefined gesture to order a computer to display an image of the patient on screen. The image could be an X-ray, a snapshot from a CT scan, or a live feed of what the doctor is doing at that very moment.

The new technology, aptly called “vision-based hand gesture recognition,” could also be used to coordinate relief efforts following a major disaster or terrorist attack. A wide array of other uses could be unlocked as the technology matures.

This approach to surgery is the brainchild of Purdue University assistant professor of industrial engineering Juan Pablo Wachs. It is “a concept Tom Cruise demonstrated vividly in the film 'Minority Report',” he explains.

At this point, surgeons access medical images or other patient records by querying for them with a computer keyboard and mouse. This means they have to step away from the operating table, which lengthens the procedure and also raises the risk of bacterial infection.

At the heart of the new system is an algorithm that recognizes hand gestures, translates them into a series of relevant commands, and then sends those commands to computers or robots. But the team admits that there are a multitude of issues to work out first.
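The pipeline the researchers describe — recognize a gesture, translate it into a command, route the command to a computer or robot — can be sketched in miniature. The gesture names, commands, and targets below are purely illustrative assumptions, not details of the published Purdue system:

```python
# Hypothetical sketch of a gesture-to-command pipeline.
# Gesture labels and commands here are invented for illustration;
# the real system's vocabulary is not specified in the article.

GESTURE_COMMANDS = {
    "swipe_left": "display.previous_image",
    "swipe_right": "display.next_image",
    "open_palm": "display.zoom_in",
    "closed_fist": "display.zoom_out",
    "point_up": "robot.pass_instrument",
}

def translate(gesture: str) -> str:
    """Map a recognized hand gesture to a system command string."""
    if gesture not in GESTURE_COMMANDS:
        raise ValueError(f"unrecognized gesture: {gesture}")
    return GESTURE_COMMANDS[gesture]

def dispatch(gesture: str) -> str:
    """Translate a gesture, then route the command to its target
    (the image display or the robotic scrub nurse)."""
    command = translate(gesture)
    target, _, action = command.partition(".")
    return f"sent '{action}' to {target}"
```

A real implementation would sit behind a vision-based recognizer producing the gesture labels; the hard part, as Wachs notes below, is choosing a gesture vocabulary that is intuitive across different surgeons.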

“One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions,” Wachs explains.

“You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use,” he adds.

The new work was conducted by experts at Purdue, the Naval Postgraduate School in Monterey, California, and the Ben-Gurion University of the Negev, Israel.

Details of the robotic system and its algorithm will appear in the February issue of the journal Communications of the ACM (Association for Computing Machinery).

“Eventually we also want to integrate voice recognition, but the biggest challenges are in gesture recognition. Much is already known about voice recognition,” the team leader concludes.