Aug 24, 2010 09:19 GMT  ·  By

Microsoft Surface can be transformed into a fully-fledged robot command and control center delivering a unique NUI (natural user interface) experience, illustrative of the innovations made possible through the combination of the company’s technologies.

University of Massachusetts Lowell Robotics Lab member Mark Micire defended his doctoral thesis, titled "Multi-Touch Interaction for Robot Command and Control," at the start of this week.

As the title implies, Micire explored a scenario in which Microsoft Surface and Microsoft Robotics Developer Studio are used in tandem to turn Microsoft’s tabletop computer into a robot command and control center.

“To boil it down, his work has been to use the technology in Microsoft Surface and Microsoft Robotics Developer Studio to aid in controlling individual and groups of robots working on emergency response in the field,” Microsoft’s Eric Havir stated.

“This allows not only direct control of the robots through the natural user interface, but also uses the display to provide feedback to the command staff about the environment and performance of the tasks.”

Watch the video embedded at the bottom of this article to get an idea of what multi-touch interaction for robot command and control involves.

Micire noted that the inspiration for the project came from the technological shortcomings exposed during the response to Hurricane Katrina in 2005.

During the response efforts following the hurricane that hit New Orleans five years ago, response groups on the ground were reduced to using hand-drawn paper maps, in a digital era in which satellite photography and multi-touch are pervasive.

At the same time, Micire noted that handheld digital photography and robot cameras were available only to on-site operators rather than at the command center.

The multi-touch system for robot command and control is designed to resolve precisely these issues.

“Using large, commercially available tabletop multi-touch devices, we have designed and tested software for controlling individual robots as well as groups of robots for emergency response in a natural and easy-to-learn interface,” Micire noted.

“Ease of learning was achieved through a detailed investigation of the natural gestures expressed by users while instructing robots to perform navigation and tasks. We also developed a virtual joystick interface that abandons traditional visual affordances and uses human bio-mechanics to increase performance and provide ergonomic benefits over traditional physical joystick controllers.”
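Micire’s actual implementation is not described in detail here, but the core idea of a virtual joystick on a multi-touch surface can be sketched as mapping the displacement between where a finger lands and where it currently rests to drive commands for the robot. The function below is a hypothetical illustration of that mapping, not Micire’s code; all names, parameter values, and the deadzone behavior are assumptions:

```python
import math

def touch_to_drive(origin, current, max_dist=100.0,
                   max_linear=1.0, max_angular=1.5, deadzone=0.1):
    """Illustrative mapping from a touch displacement to drive commands.

    origin:  (x, y) pixel position where the finger first landed
             (this point becomes the joystick center)
    current: (x, y) current finger position, same units

    Returns (linear_velocity, angular_velocity), both clamped
    to the configured maxima.
    """
    dx = current[0] - origin[0]
    dy = current[1] - origin[1]
    # Normalize displacement to [-1, 1] relative to the joystick radius.
    nx = max(-1.0, min(1.0, dx / max_dist))
    ny = max(-1.0, min(1.0, dy / max_dist))
    # Ignore tiny jitters near the center so the robot holds still.
    if math.hypot(nx, ny) < deadzone:
        return 0.0, 0.0
    # Screen y grows downward, so pushing "up" drives the robot forward.
    linear = -ny * max_linear
    # Horizontal displacement turns the robot left or right.
    angular = -nx * max_angular
    return linear, angular
```

Because the joystick center is wherever the finger lands, the control can follow the operator’s hand anywhere on the tabletop, which is one way a touch interface can trade fixed visual affordances for body-relative control.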

“The research provides lessons learned and recommendations for future multi-touch command and control and proposes a significant shift in the way that interfaces should be designed for this class of multi-touch hardware,” he explained.

The Microsoft Robotics Developer Studio 2008 R3 is available for download here.