A team of scientists at the Massachusetts Institute of Technology (MIT) is developing an imaging system that will enable unmanned aerial vehicles (UAVs) aboard aircraft carriers to interpret and follow the commands of flight controllers directly, by recognizing their hand motions and gestures.
PhD student Yale Song, his advisor Randall Davis, and researcher David Demirdjian are leading the effort, which has already yielded an early version of the future system. The work builds on the fact that flight controllers already use an established vocabulary of hand gestures when directing aircraft.
In the future, the vision-based gesture recognition system being developed at MIT could be adapted for other uses as well, such as applications in which users interact with their computers directly through gestures.
The basis of the entire system is a 3D camera paired with recognition software the team is developing.
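To make the idea concrete, here is a toy sketch (not the MIT team's actual software) of how depth-camera output might be turned into a gesture label. It assumes the camera provides a short track of 3D wrist positions per frame; the gesture names and the motion threshold are illustrative assumptions.

```python
def classify_gesture(track):
    """Classify a coarse gesture from a list of (x, y, z) wrist
    positions in metres, sampled over a short time window.
    This is an illustrative heuristic, not the MIT system."""
    if len(track) < 2:
        return "none"
    # Net displacement of the wrist between the first and last frame
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    dz = track[-1][2] - track[0][2]
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if max(ax, ay, az) < 0.05:  # assumed threshold: barely moving -> hold
        return "hold"
    # Label by the dominant axis of motion
    if ay >= ax and ay >= az:
        return "raise" if dy > 0 else "lower"
    if ax >= az:
        return "sweep_right" if dx > 0 else "sweep_left"
    return "push_forward" if dz > 0 else "pull_back"

# Example: a wrist moving steadily upward over five frames
print(classify_gesture([(0, 0.0, 0), (0, 0.1, 0), (0, 0.2, 0),
                        (0, 0.3, 0), (0, 0.4, 0)]))  # → raise
```

A real system would classify whole-body pose sequences statistically rather than thresholding a single joint, but the pipeline shape is the same: depth frames in, gesture label out.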