Relies on a short-range depth camera instead of an infrared sensor

Oct 19, 2011 00:01 GMT

PhD student Chris Harrison and his collaborators have developed a new technology that allows for “graphical, interactive, multitouch input on arbitrary, everyday surfaces.”

Harrison's idea is not so much a touch interface as a gesture recognition technology, since it isn't actually the touched surface that picks up the input.

Instead, the system relies on a short-range depth camera, rather than an infrared sensor, to gauge the viewing angle and other characteristics of nearby surfaces.

Once that is done, the OmniTouch system, as it is called, can interpret touch inputs, even pinch-to-zoom, regardless of what the surface is like (uneven, hairy, etc.).
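For a rough sense of how depth-based touch sensing can work, here is a minimal, hypothetical sketch in Python: it compares a live depth frame against a learned model of the bare surface and treats pixels hovering within a small distance of that surface as touches. This is purely illustrative and not Harrison's actual OmniTouch pipeline; the frame format, threshold values, and function names are all assumptions.

```python
# Illustrative sketch only: a toy version of depth-based touch sensing,
# NOT the actual OmniTouch pipeline. Assumes each depth frame is a NumPy
# array of per-pixel distances (in millimetres) from the camera.
import numpy as np

TOUCH_THRESHOLD_MM = 15  # hypothetical: how close to the surface counts as a touch


def build_surface_model(empty_frames):
    """Average a few frames of the bare surface to learn its depth."""
    return np.mean(np.stack(empty_frames), axis=0)


def detect_touch_points(frame, surface_depth):
    """Return pixel coordinates where something sits just above the surface."""
    # Pixels noticeably closer to the camera than the surface belong to the hand.
    height_above_surface = surface_depth - frame
    hand_mask = height_above_surface > 5  # ignore sensor noise near the surface
    touching = hand_mask & (height_above_surface < TOUCH_THRESHOLD_MM)
    ys, xs = np.nonzero(touching)
    return list(zip(xs.tolist(), ys.tolist()))
```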

Check out the video above for a demonstration while we start imagining how this thing could be combined with the transparent window displays Samsung promised.