You're looking at what could very well be the best touch-based interface yet

May 23, 2014 06:45 GMT  ·  By

Touchscreens are pretty standard at this point, but even the best software interfaces and gesture tracking programs aren't exactly perfect for manipulating three-dimensional graphical models. For that, some special expertise was needed.

As it happens, researchers at the Ishikawa Watanabe Laboratory in Japan were able to provide the required expertise.

Sure, they teamed up with holographic display company zSpace, but they did what they set out to do: release the most responsive gesture-based interface yet.

The goal was to increase the “immersiveness” of interactive displays by allowing the “screen” to interpret certain touch maneuvers in different ways.

Say you place two fingers on a touch panel and spin the “pinching grip” around. A conventional display would interpret that as an attempt to spin the whole view around, or a single photo if you're browsing a virtual album.

With the new technology from Japan, you can “spin” things in different ways because the interface can see how your whole hand moves, instead of just guessing your intentions from where your fingertips come in contact with the screen.
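To make the limitation concrete: from fingertip contacts alone, all a conventional touchscreen can compute is the change in angle of the line joining the two touch points, which says nothing about what the rest of the hand is doing. A minimal sketch (the function names are ours, purely for illustration):

```python
import math

def two_finger_angle(p1, p2):
    """Angle of the line through two fingertip contacts, in radians."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0])

def rotation_delta(prev_pts, curr_pts):
    """Change in that angle between frames: the only rotation cue a
    screen-only interface has when interpreting a 'pinch spin'."""
    return two_finger_angle(*curr_pts) - two_finger_angle(*prev_pts)

# Two fingertips rotating 90 degrees about their midpoint:
prev = ((100, 100), (200, 100))
curr = ((150, 50), (150, 150))
print(math.degrees(rotation_delta(prev, curr)))  # 90.0
```

Whether that 90-degree twist meant “rotate the photo,” “orbit the 3D camera,” or something else entirely is exactly the ambiguity that tracking the full hand from overhead is meant to resolve.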

Sure, the researchers had to use two high-speed 500 fps cameras to pull that off (installed overhead), so it's not a totally practical setup, at least not if you want it mobile and compatible with tablets and laptops.

It should do well enough in a design lab though, especially since holographic displays are still a ways off and the world kind of needs a stepping stone.

We imagine that this new gestural interface would make a good pair with the quasi-holographic projection technology that MIT unveiled recently.

You don't even have to be limited to your hands either. The folks at zSpace and the Ishikawa Watanabe Laboratory included support for a stylus, adding one more way to control the camera perspective.

And since there is off-screen command support, you could, in theory, work together with one or more of your acquaintances on a project.

The folks at the Ishikawa Watanabe Laboratory have published a short video in which they demo their “high-speed 3D hand gesture interface.” You can see it below.

Clearly, there is room for improvement, since the virtual objects don't follow the touch-based nudges instantly, but it's still the smoothest solution of this sort yet.

Besides, in a human-machine interaction system that relies on overhead cameras, some latency is expected. After all, the video feed from the cameras needs to be processed, sent via cables to the main unit and processed again, then merged with the state of the touchscreen, so to speak.
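To put rough numbers on that pipeline: at 500 fps, each frame alone takes 2 ms to capture, and every further stage adds to the end-to-end delay. A back-of-the-envelope budget, where the per-stage figures are illustrative assumptions rather than measurements from the Ishikawa Watanabe system:

```python
# Illustrative latency budget for an overhead-camera gesture pipeline.
# All stage durations except the frame time are assumed values,
# chosen only to show how the delays stack up.
stages_ms = {
    "camera exposure + readout (1 frame @ 500 fps)": 1000 / 500,  # 2.0 ms
    "transfer to host over cable": 1.0,
    "hand-pose processing": 5.0,
    "merge with touchscreen state + render": 8.0,
}
total = sum(stages_ms.values())
for name, ms in stages_ms.items():
    print(f"{name}: {ms:.1f} ms")
print(f"end-to-end: {total:.1f} ms")  # 16.0 ms
```

Even with generous per-stage assumptions, the total stays well under a typical 16.7 ms display refresh, which is why a high-speed camera setup can still feel responsive despite all the intermediate steps.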