Image search, gaming and robotics can all benefit greatly

Oct 24, 2012 07:49 GMT  ·  By

Artificial intelligence and empathic technology are a long way off, but there is no doubt that steps in their direction are being taken constantly. The Facial Expression Estimation technology is one such step.

Developed by Omron, the technology is part of the OKAO Vision collection and relies on a statistical classification method, as well as Omron's 3D model-fitting technology.

Depending on how the parts of the face are positioned in a picture, the method can identify seven facial expressions or moods (for now). These are anger, sadness, fear, disgust, surprise, happiness and neutral.
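To make the idea concrete, here is a minimal sketch of how a statistical classifier might map facial geometry to one of the seven labels. Everything here is an assumption for illustration: the three geometric features, the centroid values, and the nearest-centroid method are invented, not Omron's actual model.

```python
import math

# Hypothetical features derived from fitted face-model points:
# (mouth_curve, eye_openness, brow_raise), each normalized to [0, 1].
# The centroids below are illustrative placeholders, not real training data.
CENTROIDS = {
    "anger":     (0.2, 0.6, 0.1),
    "sadness":   (0.1, 0.3, 0.3),
    "fear":      (0.3, 0.8, 0.7),
    "disgust":   (0.2, 0.4, 0.2),
    "surprise":  (0.5, 0.9, 0.9),
    "happiness": (0.9, 0.5, 0.4),
    "neutral":   (0.5, 0.5, 0.5),
}

def classify_expression(features):
    """Return the expression whose centroid is nearest to the feature vector."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify_expression((0.9, 0.5, 0.4)))  # -> happiness
```

A production system would of course learn its decision boundaries from labeled face data rather than hand-picked centroids, but the shape of the problem is the same: reduce the fitted 3D model to a feature vector, then classify.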

Image search is one of the obvious uses. Users would be able to find photos of people showing whatever emotion they are interested in, or issue a command that automatically tags a photo with who is in it and what mood they display.

In robotics, personal assistants could gain the ability to react based on the feelings of their owners, provided said owners let the feelings show.

Games could reap many benefits too. When playing a console game, for instance, especially with motion controllers, the player's mood could be detected automatically and, based on it, the game settings could be adjusted on the fly.
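A sketch of what such on-the-fly tuning could look like, assuming the classifier above feeds a mood label into the game loop. The mood-to-adjustment table and the settings fields are hypothetical examples, not part of any real engine:

```python
# Hypothetical tuning rules: which settings to nudge for a detected mood.
# Unlisted moods (e.g. "neutral") leave the settings untouched.
ADJUSTMENTS = {
    "anger":     {"difficulty": -1, "music": "calm"},
    "fear":      {"difficulty": -1, "music": "soothing"},
    "sadness":   {"difficulty":  0, "music": "upbeat"},
    "happiness": {"difficulty": +1, "music": "energetic"},
}

def adjust_settings(settings, mood):
    """Return a copy of the settings tuned for the detected mood."""
    delta = ADJUSTMENTS.get(mood, {})
    tuned = dict(settings)
    if "difficulty" in delta:
        # Clamp difficulty to an assumed 1..10 scale.
        tuned["difficulty"] = max(1, min(10, tuned["difficulty"] + delta["difficulty"]))
    if "music" in delta:
        tuned["music"] = delta["music"]
    return tuned

print(adjust_settings({"difficulty": 5, "music": "default"}, "anger"))
# -> {'difficulty': 4, 'music': 'calm'}
```

In practice a game would smooth the mood signal over time rather than react to every frame, so a single grimace doesn't drop the difficulty.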

Game avatars could mimic the expressions of players in real time too, perhaps even causing the virtual environment and NPCs to react in certain ways.

A game world quest giver, for instance, could be reluctant to approach the main character and dump a quest on them when they are visibly enraged or frustrated over something.

MMOs could become even more interactive than they already are. Currently, it takes special “emotes” and chat commands to make an avatar express anything, which is not exactly something players are willing to bother with all the time.

All that is a long way off, though. Only photos can be analyzed at the moment, and they need to be at least 64 pixels in size.