Electrodes on the muscles could facilitate interactions

Oct 28, 2009 09:36 GMT  ·  By

Scientists have long been trying to move past the limitations that standard keyboards and mice impose on computer users, creating wireless gesture controllers and touch screens in the process. Now, experts at Microsoft, the University of Washington in Seattle (UWS), and the University of Toronto (UT) are working on an entirely different human-computer interface, one that taps into a person's muscles to control objects on-screen wirelessly, Technology Review reports.

The gestural-interaction device would essentially act in the same way brain caps do. Those devices, which are beginning to make considerable headway in the electronics industry, are sensor-laden hats of sorts that capture brain-wave activity and translate it into signals that can control objects on-screen. Similarly, the new system relies on electrodes placed on the forearms to reconstruct, in essence, an artificial image of the arm inside the virtual environment the user is trying to control.

The electrical signals that the electrodes read are then associated with specific hand gestures, such as putting two fingers together, grabbing something, or squeezing something harder than usual. All of this information can then be translated into the virtual environment. This line of study, the researchers say, could soon be applied to entertainment, for instance playing the popular game Guitar Hero without the plastic, guitar-shaped controller, or changing songs on an MP3 player.
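The last step described above, turning a recognized gesture into an application command such as skipping a song, amounts to a simple dispatch table. The sketch below illustrates the idea; the gesture and command names are invented for illustration and are not from the research.

```python
# Hypothetical mapping from recognized forearm gestures to media-player
# commands. Names are illustrative only.
COMMANDS = {
    "pinch": "play_pause",
    "grab": "next_track",
    "squeeze": "volume_up",
}

def dispatch(gesture):
    # EMG classification is noisy, so unrecognized gestures are ignored
    # rather than raising an error.
    return COMMANDS.get(gesture, "noop")

print(dispatch("grab"))  # → next_track
```

In a real system the classifier's output would feed this table continuously, with debouncing so that one sustained gesture does not fire the same command repeatedly.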

Microsoft researcher Desney Tan says the team is “going after healthy consumers who want richer input modalities.” The main idea was to make the system as unobtrusive as possible, as well as light and easy to wear. According to the team, the electrodes are easy to place directly on the skin, under sleeves. They function accurately, and, after a brief period of associating gestures with commands, they can be readily used in a number of applications. The researchers also mention that the learning process is handled by standard machine-learning algorithms, which should help keep the cost of the new sensors low.
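The "brief period of associating gestures with commands" is a calibration step: the user performs each gesture a few times, and a standard classifier learns what the electrode readings look like for each one. The sketch below uses a nearest-centroid classifier as a stand-in for whatever algorithm the team actually used; the feature values and gesture names are invented for illustration.

```python
# Hypothetical sketch: learning to classify forearm-EMG feature vectors
# into gestures with a nearest-centroid rule. All data are made up.
from statistics import mean

def train(samples):
    """samples: {gesture: [feature_vector, ...]} -> per-gesture centroids."""
    return {
        gesture: [mean(col) for col in zip(*vectors)]
        for gesture, vectors in samples.items()
    }

def classify(centroids, features):
    """Return the gesture whose centroid is closest (squared Euclidean)."""
    def dist(gesture):
        return sum((a - b) ** 2 for a, b in zip(features, centroids[gesture]))
    return min(centroids, key=dist)

# Invented calibration data: each vector holds (mean signal amplitude,
# high-frequency band energy) from two forearm electrodes.
calibration = {
    "pinch":   [[0.20, 0.10], [0.25, 0.12]],
    "squeeze": [[0.90, 0.70], [0.85, 0.75]],
}

model = train(calibration)
print(classify(model, [0.88, 0.72]))  # a squeeze-like reading → squeeze
```

The appeal of an off-the-shelf method like this is exactly what the article notes: the intelligence lives in software, so the electrodes themselves can stay simple and cheap.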