Jun 27, 2011 11:55 GMT  ·  By

A team of experts is currently working with funds from the US National Science Foundation (NSF) to develop a new robotic language. With it, robots will be able to communicate amongst themselves more efficiently, which should ultimately increase their productivity and overall performance.

Interestingly, the language is modeled on the way humans learn to talk while they are very young. Researchers are emulating the way children learn to put thoughts into words, and say that robots will soon be able to do the same.

Linguist Jeffrey Heinz and mechanical engineer Bert Tanner are the primary architects of the new language, which they say could be ready soon. The effort has already been underway for several years.

The collaboration began with a meeting at the University of Delaware. Recently hired faculty members in arts and sciences were called to the dean's office and asked to give a one-minute slide presentation of the concepts they were working on.

“That's how we became aware of what each other was doing. We started discussing ideas about how we could collaborate, and the NSF project came as a result of that,” Tanner explains.

“Once we started seeing things aligning with each other and clicking together, we thought, 'Oh, maybe we really have something here that the world needs to know about',” he goes on to say.

The language is intended for robots that will work in dangerous situations and emergencies. It will allow each machine to communicate better with the others, and to coordinate activities in real time.

This ability will essentially eliminate the time in which robots stand idly by while waiting for humans to issue commands. An increased degree of autonomy is preferable in emergencies.

“We would like to make the robots adaptive – learn about their environment and reconfigure themselves based on the knowledge they acquire,” Tanner says. A milestone will be achieved when a robot looks at what another is doing and realizes that it should be doing the same thing.

“The robots will be designed to do different tasks. We have eyes that see, ears that hear and we have fingers that touch. We don’t have a 'universal sensory organ.' Likewise in the robotics world, we're not going to design a universal robot that's going to be able to do anything and everything,” Heinz adds.

When specialized robots are put together, each of them will know not only what it can do, but also what every other machine around it is capable of. Via uninterrupted communication, the most suitable set of capabilities can be deployed at any given point.

Aware of their own limitations, robots would surrender a task to machines that are uniquely equipped to perform it. In the meantime, they could turn to another task, or remain on standby in case their assistance is needed.
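
To make the idea concrete, here is a minimal sketch of capability-aware task handoff. The class names, capability labels, and selection rule are illustrative assumptions, not details of the team's actual design, which the article does not describe:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names, capability labels, and the
# selection rule are assumptions, not the project's actual protocol.

@dataclass
class Robot:
    name: str
    capabilities: set = field(default_factory=set)  # e.g. {"grip", "fly"}
    busy: bool = False

    def can_do(self, task: str) -> bool:
        return task in self.capabilities and not self.busy

def assign(task: str, robots: list) -> Robot | None:
    """Hand the task to the first idle robot equipped for it.

    A robot aware of its own limits "surrenders" the task simply by
    not being selected; it stays available for other work.
    """
    for robot in robots:
        if robot.can_do(task):
            robot.busy = True
            return robot
    return None  # no suitable robot: the task waits

team = [
    Robot("crawler", {"grip", "dig"}),
    Robot("drone", {"fly", "scan"}),
]
print(assign("scan", team).name)  # -> drone
```

In a sketch like this, each robot only needs to broadcast its capability set and its busy state; the group can then route any incoming task without a human in the loop.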

“If the robot has something in its claw, it cannot possibly hold another thing at the same time. It has to lay it down before it picks up something else. We are trying to teach a robot these kinds of constraints, and this is where some of the techniques we find in linguistics come into play,” Tanner concludes.
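
The gripper constraint Tanner describes can be captured with the same machinery linguists use for word patterns: a finite-state model that rules out illegal sequences. Below is a minimal sketch of that idea; the state names and actions are hypothetical, chosen only to illustrate how a second "pick up" before a "put down" gets rejected:

```python
# Hypothetical illustration: treat action sequences like words, and use a
# finite-state acceptor to forbid illegal ones. States and action names
# are assumptions, not the project's actual model.

TRANSITIONS = {
    ("empty", "pick_up"): "holding",
    ("holding", "put_down"): "empty",
    ("empty", "move"): "empty",
    ("holding", "move"): "holding",
}

def is_legal(actions, state="empty"):
    """Accept an action sequence only if every step is a valid transition."""
    for action in actions:
        state = TRANSITIONS.get((state, action))
        if state is None:  # e.g. pick_up while already holding
            return False
    return True

print(is_legal(["pick_up", "move", "put_down", "pick_up"]))  # True
print(is_legal(["pick_up", "pick_up"]))                      # False
```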