Researchers at the KTH Royal Institute of Technology in Sweden have completed a project that enables robots to cooperate using body language.
The project, led by Dimos Dimarogonas, an associate professor at KTH, produced protocols that enable robots to ask each other for help, to recognize when other robots need assistance, and to change their plans accordingly to help out.
"Robots can stop what they're doing and go over to assist another robot which has asked for help," said Dimarogonas. "This will mean flexible and dynamic robots that act much more like humans — robots capable of constantly facing new choices and that are competent enough to make decisions."
Dimarogonas and his team feel that this kind of interaction is necessary if robots and autonomous machines are to play bigger roles in industrial settings, where shared work may be necessary. For example, one robot may need an extra hand to lift and carry something or even hold an object in place.
According to Dimarogonas, the concept can be scaled up to include a variety of functions in a home, factory or other kinds of workplaces.
Two off-the-shelf robots demonstrate how robots can pick up each other's signals for assistance. (Image Credit: KTH Royal Institute of Technology)
As part of the project, the researchers demonstrated the new abilities of off-the-shelf autonomous machines, including NAO robots. One of the team’s videos shows a robot pointing out an object to another robot, indicating that it needs that robot's help to lift the item.
"The visual feedback that the robots receive is translated into the same symbol for the same object," said Dimarogonas. "With updated vision technology they can understand that one object is the same from different angles. That is translated to the same symbol one layer up to the decision-making — that it is a thing of interest that we need to transport or not. In other words, they have perceptual agreement."
Another demonstration showed two robots carrying an object together: one leads and the other follows by sensing the force the leader exerts on the object.
"It's just like if you and I were carrying a table and I knew where it had to go," he said. "You would sense which direction I wanted to go by the way I turn and push, or pull."
All of the interactions described here take place in real time, entirely autonomously and without any human intervention. The project also uses a novel communication protocol that sets it apart from other collaborative robot concepts.
"We minimize communication. There is a symbolic communication protocol, but it's not continuous. When help is needed, a call for help is broadcast and a helper robot brings the message to another robot. But it's a single shot,” added Dimarogonas.