Researchers from Carnegie Mellon University found that robot perception improves when robots can hear. Robots have traditionally depended on sight and touch to perceive their surroundings, but those senses are not always accurate. Adding hearing to a robot’s repertoire makes it more precise at tasks such as sorting items.
Sound helps robots differentiate between objects, determine what action caused a sound, and predict the physical properties of objects. Using sound, the robots successfully classified objects 76% of the time.
For the study, the team created a large dataset of video and audio recordings of 60 common objects. They recorded the objects sliding around a tray and crashing into its sides, cataloging 15,000 interactions in total.
The interactions were captured with Tilt-Bot, a square tray attached to the arm of a Sawyer robot. The team placed an object on the tray and let Sawyer spend a few hours moving the tray in random directions with varying levels of tilt, while cameras and microphones recorded each action. The team also collected data with Sawyer pushing objects around on a surface without the tray.
The team also studied how robots can extract information from sound. For example, a robot estimated the amount of granular material in a container by shaking it, and it could apply what it learned from the sounds of one set of objects to predict the physical properties of objects it had never encountered.
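The researchers' actual models are not described here, but the core idea — that the spectrum of an impact sound carries a fingerprint of the object that made it — can be illustrated with a minimal sketch. The code below is a hypothetical toy example, not the study's method: it synthesizes fake "impact" sounds for two made-up object classes (a high-pitched metallic ring versus a low wooden thud), summarizes each clip by its log-magnitude spectrum, and classifies new clips by nearest class centroid.

```python
import numpy as np

SR = 16000  # sample rate in Hz (assumed for this toy example)

def impact_sound(freq_hz, seconds=0.25, sr=SR, seed=0):
    """Synthesize a toy 'impact': a decaying sinusoid plus a little noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(sr * seconds)) / sr
    tone = np.sin(2 * np.pi * freq_hz * t) * np.exp(-t * 20)
    return tone + 0.05 * rng.standard_normal(t.size)

def spectral_feature(audio):
    """Log-magnitude spectrum used as a crude audio fingerprint."""
    return np.log1p(np.abs(np.fft.rfft(audio)))

# "Training": average the fingerprint over a few recordings per class.
# Class names and resonant frequencies are invented for illustration.
classes = {"metal_cup": 2000.0, "wood_block": 500.0}
centroids = {
    name: np.mean(
        [spectral_feature(impact_sound(f, seed=s)) for s in range(5)], axis=0
    )
    for name, f in classes.items()
}

def classify(audio):
    """Assign a clip to the class whose average fingerprint is closest."""
    feat = spectral_feature(audio)
    return min(centroids, key=lambda name: np.linalg.norm(feat - centroids[name]))

print(classify(impact_sound(2000.0, seed=99)))  # a ringing clip → "metal_cup"
```

A real system would replace the synthetic clips with recorded audio and the nearest-centroid rule with a learned model, but the pipeline shape — record, extract spectral features, compare against known objects — is the same.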
This research was presented at the virtual Robotics: Science and Systems conference.