The autonomous blimp can detect motion and learn about users' reactions. Source: Georgia Institute of Technology

Researchers at the Georgia Institute of Technology have created a pair of autonomous blimps that feature a 3-D printed gondola frame carrying sensors and a mini camera.
The gondola frame attaches to either an 18- or 36-inch-diameter balloon, and its camera can detect faces and hands, allowing people to direct the blimps with gestures while the system learns about the human operator, identifying everything from hesitant glares to eager smiles. The smaller blimp can carry a payload of five grams, while the larger one can support 20 grams.
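The article does not describe the vision pipeline the team uses. As an illustrative sketch only (not the researchers' implementation), face detection from an onboard camera could be mapped to coarse steering commands, for example with OpenCV's pre-trained face detector; the camera source, frame thresholds, and command names below are assumptions.

```python
# Illustrative sketch: map the position of a detected face to coarse blimp
# steering commands. This is NOT the Georgia Tech pipeline; camera index,
# thresholds, and command names are assumptions for demonstration.
import cv2

# OpenCV ships a pre-trained frontal-face Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def steering_command(frame):
    """Return a coarse command based on where the largest face appears."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "hover"                      # no operator in view
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    center = x + w / 2
    width = frame.shape[1]
    if center < width / 3:
        return "yaw_left"                   # face in the left third of frame
    if center > 2 * width / 3:
        return "yaw_right"                  # face in the right third
    return "approach"                       # face roughly centered

cap = cv2.VideoCapture(0)                   # stand-in for the mini camera feed
ok, frame = cap.read()
if ok:
    print(steering_command(frame))
cap.release()
```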
The goal of the project is to better understand how people interact with flying robots.
“Roboticists and psychologists have learned many things about how humans relate to robots on the ground, but we haven't created techniques to study how we react to flying machines,” said Fumin Zhang, associate professor at Georgia Tech. “Flying a regular drone close to people presents a host of issues. But people are much more likely to approach and interact with a slow-moving blimp that looks like a toy.”
Researchers say the blimps' circular shape makes them harder to steer with manual controllers but allows them to change direction quickly. Georgia Tech sees the blimps helping researchers understand how people react to flying companions.
“Imagine a blimp greeting you at the front of the hardware store, ready to offer assistance,” Zhang said. “People are good at reading people's faces and sensing if they need help or not. Robots could do the same. And if you needed help, the blimp could ask, then lead you to the correct aisle, flying above the crowds and out of the way.”
The project will be presented at the 2017 IEEE International Conference on Robotics and Automation (ICRA) on May 29-June 3 in Singapore.