As part of their demonstration day this week, Stanford University robotics students showed off a number of toy-sized robots they developed that scour a miniature city searching for animals in trouble.
The robots, about the size of a milk jug, use laser sensors and cameras to locate animals, record obstacles and map their environment, then report back to an emergency responder to help guide a rescue crew to the animal.
“These robots are small but they contain a representative set of sensors that you would see on a real self-driving car,” said Marco Pavone, assistant professor of aeronautics and astronautics at Stanford and instructor for the course. “So, in a way, it is a sort of miniature city where self-driving robots are moving around in a way that is analogous to how a real self-driving car would behave in the world.”
Students programmed the robots to work at varying levels of autonomy, using industry-standard software and image classification developed through deep learning algorithms.
The class, which is open to undergraduates as well as graduate students, required students to learn to program the robots using mathematical concepts and to build different components of the autonomy software during the course. They then worked in teams to deploy the robots in a simulated scenario to see how they responded.
“All of these components work well on their own but putting them together is what tends to be difficult,” said Benoit Landry, a graduate student in aeronautics and astronautics and teaching assistant for the class. “So that’s really something that we’re trying to emphasize: How do you piece all of these complicated parts together to make a whole that works?”
This meant not just having the hardware and software components act in harmony but also having the robots deal with the unexpected. The robots were programmed to seek out unmapped areas as well as to pause occasionally to understand the world they were entering before moving forward.
Other robots were used to identify differences between the internal map and what their sensors were seeing. A different robot looked for rogue elements, such as an image of a bike, and would wait for the bike to pass before moving forward safely.
Pavone’s lab develops planning, decision-making and artificial intelligence algorithms for autonomous robots such as self-driving cars, drones and autonomous spacecraft. Stanford says this class of robots would not have been possible a few years ago. But technology has advanced rapidly, and with autonomy becoming a reality on our roads and in our oceans, skies and space, the course is an important development for budding engineers.
“Just a few years ago, this kind of project would have required large teams of researchers and significant investments,” Pavone said. “Now, leveraging a variety of tools recently developed by the robotics community, we can teach it in a quarter to undergraduates. This is a testament to how quickly this field is progressing.”