One day robots will function alongside humans with ease, or so it is predicted. To do so, they will need to understand just how humans move, including the unwritten rules of walking.
Researchers from Stanford University have developed a short, non-humanoid prototype of a robot that can self-navigate the streets.
The robot, nicknamed “Jackrabbot” after the jackrabbits that dart across the Stanford campus, looks a lot like a ball on wheels. Jackrabbot is equipped with sensors that allow it to comprehend its surroundings and make its way around streets and hallways while observing normal human etiquette.
The team designed Jackrabbot as a step toward a new generation of social robots that can learn to move among humans. The idea behind the project is to let the robot navigate among students on the sidewalks and in the halls of the campus, observing how it picks up the unwritten rules of pedestrian traffic, in order to determine whether such robots can operate harmoniously alongside humans in crowded spaces.
“By learning social conventions, the robot can be part of ecosystems where humans and robots coexist,” said Silvio Savarese, an assistant professor of computer science and director of the Stanford Computational Vision and Geometry Lab.
Unlike written traffic rules, human social conventions are not explicitly codified in lane markings and traffic lights, so learning them will require considerably more skill on the part of the robot.
That is why the lab is using machine-learning techniques to create algorithms that will allow the robot to recognize and react appropriately to unwritten rules of pedestrian traffic. The computer scientists have been collecting images and videos of people moving around the campus and transforming them into coordinates. From those coordinates, they can train an algorithm.
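The pipeline described above, turning footage of pedestrians into coordinate trajectories and then training a predictor on them, can be illustrated with a toy sketch. This is not the lab's actual code: the trajectory data below is made up, and a simple constant-velocity baseline stands in for the learned algorithm the researchers train on real campus observations.

```python
# Toy illustration of the trajectory pipeline: pedestrian observations
# become (x, y) coordinate sequences sampled at fixed time intervals,
# and a model predicts each person's next position. All data here is
# fabricated for illustration.

# Hypothetical trajectories, as might be extracted from campus video.
trajectories = [
    [(0.0, 0.0), (0.5, 0.1), (1.0, 0.2), (1.5, 0.3)],  # walking right
    [(2.0, 5.0), (2.0, 4.5), (2.1, 4.0), (2.1, 3.5)],  # walking down
]

def predict_next(traj):
    """Constant-velocity baseline: assume the pedestrian repeats the
    displacement of the last observed step."""
    (x1, y1), (x2, y2) = traj[-2], traj[-1]
    return (x2 + (x2 - x1), y2 + (y2 - y1))

for traj in trajectories:
    print(predict_next(traj))
```

A learned model such as the one the lab describes would replace `predict_next` with a predictor trained on many observed trajectories, letting it capture social behavior (yielding, passing, keeping distance) that a constant-velocity assumption cannot.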
“Our goal in this project is to actually learn those (pedestrian) rules automatically from observations—by seeing how humans behave in these kinds of social spaces,” said Savarese. “The idea is to transfer those rules into robots.”
Jackrabbot can already navigate indoors without human assistance, and the team is now fine-tuning its capabilities for outdoor navigation. The next task is implementing the “social aspects” of pedestrian navigation, such as deciding right-of-way on the sidewalk.
“We have developed a new algorithm that is able to automatically move the robot with social awareness, and we’re currently integrating that in Jackrabbot,” said Alexandre Alahi, a postdoctoral researcher in the lab.
The team acknowledges that Jackrabbot is an expensive prototype, but Savarese estimates that in five or six years social robots could cost only about $500, making it possible for companies to release them to the mass market.
“It’s possible to make these robots affordable for on-campus delivery, or for aiding impaired people in navigating a public space like a train station, or for guiding people to find their way through an airport,” said Savarese.
The researchers will present their system for predicting human trajectories in crowded spaces at the Computer Vision and Pattern Recognition Conference in Las Vegas at the end of June. The conference paper is titled “Social LSTM: Human Trajectory Prediction in Crowded Spaces.”
Video produced by Tom Abate and Vignesh Ramachandran.