Recently, there have been reports of drone incidents: unmanned aerial vehicles (UAVs) crashing into things and causing trouble. A researcher from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed an obstacle-detection system that enables drones to dodge and dive through trees at speeds of up to 30 mph, potentially avoiding such accidents.
“Everyone is building drones these days, but nobody knows how to get them to stop running into things,” says Andrew Barry, CSAIL Ph.D. student, who developed the system as part of his thesis with MIT professor Russ Tedrake. “Sensors like LIDAR are too heavy to put on small aircraft, and creating maps of the environment in advance isn’t practical. If we want drones that can fly quickly and navigate in the real world, we need better, faster algorithms.”
Barry’s system runs about 20 times faster than existing software and uses a stereo-vision algorithm that allows the drone to detect objects and build a full map of its surroundings in real time. The software operates at 120 frames per second, extracting depth information in 8.3 milliseconds per frame, and is open source and available online.
In the video above, a drone weighing just over a pound, with a 34-inch wingspan and built from off-the-shelf components (total cost about $1,700), dodges and dips around trees. The drone carries a camera on each wing and two processors similar to those found in a cellphone.
How It Works
Traditional stereo algorithms compare the images captured by each camera and search through the depth field at many candidate distances to determine whether an object is in the drone’s path. That search is computationally expensive enough to limit the drone to speeds of five or six miles per hour.
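To illustrate why that search is expensive, here is a minimal block-matching stereo sketch in NumPy. This is not the project’s code; all names are hypothetical. The point to notice is the loop: the matching cost is recomputed once per candidate disparity, so runtime scales with the number of distances searched.

```python
import numpy as np

def _box_filter(img, k):
    # Mean over k-by-k windows via 2-D cumulative sums (edge-padded).
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    c = np.cumsum(np.cumsum(p, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)

def block_matching_disparity(left, right, max_disparity=16, patch=5):
    """For every pixel, try many horizontal shifts (disparities) of the
    right image and keep the shift whose local patch best matches the
    left image. Larger disparity means a closer object."""
    h, w = left.shape
    best_cost = np.full((h, w), np.inf)
    best_disp = np.zeros((h, w), dtype=int)
    for d in range(max_disparity):              # one full pass per distance
        shifted = np.roll(right, d, axis=1)
        cost = _box_filter(np.abs(left - shifted), patch)
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_disp[better] = d
    return best_disp
```

On a small embedded processor, running this dense search at camera frame rates is what caps the achievable flight speed.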
Barry realized that at higher speeds the scene changes relatively little from one frame to the next, so the system could get away with computing depth from only a subset of measurements: a single distance, 10 meters ahead, rather than the full depth field.
“You don’t have to know about anything that’s closer or further than that,” Barry says. “As you fly, you push that 10-meter horizon forward, and, as long as your first 10 meters are clear, you can build a full map of the world around you.”
The result was software that quickly recovers the missing depth information by integrating the drone’s odometry with previously measured distances.
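In principle, that integration amounts to carrying forward obstacles seen once at the horizon using the drone’s motion estimate. The sketch below is a hypothetical simplification (the body frame, function names, and yaw-only motion model are all assumptions, not the project’s implementation): each frame, previously detected points are moved by the inverse of the estimated motion, so the map fills in behind the 10-meter horizon as the drone flies forward.

```python
import numpy as np

def propagate_obstacles(points_body, delta_translation, delta_yaw):
    """Update body-frame obstacle points for one frame of motion:
    subtract the drone's translation, then rotate by the negative of
    its yaw change (x forward, y left, z up)."""
    c, s = np.cos(-delta_yaw), np.sin(-delta_yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])  # yaw-only rotation matrix
    return (np.asarray(points_body) - np.asarray(delta_translation)) @ R.T
```

For example, a point detected 10 m dead ahead appears 8 m ahead after the drone advances 2 m, even though the single-depth stereo check can no longer see it.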
Barry hopes to further improve the algorithms so that they can work at more than one depth, as well as in dense environments such as forests.
“Our current approach results in occasional incorrect estimates known as ‘drift,’” says Barry. “As hardware advances allow for more complex computation, we will be able to search at multiple depths and therefore check and correct our estimates. This lets us make our algorithms more aggressive, even in environments with larger numbers of obstacles.”
To contact the author of this article, email engineering360editors@ihs.com.