One thing all of these delivery projects have in common is that they are programming drones to fly through tight, obstacle-filled spaces while traveling at high speeds. This can become problematic, as small drones are limited in how much processing hardware they can carry on board for real-time computation.
Current approaches rely on maps that tell drones exactly where they are in relation to obstacles and how to reach a particular location. While this works for static environments, real-world settings are unpredictable, and if the drone's position estimate is off by even a small margin, it can easily crash.
In order to better predict what’s in front of the drones, MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a new system that allows drones to fly consistently at 20 miles per hour through environments such as forests and warehouses.
Called NanoMap, the system considers the drone’s position in the world over time to be uncertain, then models and accounts for that uncertainty. Researchers say this method leads to a much higher level of reliability in terms of drones being able to fly in close quarters and avoid obstacles.
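The core idea described above — treating the drone's pose as uncertain and planning around that uncertainty — can be illustrated with a minimal sketch. This is not NanoMap's actual code; the function name, the Gaussian assumption, and the numbers are all illustrative:

```python
def safe_clearance(measured_distance, pose_sigma, k=3.0):
    """Conservative clearance to an obstacle: shrink the measured
    distance by k standard deviations of the pose uncertainty, so the
    planner assumes the worst plausible position error."""
    return measured_distance - k * pose_sigma

# Pose uncertainty grows the longer the drone flies on odometry alone,
# so an obstacle observed a while ago gets a larger safety margin
# than one observed just now, even at the same measured distance.
clearance_recent = safe_clearance(2.0, pose_sigma=0.05)  # 1.85 m
clearance_older  = safe_clearance(2.0, pose_sigma=0.30)  # 1.10 m
```

A planner using such margins would reject paths that pass closer than the conservative clearance, which is how accounting for uncertainty buys reliability at the cost of some agility.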
NanoMap uses a depth-sensing system to stitch together a series of measurements about the drone's surroundings, allowing it not only to make motion plans for its current field of view, but also to anticipate how it should move through fields of view it has already seen but that are now hidden.
Pete Florence, MIT CSAIL graduate student and lead researcher on NanoMap, tells Electronics360 that the system works with any drone as long as it has a depth sensor. MIT CSAIL used a drone in testing that included components such as Intel’s NUC i7 on a 450-millimeter DJI-size frame.
Florence says NanoMap is the first system that enables drone flight with 3D data that is aware of the uncertainty around it, meaning the drone considers that it doesn’t exactly know its position and orientation as it moves through the world. Future iterations of the system may incorporate other pieces of information, such as the uncertainty in the drone’s individual depth-sensing measurements.
“It’s kind of like saving all of the images you’ve seen of the world as a big tape in your head,” Florence says. “For the drone to plan motions, it essentially goes back into time to think individually of all the different places that it was in.”
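The "tape in your head" idea can be sketched as a rolling history of pose-stamped depth snapshots that the planner walks backwards through time. This is a deliberately simplified 2D illustration (real NanoMap works with full 3D poses including orientation, which is omitted here), and all names are hypothetical:

```python
from collections import deque

class ViewHistory:
    """Keep a rolling window of (pose, depth_points) snapshots —
    a 'tape' of everything the sensor has seen recently."""

    def __init__(self, max_frames=100):
        self.frames = deque(maxlen=max_frames)  # oldest frames fall off

    def add(self, pose_xy, depth_points):
        # depth_points: obstacle positions relative to the pose at capture time
        self.frames.append((pose_xy, depth_points))

    def min_clearance(self, query_xy):
        """Walk back through time and return the distance to the closest
        obstacle ever observed near the queried position."""
        best = float("inf")
        for pose, points in reversed(self.frames):
            for px, py in points:
                # Transform the stored relative point into world coordinates
                # (translation only; a real system would also apply rotation
                # and the pose uncertainty of each frame).
                wx, wy = pose[0] + px, pose[1] + py
                d = ((wx - query_xy[0]) ** 2 + (wy - query_xy[1]) ** 2) ** 0.5
                best = min(best, d)
        return best

history = ViewHistory()
history.add((0.0, 0.0), [(1.0, 0.0)])  # obstacle seen at world (1, 0)
history.add((2.0, 0.0), [(0.0, 1.0)])  # obstacle seen at world (2, 1)
# A candidate waypoint behind the drone is still checked against the
# older frame, even though that obstacle is no longer in view:
print(history.min_clearance((1.0, 0.0)))  # -> 0.0
```

Querying old frames individually, each with its own pose estimate, is what lets the planner reason about places it "was in" without fusing everything into a single rigid global map.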
The system works well with smaller drones that move through small spaces and in tandem with a second system that is focused on more long-horizon planning. The Defense Advanced Research Projects Agency (DARPA) was initially involved in testing and provided financial support.
Florence says the system still has plenty of room for improvement. “We are nowhere near the capability of hawks zipping through forests with ultimate agility,” he says. “There’s lots of work to do to get there in terms of getting better at planning, control, perception and what we call local obstacle avoidance.”
Researchers say the system could be used in fields ranging from search-and-rescue and defense to package delivery and entertainment. It could also be applied to self-driving cars and other forms of autonomous navigation.
The team is also in discussion with a number of delivery companies that have expressed interest in commercializing the technology, but it is currently in the prototype phase, Florence says.