Electronics360


Robotic Insects Will Be Given Laser Eyes, Use Same Tech as Driverless Vehicles

26 October 2015

Robotic insects are being improved upon every day. Now, researchers from the University at Buffalo (UB) are giving robotic bees tiny laser-powered sensors that act as eyes so they can sense the size, shape and distance of approaching objects.

“Essentially, it’s the same technology that automakers are using to ensure that driverless cars don’t crash into things,” says Karthik Dantu, UB computer scientist. “Only we need to shrink that technology so it works on robot bees that are no bigger than a penny.”

The researchers teamed up with Harvard University and the University of Florida to work on this project, an offshoot of the RoboBee initiative, led by Harvard and Northeastern University, which aims to create insect-inspired robots that could eventually be used in agriculture and disaster relief.

So far, research has proved that robot bees are capable of tethered flight and of moving while submerged in water, but they lack depth perception. For example, a robot bee cannot sense what is in front of it.

[Image: Robotic insects. Credit: Microrobotics Lab, Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering]

This lack of depth perception would be problematic if robotic bees are to avoid flying into walls or land on precise targets, according to Dantu.

The UB team is addressing the issue by giving them remote sensing technology called LIDAR (light detection and ranging), the same laser-based sensor system that works in driverless cars.

LIDAR works like radar, but it emits invisible laser beams instead of microwaves. Sensors capture the light reflected from distant objects and measure the time it takes for each pulse to return, which is used to calculate the distance and shape of the objects.
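The time-of-flight calculation at the heart of this is simple: because the pulse travels to the object and back, the distance is half the round-trip path length. A minimal illustrative sketch (not the researchers' actual code) of that calculation:

```python
# Illustrative sketch of LIDAR range-finding: distance is derived
# from the round-trip time of a reflected laser pulse.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object, in meters.

    The pulse travels out and back, so the one-way distance
    is half the total path length: d = c * t / 2.
    """
    return C * round_trip_time_s / 2.0

# A pulse returning after roughly 66.7 nanoseconds indicates
# an object about 10 meters away.
print(round(lidar_distance(66.7e-9), 2))
```

The nanosecond-scale timing this requires is part of why shrinking LIDAR to insect scale is hard: the sensor must resolve extremely short intervals with very little power and mass.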

Lastly, the information is analyzed by computer algorithms to form a coherent image of the car’s path, which enables the car to “see” what is around.

Dantu and the team want to shrink this system down into a system called “micro-LIDAR.”

To make it happen, University of Florida researchers are developing the tiny sensor that measures the light's reflection, while Dantu is creating novel perception and navigation algorithms that enable the bees to process and map the world around them. Harvard researchers will then incorporate the technology into the bees.

Once completed, the technology would not be limited to robot insects. The sensors could also be used in wearables, medical devices and mobile devices.

For more information visit: The University at Buffalo

To contact the author of this article, email engineering360editors@ihs.com


