As the development of autonomous vehicles continues to move forward, one challenge for developers is how these cars will be able to navigate in inclement weather conditions.
To help these vehicles see better in bad weather, researchers from the University of California (UC), San Diego have developed a new type of radar that improves the imaging capability of radar sensors, helping predict the shape and size of objects in a scene.
Self-driving vehicles rely on technology such as lidar and radar to see the environment around them and navigate between obstacles. Lidar works by bouncing laser beams off surrounding objects to create high-resolution 3D pictures. But in fog, dust, rain or snow, lidar struggles to create these 3D images. Radar, which transmits radio waves to generate images of the road, can see in all weather but captures only a partial picture of the road scene.
UC San Diego researchers have created an inexpensive approach to achieving bad weather perception for self-driving cars that mixes both technologies.
“It’s a LiDAR-like radar,” said Dinesh Bharadia, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering. “Fusing LiDAR and radar can also be done with our techniques, but radars are cheap. This way, we don’t need to use expensive LiDARs.”
The UC San Diego system consists of two radar sensors placed on the hood and spaced an average car's width apart. Arranged this way, the two sensors can see more space and detail than a single radar sensor.
During testing on clear days and nights, the system performed as well as lidar sensors at determining the dimensions of cars moving in traffic. In foggy weather, its performance remained unchanged. The team used a fog machine to hide a vehicle from the system, which was still able to accurately predict the vehicle's 3D geometry. A lidar sensor failed the same test, researchers said.
Generally, radar suffers from poor imaging quality because only a fraction of the radio waves transmitted and bounced off objects are reflected back to the sensor. As a result, objects such as pedestrians and vehicles are difficult to identify.
By placing two sensors about 1.5 meters apart on the hood of the car with an overlapping field of view, researchers created a high-resolution region with a high probability of detecting the objects in front of the vehicle.
An additional benefit of the system is that it handles the noise that typically plagues radar. The sensor picks up echo signals, reflections of radio waves that did not bounce directly off the objects being detected. The UC San Diego team developed new algorithms that fuse the information from the two radar sensors and produce a new image free of noise.
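The article does not describe the team's actual algorithms, but the core idea of cross-validating two overlapping radar views can be sketched as follows. This is a hypothetical illustration: the function name, the 2D point representation, and the agreement tolerance are all assumptions, not the researchers' method. Detections seen by both sensors (after accounting for the roughly 1.5-meter baseline between them) are kept, while echoes that appear in only one sensor's view are discarded as likely noise.

```python
import numpy as np

BASELINE = 1.5  # meters between the two hood-mounted radars (per the article)

def fuse_detections(left_pts, right_pts, tol=0.5):
    """Cross-validate detections from two radars with overlapping fields of view.

    left_pts and right_pts are (N, 2) arrays of (x, y) detections, each in
    its own sensor's coordinate frame. We shift the right sensor's points
    into the left sensor's frame and keep only the left-sensor points that
    the right sensor corroborates within `tol` meters.
    """
    # Translate right-sensor points into the left sensor's coordinate frame.
    shifted = right_pts + np.array([BASELINE, 0.0])
    fused = []
    for p in left_pts:
        # A true reflection should be seen by both sensors at nearly the
        # same world position; multipath echoes usually are not.
        if np.min(np.linalg.norm(shifted - p, axis=1)) < tol:
            fused.append(p)
    return np.array(fused)

left = np.array([[0.0, 10.0], [2.0, 12.0], [5.0, 3.0]])   # last point: spurious echo
right = np.array([[-1.5, 10.1], [0.5, 11.9]])             # sees only the real objects
print(fuse_detections(left, right))  # the spurious echo at (5, 3) is dropped
```

In practice, fusion at this level would also have to handle range and Doppler measurements rather than simple 2D points, but the cross-checking principle is the same.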
“There are currently no publicly available datasets with this kind of data, from multiple radars with an overlapping field of view,” Bharadia said. “We collected our own data and built our own dataset for training our algorithms and for testing.”
The researchers' dataset consists of 54,000 radar frames of driving scenes captured during the day and night, in live traffic and in simulated fog conditions.
Next steps include collecting data in the rain, and the university is working with Toyota to fuse the new radar technology with cameras. Researchers said this radar system could potentially replace lidar for autonomous vehicles in the long term, but more work would be needed first.
“Radar alone cannot tell us the color, make or model of a car,” Bharadia said. “These features are also important for improving perception in self-driving cars.”