Waymo, the autonomous vehicle arm of Alphabet, has demonstrated how its self-driving perception system operates in real-world scenarios on the streets of San Francisco.
During the summer, Waymo started offering self-driving rides in San Francisco in all-electric Jaguar I-PACE vehicles equipped with the fifth generation of the Waymo Driver autonomous driving software.
The perception system allows these self-driving cars to navigate the narrow, two-way streets that are common in the city while identifying passing cyclists and pedestrians. It can also handle complex scenarios, such as a worker stepping out from behind a truck in the middle of the street or a delivery truck blocking traffic, which are common occurrences in cities.
Waymo Driver’s perception system has two components: a custom suite of sensors developed for autonomous operation, and a software package that makes sense of the data those sensors collect.
The sensors allow the vehicles to classify objects correctly, estimate the paths of those objects and understand their intent or behavior. They have been optimized for the single task of driving: reacting to unexpected behavior from multiple directions, such as a skater overtaking a car on a busy street, and identifying debris at long distances.
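To make those tasks concrete, here is a minimal sketch of the kind of output a perception stack like this might produce for one tracked object, with a class label, a predicted path and an inferred intent. All type and field names are hypothetical illustrations, not Waymo's actual API.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class ObjectClass(Enum):
    VEHICLE = auto()
    PEDESTRIAN = auto()
    CYCLIST = auto()
    DEBRIS = auto()


class Intent(Enum):
    PROCEEDING = auto()
    YIELDING = auto()
    CROSSING = auto()
    STATIONARY = auto()


@dataclass
class TrackedObject:
    """One object the perception stack is tracking (hypothetical schema)."""
    object_id: int
    object_class: ObjectClass
    position_m: tuple            # (x, y) in the vehicle frame, meters
    velocity_mps: tuple          # estimated velocity, meters per second
    predicted_path: list = field(default_factory=list)  # future (x, y) waypoints
    intent: Intent = Intent.PROCEEDING


# Example: a cyclist about 12 m ahead, drifting left, expected to keep crossing.
cyclist = TrackedObject(
    object_id=42,
    object_class=ObjectClass.CYCLIST,
    position_m=(12.0, 1.5),
    velocity_mps=(3.0, 0.5),
    predicted_path=[(15.0, 2.0), (18.0, 2.5), (21.0, 3.0)],
    intent=Intent.CROSSING,
)
```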
In the video, one half of a split screen shows the car driving and the objects, people and other cars in front of it; the other half shows how the car perceives the world through Waymo Driver and how it identifies objects and pedestrians. The autonomous car is shown navigating narrow streets lined with parked cars at night, avoiding a bicyclist ahead of it, traveling through neighborhoods where oncoming cars pass closely, and driving on busy city streets full of activity.
The sensors
Lidar is used to create 3D pictures, known as point clouds, both up close and at long distances. These capture the size and distance of objects up to 300 meters away in all lighting conditions. Because cameras and radar can struggle to detect pedestrians in dark conditions, this is a critical role for the lidar sensors.
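As a rough illustration of what working with a point cloud looks like, the sketch below represents a lidar scan as an array of 3D points, keeps only returns within the roughly 300-meter envelope mentioned above, and estimates the extent of one cluster. The data and thresholds are made up; this is not Waymo's data format or processing pipeline.

```python
import numpy as np

# A lidar scan as a point cloud: N points with (x, y, z) coordinates in meters,
# here filled with random values as a stand-in for real sensor returns.
rng = np.random.default_rng(0)
points = rng.uniform(low=-350.0, high=350.0, size=(100_000, 3))

# Distance of each return from the sensor.
ranges = np.linalg.norm(points, axis=1)

# Keep only returns within the ~300 m range the article mentions.
in_range = points[ranges <= 300.0]

# Crude "object size" estimate: the bounding box of points in a small window
# directly ahead of the car, standing in for a properly clustered object.
cluster = in_range[
    (in_range[:, 0] > 20) & (in_range[:, 0] < 25) & (np.abs(in_range[:, 1]) < 2)
]
if cluster.size:
    extent_m = cluster.max(axis=0) - cluster.min(axis=0)
    print(f"approx. object extent (x, y, z) in meters: {extent_m}")
```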
A range of cameras is also part of the Waymo Driver system, giving the self-driving cars different perspectives of the road. These can capture long-range objects and complement the rest of the system to give it a better understanding of the environment.
Radar imaging is also included to perceive a vehicle or pedestrian’s speed and trajectory, even in tough weather conditions such as snow, fog and rain.
Additionally, Waymo Driver includes artificial intelligence (AI) and machine learning for perception, behavior prediction and planning, drawing on the more than 20 million miles Waymo has driven autonomously on public roads.
The sensors and AI allow Waymo to use sensor fusion to improve the detection and characterization of what the autonomous vehicle sees in its environment.
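To give a sense of what sensor fusion means in practice, here is a minimal, hypothetical sketch that combines independent confidence scores from lidar, camera and radar for the same candidate object. Real fusion stacks are far more sophisticated and typically fuse information at the feature or track level; the weights and scores below are invented for illustration.

```python
def fuse_detections(scores: dict, weights: dict) -> float:
    """Combine per-sensor detection confidences into a single fused score.

    scores  -- detection confidence in [0, 1] from each sensor
    weights -- relative trust placed in each sensor (hypothetical values)
    """
    total_weight = sum(weights[s] for s in scores)
    return sum(scores[s] * weights[s] for s in scores) / total_weight


# Example: at night the camera is unsure, but lidar and radar both see the pedestrian,
# so the fused confidence stays high.
scores = {"lidar": 0.92, "camera": 0.40, "radar": 0.85}
weights = {"lidar": 0.5, "camera": 0.2, "radar": 0.3}
print(f"fused confidence: {fuse_detections(scores, weights):.2f}")
```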