Nvidia has added Aeva’s frequency modulated continuous wave (FMCW) 4D lidar sensors to its Drive autonomous vehicle platform.
Nvidia’s Drive is an open, end-to-end platform that allows developers to train, test and validate safe self-driving technology.
Aeva’s sensor detects 3D position and instant velocity for each point at distances up to 500 m. The sensor has 4D perception capabilities that Aeva claims are not possible with legacy lidar sensors, including:
Ultra-resolution – A real-time, camera-level image of up to 1,000 lines per frame with no motion blur in the static scene. Aeva said this feature provides up to 20 times the resolution of legacy time-of-flight lidar sensors, allowing the detection of roadway markings, drivable regions, vegetation, road barriers and other hazards at up to twice the distance of time-of-flight lidar.
4D localization – Technology that enables real-time ego-vehicle motion estimation with six degrees of freedom, motion compensation and online sensor extrinsic calibration to aid sensor fusion. This feature also enables accurate vehicle positioning and navigation without additional sensors or GPS, allowing autonomous navigation in GPS-denied areas such as tunnels and parking structures.
Other features include immunity to interference from sunlight and other sensors, elimination of retroreflector blooming and ghosting from objects such as street signs and roadway markings, and an improved ability to see through dust, fog, rain and snow.
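The per-point velocity described above comes from the FMCW principle: the sensor chirps its laser frequency and measures the beat frequency between the outgoing and returned light, which encodes range, while the Doppler shift encodes radial velocity. The sketch below illustrates the standard triangular-chirp arithmetic; all parameter values and the function itself are illustrative assumptions, not Aeva's implementation or specifications.

```python
# Illustrative sketch of FMCW range/velocity recovery from up-chirp and
# down-chirp beat frequencies. Constants are assumptions, not Aeva specs.

C = 3.0e8             # speed of light, m/s
WAVELENGTH = 1550e-9  # typical telecom-band lidar wavelength (assumed)
CHIRP_SLOPE = 1.0e12  # laser chirp slope in Hz/s (assumed)

def range_and_velocity(f_beat_up, f_beat_down):
    """Return (range_m, radial_velocity_m_s) from the two beat
    frequencies (Hz). Convention here: a positive velocity means the
    target is receding, which raises the down-chirp beat frequency
    relative to the up-chirp one."""
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-induced component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler-induced component
    rng = C * f_range / (2.0 * CHIRP_SLOPE)      # round-trip delay -> range
    vel = WAVELENGTH * f_doppler / 2.0           # Doppler shift -> velocity
    return rng, vel
```

Because each return carries its own velocity measurement, moving objects can be separated from the static scene in a single frame, which is also what makes the ego-motion estimation described under 4D localization possible.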
“Aeva delivers a unique advantage for perception in automated vehicles because it leverages per-point instant velocity information to detect and classify objects with higher confidence across longer ranges,” said Gary Hicok, senior VP of engineering at Nvidia. “With Aeva as part of our Drive ecosystem network, we can provide customers access to this next generation of sensing capabilities for safe autonomous driving.”