A data-driven simulation engine developed at MIT represents a new tool for accelerating the research and development of adaptive robust control for autonomous driving. Code for the autonomous vehicle simulator, designed to test and train vehicles under real-world conditions, is being open sourced to the public.
In contrast to previous iterations and other autonomous vehicle simulators, VISTA 2.0 achieves high-fidelity, data-driven simulation of complex sensor types and massively interactive scenarios and intersections at scale, all built from real-world data.
The open-source data-driven simulator for multi-sensor perception of embodied agents leverages real-world data. Source: MIT
To synthesize 3D lidar point clouds, the developers projected the lidar data collected by a test vehicle into a 3D space and then let a new virtual vehicle drive around locally from where the original vehicle was. All of that sensory information was then projected back into the frame of view of the new virtual vehicle with the help of neural networks.
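The geometric core of that step is re-expressing points recorded from one vehicle pose in the frame of another. The sketch below illustrates that idea only; the function and variable names are hypothetical and not taken from the VISTA 2.0 codebase, and the learned view synthesis that fills in occlusions is left as a comment.

```python
# Hypothetical sketch: lidar points recorded in the original vehicle's frame
# are re-projected into the frame of a nearby virtual vehicle pose.
import numpy as np


def pose_to_matrix(x: float, y: float, yaw: float) -> np.ndarray:
    """Build a 4x4 homogeneous transform for a planar vehicle pose."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])
    T[:3, 3] = [x, y, 0.0]
    return T


def reproject_lidar(points_orig: np.ndarray,
                    pose_orig: np.ndarray,
                    pose_virtual: np.ndarray) -> np.ndarray:
    """Express points recorded in the original sensor frame in the virtual
    vehicle's frame: first map to world coordinates, then into the new pose."""
    homo = np.hstack([points_orig, np.ones((points_orig.shape[0], 1))])
    world = (pose_orig @ homo.T).T
    virtual = (np.linalg.inv(pose_virtual) @ world.T).T
    return virtual[:, :3]


if __name__ == "__main__":
    # Toy scan of 1000 points; virtual vehicle displaced 2 m laterally and
    # rotated 5 degrees relative to the recorded pose.
    scan = np.random.uniform(-50, 50, size=(1000, 3))
    T_orig = pose_to_matrix(0.0, 0.0, 0.0)
    T_virtual = pose_to_matrix(0.0, 2.0, np.deg2rad(5.0))
    scan_virtual = reproject_lidar(scan, T_orig, T_virtual)
    # In the real system, a neural network would then fill in occlusions and
    # density changes that this purely geometric warp cannot recover.
    print(scan_virtual.shape)
```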
Together with the simulation of event-based cameras, which operate at rates of thousands of events per second or more, the system was shown to replicate this multimodal information in real time. The simulator makes it possible to train neural networks offline and to test them online on the car in augmented-reality setups for safe evaluation.
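As a rough illustration of the general principle behind event-camera emulation (not the VISTA 2.0 implementation), a pixel fires a positive or negative event whenever its log intensity changes by more than a contrast threshold between frames. The minimal sketch below, with hypothetical names and a made-up threshold, shows that rule applied to two synthetic frames.

```python
# Minimal sketch of per-pixel event generation from two consecutive frames.
import numpy as np


def events_from_frames(prev_frame: np.ndarray,
                       next_frame: np.ndarray,
                       threshold: float = 0.2) -> np.ndarray:
    """Return a per-pixel event polarity map: +1, -1, or 0 (no event)."""
    eps = 1e-6  # avoid log(0)
    delta = np.log(next_frame + eps) - np.log(prev_frame + eps)
    events = np.zeros_like(delta, dtype=np.int8)
    events[delta > threshold] = 1
    events[delta < -threshold] = -1
    return events


if __name__ == "__main__":
    # Two synthetic 64x64 grayscale frames with a brightening patch.
    prev = np.full((64, 64), 0.5)
    nxt = prev.copy()
    nxt[20:40, 20:40] = 0.8
    polarity = events_from_frames(prev, nxt)
    print("positive events:", int((polarity == 1).sum()))
```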