A Realistic Radar Simulation Framework for CARLA

Satyam Srivastava
f20190188@pilani.bits-pilani.ac.in
Jerry Li
jli793@ucr.edu
Pushkal Mishra
pumishra@ucsd.edu
Kshitiz Bansal
ksbansal@ucsd.edu
Dinesh Bharadia
dineshb@ucsd.edu
Submitted to CVPR 2025

Comparison of views from Camera, Semantic LiDAR, and Shenron Radar in CARLA. The orange lines outline the road, red and magenta highlight vehicles, and blue indicates a static object.



The advancement of self-driving technology has become a focal point in outdoor robotics, driven by the need for robust and efficient perception systems. This paper addresses the critical role of sensor integration in autonomous vehicles, particularly emphasizing the underutilization of radar compared to cameras and LiDARs. While extensive research has been conducted on the latter two due to the availability of large-scale datasets, radar offers unique advantages such as all-weather sensing and occlusion penetration, which are essential for safe autonomous driving. This study presents a novel integration of a realistic radar sensor model within the CARLA simulator, enabling researchers to develop and test navigation algorithms using radar data. Using this radar sensor in simulation, we demonstrate improved performance in end-to-end driving scenarios. Our findings aim to rekindle interest in radar-based self-driving research and promote the development of algorithms that leverage radar's strengths.
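As a point of reference for how radar data is typically consumed in CARLA, the simulator's stock radar sensor (`sensor.other.radar`) reports each detection in polar form: depth, azimuth, and altitude angles, plus radial velocity. A common first step before feeding such data to a perception or driving stack is converting detections to Cartesian points in the sensor frame. The sketch below illustrates that conversion; it assumes polar detections in the stock CARLA convention and is not taken from the Shenron radar model described here, whose output format may differ.

```python
import math

def radar_to_cartesian(detections):
    """Convert polar radar detections to Cartesian points in the sensor frame.

    Each detection is a (depth, azimuth, altitude) tuple, with angles in
    radians, following the convention of CARLA's stock radar sensor.
    Returns a list of (x, y, z) tuples: x forward, y right, z up.
    """
    points = []
    for depth, azimuth, altitude in detections:
        x = depth * math.cos(altitude) * math.cos(azimuth)
        y = depth * math.cos(altitude) * math.sin(azimuth)
        z = depth * math.sin(altitude)
        points.append((x, y, z))
    return points

# A detection 10 m straight ahead maps to (10, 0, 0) in the sensor frame.
print(radar_to_cartesian([(10.0, 0.0, 0.0)]))
```

In a live CARLA session this conversion would run inside the sensor's `listen` callback, turning each radar sweep into a small point cloud for downstream processing.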


Website template originally made by Phillip Isola and Richard Zhang for a colorful ECCV project; the code can be found here.