forked from dineshb-ucsd/wcsng-new.github.io
Merge pull request #109 from ucsdwcsng/pushkal-dev
Cover image and structure
Showing 2 changed files with 65 additions and 51 deletions.
@@ -3,8 +3,9 @@ layout: publication
 title : "C-Shenron: A Realistic Radar Simulation Framework for CARLA"
 short_title: "C-Shenron"
 tags: Vehicle
-# cover: /assets/images/c-shenron/c-shenron.png
-conference: "Submitted to CVPR 2025"
+cover: /assets/images/c-shenron/c-shenron.png
+disp_cover: "False"
+conference: "In Submission"
 authors: "Satyam Srivastava, Jerry Li, Pushkal Mishra, Kshitiz Bansal, Dinesh Bharadia"
 author_list:
   - name: Satyam Srivastava
@@ -20,20 +21,16 @@ author_list:
   - name: Dinesh Bharadia
     url: https://dineshb-ucsd.github.io/
     email: [email protected]
-conference_site: https://cvpr.thecvf.com/Conferences/2025
-paper: /files/c-shenron_paper.pdf
-# github: https://github.com/ucsdwcsng/carla-radarimaging/
-conference: "In Submission"
+miscs:
+  - content_type: Paper
+    content_url: /files/c-shenron_paper.pdf
+  - content_type: Supplementary Material
+    content_url: /files/c-shenron-supplementary-materials.pdf
+  - content_type: Driving Videos
+    content_url: https://www.youtube.com/playlist?list=PLMklUDp_gXNE2W83f0UNoK7Vrs9QZROIv

 banner: "📢 Code and Dataset will be released after the acceptance."
 description: # all combinations are possible: (title+text+image, title+image, text+image etc), things will be populated in orders
   - title: Overview
     text: "The advancement of self-driving technology has become a focal point in outdoor robotics, driven by the need for robust and efficient perception systems. This paper addresses the critical role of sensor integration in autonomous vehicles, particularly emphasizing the underutilization of radar compared to cameras and LiDARs. While extensive research has been conducted on the latter two due to the availability of large-scale datasets, radar technology offers unique advantages such as all-weather sensing and occlusion penetration, which are essential for safe autonomous driving. This study presents a novel integration of a realistic radar sensor model within the CARLA simulator, enabling researchers to develop and test navigation algorithms using radar data. Utilizing this radar sensor and showcasing its capabilities in simulation, we demonstrate improved performance in end-to-end driving scenarios. Our findings aim to rekindle interest in radar-based self-driving research and promote the development of algorithms that leverage radar's strengths."
   - title: High Level Implementation
     text: <p>The following diagram illustrates a high level overview of our sensor integration into CARLA and the evaluation framework for End-to-End Driving.</p>
@@ -75,4 +72,6 @@ video_matrix:
   - description: "Camera + Radar"
     link: "/assets/gif/c-shenron/case3/Camera+Radar.gif"
     text: "This is a special test scenario in CARLA where the traffic lights in opposing lanes are turned on to test the situational awareness of the driving agent. Here the vehicle is attempting to make a right turn at the intersection when the lights from the crossing lane are on. The Camera-only model fails to stop in time and crashes into the incoming car from the crossing lane. However, the other two models, using LiDAR and Radar, manage to avoid the crash by stopping abruptly and proceeding only when it's safe."
+overview: # Modify this
+  text: "The advancement of self-driving technology has become a focal point in outdoor robotics, driven by the need for robust and efficient perception systems. This paper addresses the critical role of sensor integration in autonomous vehicles, particularly emphasizing the underutilization of radar compared to cameras and LiDARs. While extensive research has been conducted on the latter two due to the availability of large-scale datasets, radar technology offers unique advantages such as all-weather sensing and occlusion penetration, which are essential for safe autonomous driving. This study presents a novel integration of a realistic radar sensor model within the CARLA simulator, enabling researchers to develop and test navigation algorithms using radar data. Utilizing this radar sensor and showcasing its capabilities in simulation, we demonstrate improved performance in end-to-end driving scenarios. Our findings aim to rekindle interest in radar-based self-driving research and promote the development of algorithms that leverage radar's strengths."
 ---
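For reference, the key fields of the updated front matter, assembled from the hunks above, might read roughly as follows. This is a sketch, not the actual file: field order and the unchanged middle of the file (the full `author_list` and `video_matrix` sections) are assumed, and the long `overview` text is truncated here.

```yaml
---
layout: publication
title : "C-Shenron: A Realistic Radar Simulation Framework for CARLA"
short_title: "C-Shenron"
tags: Vehicle
cover: /assets/images/c-shenron/c-shenron.png
disp_cover: "False"
conference: "In Submission"
authors: "Satyam Srivastava, Jerry Li, Pushkal Mishra, Kshitiz Bansal, Dinesh Bharadia"
miscs:
  - content_type: Paper
    content_url: /files/c-shenron_paper.pdf
  - content_type: Supplementary Material
    content_url: /files/c-shenron-supplementary-materials.pdf
  - content_type: Driving Videos
    content_url: https://www.youtube.com/playlist?list=PLMklUDp_gXNE2W83f0UNoK7Vrs9QZROIv
banner: "📢 Code and Dataset will be released after the acceptance."
overview:
  text: "The advancement of self-driving technology has become a focal point in outdoor robotics, ..."
---
```

The net effect of the commit is to enable the cover image, consolidate the duplicated `conference:` key, and move the paper and supplementary links under a single `miscs:` list.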