Abstract
Fusing Radar and Lidar sensor data can fully exploit their complementary advantages and provide a more accurate reconstruction of the surroundings for autonomous driving systems. Surround Radar and Lidar can provide 360° view sampling at minimal cost, making them promising sensing hardware solutions for autonomous driving systems. However, due to intrinsic physical constraints, the rotating speed of surround Radar, and thus the frequency at which Radar data frames are generated, is much lower than that of surround Lidar. Existing Radar/Lidar fusion methods have to work at the low frequency of surround Radar, which cannot meet the high responsiveness requirements of autonomous driving systems. This paper develops techniques to fuse surround Radar/Lidar at a working frequency limited only by the faster surround Lidar rather than the slower surround Radar, based on the widely used object detection model MVDNet. The basic idea of our approach is simple: we let MVDNet work with temporally unaligned Radar/Lidar data, so that fusion can take place whenever a new Lidar data frame arrives, instead of waiting for the slow Radar data frame. However, directly applying MVDNet to temporally unaligned Radar/Lidar data greatly degrades its object detection accuracy. The key insight of this paper is that we can achieve high output frequency with little accuracy loss by enhancing the training procedure to exploit the temporal redundancy in MVDNet, so that it tolerates the temporal unalignment of the input data. We explore several different ways of enhancing training and compare them quantitatively in experiments. © 2024 IEEE.
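The timing idea in the abstract, fusing each incoming Lidar frame with the most recent (temporally unaligned) Radar frame rather than waiting for a fresh Radar frame, can be sketched as follows. The frame rates and the pairing function are illustrative assumptions, not details taken from the paper.

```python
# Sketch of Lidar-rate fusion with temporally unaligned Radar frames.
# Timestamps are in milliseconds; the 10 Hz Lidar and 2.5 Hz Radar
# rates are assumed for illustration only.

def pair_unaligned(lidar_ts, radar_ts):
    """Pair each Lidar timestamp with the latest Radar timestamp <= it,
    so a fused detection can be produced at every Lidar frame."""
    pairs = []
    r = 0
    for lt in lidar_ts:
        # Advance to the newest Radar frame that is not in the future.
        while r + 1 < len(radar_ts) and radar_ts[r + 1] <= lt:
            r += 1
        if radar_ts[r] <= lt:
            pairs.append((lt, radar_ts[r]))
    return pairs

lidar = list(range(0, 1000, 100))  # 10 Hz Lidar: 0, 100, ..., 900 ms
radar = list(range(0, 1000, 400))  # 2.5 Hz Radar: 0, 400, 800 ms
print(pair_unaligned(lidar, radar))
```

Under this scheme fusion fires at every Lidar frame (ten times per second here), whereas a synchronized scheme would be limited to the slower Radar rate; the paper's contribution is making the detector tolerate such unaligned pairs with little accuracy loss.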
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2024 IEEE 30th International Conference on Embedded and Real-Time Computing Systems and Applications, RTCSA 2024 |
| Place of Publication | Los Alamitos, Calif. |
| Publisher | IEEE |
| Pages | 31-36 |
| ISBN (Electronic) | 9798350387957 |
| ISBN (Print) | 9798350387964 |
| DOIs | |
| Publication status | Published - 2024 |
| Event | 30th IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2024), Sokcho, Korea, Republic of. Duration: 21 Aug 2024 → 23 Aug 2024. https://rtcsa2024.github.io/ |
Publication series
| Name | Proceedings - IEEE International Conference on Embedded and Real-Time Computing Systems and Applications, RTCSA |
|---|---|
| ISSN (Print) | 2325-1271 |
| ISSN (Electronic) | 2325-1301 |
Conference
| Conference | 30th IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2024) |
|---|---|
| Abbreviated title | IEEE RTCSA 2024 |
| Place | Korea, Republic of |
| City | Sokcho |
| Period | 21/08/24 → 23/08/24 |
| Internet address | https://rtcsa2024.github.io/ |
Bibliographical note
Full text of this publication does not contain sufficient affiliation information. With consent from the author(s) concerned, the Research Unit(s) information for this record is based on the existing academic department affiliation of the author(s).
Funding
This work is partially supported by Hong Kong GRF under grant nos. 15206221 and 11208522.
RGC Funding Information
- RGC-funded
Fingerprint
Dive into the research topics of 'Timely Fusion of Surround Radar/Lidar for Object Detection in Autonomous Driving Systems'. Together they form a unique fingerprint.
Projects
- GRF: Managing Information Synchronicity in Real-Time Systems. GUAN, N. (Principal Investigator / Project Coordinator). 1/01/23 → … Project: Research
- GRF: Building a Theoretical Foundation for Real-time ROS. GUAN, N. (Principal Investigator / Project Coordinator). 1/01/22 → 18/11/25. Project: Research