Emma: An accurate, efficient, and multi-modality strategy for autonomous vehicle angle prediction
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Related Research Unit(s)
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 41-49 |
| Number of pages | 9 |
| Journal / Publication | Intelligent and Converged Networks |
| Volume | 4 |
| Issue number | 1 |
| Online published | Mar 2023 |
| Publication status | Published - Mar 2023 |
Link(s)
| DOI | DOI |
| --- | --- |
| Link to Scopus | https://www.scopus.com/record/display.uri?eid=2-s2.0-85160908689&origin=recordpage |
| Permanent Link | https://scholars.cityu.edu.hk/en/publications/publication(b8dfe071-bd8f-411b-a025-ee7a52141553).html |
Abstract
Autonomous driving and self-driving vehicles have become a popular choice for consumers because of their convenience. Vehicle angle prediction, i.e., predicting a vehicle's angles in real time, is one of the most prevalent topics in the autonomous driving industry. However, existing methods of vehicle angle prediction rely on single-modal data, such as images captured by a camera, which limits the accuracy and efficiency of the prediction system. In this paper, we present Emma, a novel vehicle angle prediction strategy that is both multi-modal and more efficient. Specifically, Emma exploits both images and inertial measurement unit (IMU) signals with a fusion network for multi-modal data fusion and vehicle angle prediction. Moreover, we design and implement a few-shot learning module in Emma for fast domain adaptation to varied scenarios (e.g., different vehicle models). Evaluation results demonstrate that Emma achieves 97.5% overall accuracy in predicting three vehicle angle parameters (yaw, pitch, and roll), outperforming traditional single-modality approaches by approximately 16.7%–36.8%. Additionally, the few-shot learning module shows promising adaptive ability, with overall accuracy of 79.8% and 88.3% in 5-shot and 10-shot settings, respectively. Finally, empirical results show that Emma reduces energy consumption by 39.7% when running on an Arduino UNO board. © All articles included in the journal are copyrighted to the ITU and TUP.
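The paper itself specifies Emma's fusion network; purely as an illustration of the late-fusion pattern the abstract describes (image features and IMU signals embedded separately, concatenated, then regressed to yaw, pitch, and roll), a minimal sketch might look like the following. All layer sizes, weight shapes, and function names here are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    """Affine layer: x @ w + b."""
    return x @ w + b

def relu(x):
    return np.maximum(x, 0.0)

# Invented dimensions for illustration only.
IMG_DIM, IMU_DIM, EMB, OUT = 512, 6, 64, 3  # OUT = (yaw, pitch, roll)

# Randomly initialized weights stand in for trained parameters.
w_img, b_img = rng.normal(size=(IMG_DIM, EMB)), np.zeros(EMB)
w_imu, b_imu = rng.normal(size=(IMU_DIM, EMB)), np.zeros(EMB)
w_out, b_out = rng.normal(size=(2 * EMB, OUT)), np.zeros(OUT)

def predict_angles(img_feat, imu_feat):
    """Late fusion: embed each modality, concatenate, regress three angles."""
    z_img = relu(linear(img_feat, w_img, b_img))   # image embedding
    z_imu = relu(linear(imu_feat, w_imu, b_imu))   # IMU embedding
    z = np.concatenate([z_img, z_imu], axis=-1)    # fused representation
    return linear(z, w_out, b_out)                 # (yaw, pitch, roll)

angles = predict_angles(rng.normal(size=IMG_DIM), rng.normal(size=IMU_DIM))
print(angles.shape)  # (3,)
```

The design choice sketched here, concatenating per-modality embeddings before a shared regression head, is one common way to fuse a high-dimensional image representation with a low-dimensional IMU signal without either modality dominating the input.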
Research Area(s)
- multi-modality, autonomous driving, vehicle angle prediction, few-shot learning
Bibliographic Note
Research Unit(s) information for this publication is provided by the author(s) concerned.
Citation Format(s)
Emma: An accurate, efficient, and multi-modality strategy for autonomous vehicle angle prediction. / Song, Keqi; Ni, Tao; Song, Linqi et al.
In: Intelligent and Converged Networks, Vol. 4, No. 1, 03.2023, p. 41-49.