Stereo Vision Meta-Lens-Assisted Driving Vision

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

23 Scopus Citations

Author(s)

  • Xiaoyuan Liu
  • Wuyang Li
  • Takeshi Yamaguchi
  • Zihan Geng
  • Takuo Tanaka

Detail(s)

Original language: English
Pages (from-to): 2546–2555
Number of pages: 10
Journal / Publication: ACS Photonics
Volume: 11
Issue number: 7
Online published: 8 Mar 2024
Publication status: Published - 17 Jul 2024

Abstract

Object detection and depth perception are key foundations of object tracking and machine navigation, facilitating a thorough perception and understanding of the surrounding environment. Autonomous vehicles currently rely on complex, bulky systems with high cost and energy consumption to achieve the demanding multimodal vision required. Compact and reliable technology is therefore needed to improve the cost-effectiveness and efficiency of autonomous driving systems. The meta-lens, a novel flat optical device built from an artificial nanoantenna array, manipulates the properties of light; it is lightweight, ultrathin, and easy to integrate, making it suitable for a wide range of applications. We developed a stereo vision meta-lens imaging system for assisted driving vision that provides comprehensive perception, including imaging, object detection, instance segmentation, and depth information. The compact system comprises a band-pass filter, a stereo vision meta-lens, and a complementary metal oxide semiconductor (CMOS) sensor. In comparison to traditional two-camera stereo vision systems, the meta-lens stereo vision imaging system eliminates the need for distortion correction or camera calibration. A tailored data-processing pipeline is proposed, combining an intensity and depth gradient cross-validation optimization mechanism with three deep learning modules for object detection, instance segmentation, and stereo matching. The final assisted driving vision provides multimodal perception by integrating the raw image, instance labels, bounding boxes, segmentation masks in depth pseudo-color, and depth information for each detected object. Our assisted driving vision based on a stereo meta-lens system offers comprehensive perception for machine scene understanding, benefiting applications in human–computer interaction, machine navigation, autonomous driving, and augmented reality. © 2024 American Chemical Society.
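To illustrate the per-object depth step that the pipeline described above produces, the following is a minimal sketch, not the authors' implementation. It assumes the stereo meta-lens forms left and right sub-images side by side on a single CMOS frame (grayscale, 8-bit), uses OpenCV's semi-global block matching as a stand-in for the paper's deep-learning stereo-matching module, and assumes instance masks come from a separate detector/segmenter. The focal length and baseline values are placeholders, not values from the paper.

```python
# Hedged sketch: per-object depth from a single-sensor stereo frame.
import cv2
import numpy as np

FOCAL_PX = 1.2e3   # assumed effective focal length in pixels (placeholder)
BASELINE_M = 0.01  # assumed baseline between the two meta-lens apertures, in meters (placeholder)

def split_stereo_frame(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split one CMOS frame into left and right sub-images (assumed side-by-side layout)."""
    h, w = frame.shape[:2]
    return frame[:, : w // 2], frame[:, w // 2:]

def disparity_map(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Dense disparity via SGBM; stands in for the paper's learned stereo-matching module."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    # SGBM returns fixed-point disparities scaled by 16.
    return sgbm.compute(left, right).astype(np.float32) / 16.0

def per_object_depth(disp: np.ndarray, masks: list[np.ndarray]) -> list[float]:
    """Median depth (meters) inside each instance mask, using Z = f * B / d."""
    depths = []
    for mask in masks:
        d = disp[(mask > 0) & (disp > 0)]  # ignore invalid (non-positive) disparities
        depths.append(float(FOCAL_PX * BASELINE_M / np.median(d)) if d.size else float("nan"))
    return depths
```

In this sketch, the segmentation masks do double duty: they label each detected object and restrict the depth statistic to that object's pixels, which is how a per-object depth value can accompany each bounding box and mask in the final multimodal output.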

Research Area(s)

  • meta-lens, stereo vision, depth sensing, image segmentation, recognition

Citation Format(s)

Stereo Vision Meta-Lens-Assisted Driving Vision. / Liu, Xiaoyuan; Li, Wuyang; Yamaguchi, Takeshi et al.
In: ACS Photonics, Vol. 11, No. 7, 17.07.2024, p. 2546–2555.
