Skin-interfaced multimodal sensing and tactile feedback system as enhanced human-machine interface for closed-loop drone control

Chunki Yiu (Co-first Author), Yiming Liu (Co-first Author), Wooyoung Park (Co-first Author), Jian Li, Xingcan Huang, Kuanming Yao, Yuyu Gao, Guangyao Zhao, Hongwei Chu, Jingkun Zhou, Dengfeng Li, Hu Li, Binbin Zhang, Lung Chow, Ya Huang, Qingsong Xu*, Xinge Yu*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review


Abstract

Unmanned aerial vehicles have undergone substantial development and market growth in recent years. While research has focused on improving control strategies for a better user experience, feedback systems, which are vital for operator awareness of the surroundings and flight status, remain underdeveloped. Current bulky manipulators also hinder accuracy and usability. Here, we present an enhanced human-machine interface based on skin-integrated multimodal sensing and feedback devices for closed-loop drone control. This system captures hand gestures for intuitive, rapid, and precise control. An integrated tactile actuator array translates the drone’s posture into two-dimensional tactile information, enhancing the operator’s perception of the flight situation. An integrated obstacle detection module and a neuromuscular electrical stimulation–based force feedback system enable collision avoidance and flight path correction. This closed-loop system combines intuitive controls and multimodal feedback to reduce training time and cognitive load while improving flight stability, environmental awareness, and perception of the drone’s posture. The use of stretchable electronics also addresses the wearability and bulkiness issues of traditional systems, advancing human-machine interface design. Copyright © 2025 The Authors, some rights reserved.
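As an illustration only (not drawn from the paper), the posture-to-tactile mapping described above could be sketched as discretizing roll and pitch angles onto a small actuator grid, with level flight at the center cell; the function name, grid size, and tilt range below are hypothetical assumptions:

```python
def posture_to_tactile(roll_deg, pitch_deg, grid=3, max_tilt=45.0):
    """Map drone roll/pitch onto a (row, col) actuator-grid index.

    Nose-up pitch activates the top row, rightward roll the right
    column; level flight maps to the center cell. A 3x3 grid and a
    +/-45 degree representable tilt range are illustrative choices.
    """
    def to_index(angle):
        # Clamp to the representable tilt range, then scale to [0, grid-1].
        a = max(-max_tilt, min(max_tilt, angle))
        return round((a + max_tilt) / (2 * max_tilt) * (grid - 1))

    row = to_index(-pitch_deg)   # nose-up tilt -> smaller row (top)
    col = to_index(roll_deg)     # rightward roll -> larger column
    return row, col

print(posture_to_tactile(0, 0))    # level flight -> center cell (1, 1)
print(posture_to_tactile(45, 0))   # full right roll -> (1, 2)
print(posture_to_tactile(0, 45))   # full nose-up -> (0, 1)
```

A real implementation would instead drive actuator intensities from a fused attitude estimate, but the sketch shows the basic idea of encoding two-dimensional posture as a spatial tactile cue.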
Original language: English
Article number: eadt6041
Journal: Science Advances
Volume: 11
Issue number: 13
Online published: 26 Mar 2025
DOIs
Publication status: Published - Mar 2025

Funding

This work was supported by the National Key R&D Program of China (grant no. 2024YFB4707503 to X.Y.), the Innovation and Technology Fund of Innovation and Technology Commission (grant no. ITS/119/22 to X.Y.), the Shenzhen Science and Technology Innovation Commission (grant no. SGDX20220530111401011 to X.Y.), the Research Grants Council of the Hong Kong Special Administrative Region (grant nos. 11211523, 11213721, and 11215722 to X.Y.), the National Natural Science Foundation of China (grant no. 62122002 to X.Y.), the City University of Hong Kong (grant nos. 9667221, 9667246, 9680322, and 9667199 to X.Y.), the InnoHK Project on Project 2.2 (AI-based 3D ultrasound imaging algorithm) at Hong Kong Centre for Cerebro-Cardiovascular Health Engineering (COCHE) (to X.Y.), and the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (grant no. RS-2024-00411904 to X.Y.).

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Publisher's Copyright Statement

  • This full text is made available under CC-BY 4.0. https://creativecommons.org/licenses/by/4.0/

RGC Funding Information

  • RGC-funded

