Abstract
This paper addresses a significant but overlooked problem in American Sign Language (ASL) translation systems. Current designs collect excessive sensing data for each word and treat every sentence as new, requiring sensing data to be collected from scratch. This approach is time-consuming, taking hours to half a day of data collection per user; it places an unnecessary burden on end-users and hinders the widespread adoption of ASL systems. In this study, we identify the root cause of this issue and propose GASLA, a wearable sensor-based solution that automatically generates sentence-level sensing data from word-level data. An acceleration approach is further proposed to speed up data generation. Moreover, to narrow the gap between generated sentence data and directly collected sentence data, a template strategy is proposed that makes the generated sentences more closely resemble directly collected ones. The generated data can train ASL systems effectively while significantly reducing data-collection overhead. GASLA offers several benefits over current approaches: it reduces both initial setup time and the overhead of adding new sentences later; it requires only two samples per sentence, compared with around ten in current systems; and it significantly improves overall performance. © 2024 IEEE.
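The core idea, synthesizing sentence-level sensing data by stitching together word-level segments, can be illustrated with a short sketch. The snippet below is a hypothetical illustration, not the paper's actual pipeline: it assumes word-level wearable-sensor recordings stored as [frames × channels] arrays and uses a simple linear cross-fade at word boundaries; the function name `stitch_words`, the `blend` length, and the cross-fade itself are illustrative assumptions, and GASLA's acceleration and template strategies are not reproduced here.

```python
# Hypothetical sketch of word-to-sentence sensing-data synthesis in the
# spirit of GASLA; the linear cross-fade below stands in for the paper's
# actual generation, acceleration, and template strategies.
import numpy as np

def stitch_words(word_segments, blend=10):
    """Concatenate word-level sensor segments (each shaped [frames, channels])
    into one sentence-level sample, cross-fading `blend` frames at every
    word boundary so inter-word transitions look continuous."""
    sentence = word_segments[0]
    for seg in word_segments[1:]:
        w = np.linspace(0.0, 1.0, blend)[:, None]             # fade-in weights
        overlap = (1.0 - w) * sentence[-blend:] + w * seg[:blend]
        sentence = np.concatenate([sentence[:-blend], overlap, seg[blend:]])
    return sentence

# Example: three "words" of 8-channel wearable-sensor data.
words = [np.random.randn(120, 8), np.random.randn(90, 8), np.random.randn(150, 8)]
sample = stitch_words(words)
print(sample.shape)  # (340, 8): 360 total frames minus 10 per blended boundary
```

A sketch like this makes the paper's cost argument concrete: once per-word segments exist, new sentences are composed in milliseconds rather than re-recorded, which is why only a couple of real samples per sentence are needed.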
Original language | English |
---|---|
Pages (from-to) | 8634-8648 |
Journal | IEEE Transactions on Mobile Computing |
Volume | 23 |
Issue number | 9 |
Online published | 5 Jan 2024 |
DOIs | https://doi.org/10.1109/TMC.2024.3350111 |
Publication status | Published - Sept 2024 |
Funding
This work is supported by GRF grants from the Research Grants Council of Hong Kong (Project Nos. CityU 11217420, CityU 11213622, CityU 21201420, and CityU 11201422) and by the CityU SRG-Fd (Project No. 7005658).
Research Keywords
- Mobile computing
- wearable sensing
- sign language translation
Publisher's Copyright Statement
- COPYRIGHT TERMS OF DEPOSITED POSTPRINT FILE: © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. Li, J., Xu, J., Liu, Y., Xu, W., & Li, Z. (2024). Enhancing the Applicability of Sign Language Translation. IEEE Transactions on Mobile Computing. Advance online publication. https://doi.org/10.1109/TMC.2024.3350111.
Projects per year
- GRF: Pushing the Boundaries of Wearable Sensing: A Tale of Two Modalities
  XU, W. (Principal Investigator / Project Coordinator) & Ma, D. (Co-Investigator)
  1/01/23 → …
  Project: Research
- GRF: A Trinity Platform with Synthesized mmWaves for Human Motion Sensing
  LI, Z. (Principal Investigator / Project Coordinator) & Zhang, J. (Co-Investigator)
  1/09/22 → …
  Project: Research
- GRF: Tracking 3D Hand Pose Using Wearable Armband
  LI, Z. (Principal Investigator / Project Coordinator)
  1/01/21 → 24/03/25
  Project: Research