Enhancing the Applicability of Sign Language Translation

Jiao Li, Jiakai Xu, Yang Liu, Weitao Xu, Zhenjiang Li*

*Corresponding author for this work

Research output: Journal Publications and Reviews; RGC 21 - Publication in refereed journal; peer-reviewed


Abstract

This paper addresses a significant but overlooked problem in American Sign Language (ASL) translation systems. Current designs collect excessive sensing data for each word and treat every sentence as new, requiring sensing data to be collected from scratch. This approach is time-consuming, taking hours to half a day of data collection per user. As a result, it places an unnecessary burden on end-users and hinders the widespread adoption of ASL systems. In this study, we identify the root cause of this issue and propose GASLA, a wearable sensor-based solution that automatically generates sentence-level sensing data from word-level data. An acceleration approach is further proposed to speed up data generation. Moreover, because a gap remains between the generated sentence data and directly collected sentence data, a template strategy is proposed to make the generated sentences more similar to collected ones. The generated data can be used to train ASL systems effectively while significantly reducing overhead. GASLA offers several benefits over current approaches: it reduces the initial setup time and the overhead of adding new sentences later; it requires only two samples per sentence compared to around ten in current systems; and it improves overall performance significantly. © 2024 IEEE.
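The core idea described in the abstract (composing sentence-level sensing data from previously collected word-level samples) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the NumPy array representation, the linear-interpolation transitions, and the names stitch_words and transition_len are not taken from the paper, and the sketch does not model GASLA's acceleration approach or template strategy.

    import numpy as np

    def stitch_words(word_samples, transition_len=10):
        """Concatenate word-level sensor sequences into one sentence-level
        sequence, bridging consecutive words with a short linearly
        interpolated transition (a rough stand-in for smoothing the gap
        between generated and directly collected sentence data).

        word_samples   : list of np.ndarray, each of shape (time, channels)
        transition_len : number of synthetic frames inserted between words
        """
        segments = [word_samples[0]]
        for prev, nxt in zip(word_samples, word_samples[1:]):
            # Interpolate from the last frame of the previous word to the
            # first frame of the next word, mimicking the hand moving
            # between signs.
            alphas = np.linspace(0.0, 1.0, transition_len + 2)[1:-1, None]
            segments.append((1 - alphas) * prev[-1] + alphas * nxt[0])
            segments.append(nxt)
        return np.concatenate(segments, axis=0)

    # Example: a three-word sentence, each word a (time, 6)-channel IMU recording.
    rng = np.random.default_rng(0)
    words = [rng.standard_normal((40, 6)) for _ in range(3)]
    sentence = stitch_words(words)
    print(sentence.shape)  # (140, 6): 3 words of 40 frames plus 2 transitions of 10

In the actual system, the template strategy further adjusts the generated sentences toward directly collected ones; the simple interpolation above only stands in for that step.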
Original language: English
Pages (from-to): 8634-8648
Journal: IEEE Transactions on Mobile Computing
Volume: 23
Issue number: 9
Online published: 5 Jan 2024
DOI: 10.1109/TMC.2024.3350111
Publication status: Published - Sept 2024

Funding

This work is supported by GRF grants from the Research Grants Council of Hong Kong (Project Nos. CityU 11217420, CityU 11213622, CityU 21201420, and CityU 11201422) and the CityU SRG-Fd (Project No. 7005658).

Research Keywords

  • Mobile computing
  • wearable sensing
  • sign language translation

Publisher's Copyright Statement

  • COPYRIGHT TERMS OF DEPOSITED POSTPRINT FILE: © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. Li, J., Xu, J., Liu, Y., Xu, W., & Li, Z. (2024). Enhancing the Applicability of Sign Language Translation. IEEE Transactions on Mobile Computing. Advance online publication. https://doi.org/10.1109/TMC.2024.3350111.
