Enhancing the Applicability of Sign Language Translation
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Li, Jiao; Xu, Jiakai; Liu, Yang et al.
Related Research Unit(s)
Detail(s)
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 8634-8648 |
| Journal / Publication | IEEE Transactions on Mobile Computing |
| Volume | 23 |
| Issue number | 9 |
| Online published | 5 Jan 2024 |
| Publication status | Published - Sept 2024 |
Link(s)
| Link | URL |
|---|---|
| Link to Scopus | https://www.scopus.com/record/display.uri?eid=2-s2.0-85182368208&origin=recordpage |
| Permanent Link | https://scholars.cityu.edu.hk/en/publications/publication(4bb626ab-6027-4331-9e44-f02c8011969e).html |
Abstract
This paper addresses a significant but overlooked problem in American Sign Language (ASL) translation systems. Current designs collect excessive sensing data for each word and treat every sentence as new, so sensing data must be collected from scratch for each sentence. This approach is time-consuming, taking hours to half a day to complete data collection for each user. As a result, it places an unnecessary burden on end-users and hinders the widespread adoption of ASL systems. In this study, we identify the root cause of this issue and propose GASLA, a wearable sensor-based solution that automatically generates sentence-level sensing data from word-level data. An acceleration approach is further proposed to optimize the data-generation speed. Moreover, to bridge the gap between generated and directly collected sentence data, a template strategy is proposed to make the generated sentences more similar to the collected ones. The generated data can be used to train ASL systems effectively while significantly reducing overhead. GASLA offers several benefits over current approaches: it reduces the initial setup time and the overhead of adding new sentences later; it requires only two samples per sentence, compared with around ten in current systems; and it improves overall performance significantly. © 2024 IEEE.
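To make the core idea concrete, the following is a minimal, hypothetical sketch of how sentence-level sensing data could be assembled from word-level recordings. It is not the paper's GASLA pipeline: the function name `stitch_words`, the linear cross-fade at word boundaries, and the toy IMU-shaped data are all assumptions introduced here for illustration only.

```python
import numpy as np

def stitch_words(word_segments, overlap=10):
    """Concatenate word-level sensor segments (each a [time x channels] array)
    into one sentence-level sequence, cross-fading over `overlap` samples at
    each word boundary to approximate natural inter-word transitions."""
    sentence = word_segments[0].astype(float)
    for seg in word_segments[1:]:
        seg = seg.astype(float)
        n = min(overlap, len(sentence), len(seg))
        if n > 0:
            # Linear cross-fade: ramp the previous word out and the next word in.
            w = np.linspace(0.0, 1.0, n)[:, None]
            blended = (1.0 - w) * sentence[-n:] + w * seg[:n]
            sentence = np.concatenate([sentence[:-n], blended, seg[n:]])
        else:
            sentence = np.concatenate([sentence, seg])
    return sentence

# Example: three hypothetical word-level recordings (time x 6 IMU channels).
words = [np.random.randn(120, 6), np.random.randn(90, 6), np.random.randn(150, 6)]
synthetic_sentence = stitch_words(words, overlap=15)
print(synthetic_sentence.shape)  # summed lengths minus the blended overlaps
```

The sketch only illustrates why word-level data can be reused across sentences; the paper's acceleration approach and template strategy for closing the gap to directly collected sentence data are not represented here.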
Research Area(s)
- Mobile computing, wearable sensing, sign language translation
Citation Format(s)
Enhancing the Applicability of Sign Language Translation. / Li, Jiao; Xu, Jiakai; Liu, Yang et al.
In: IEEE Transactions on Mobile Computing, Vol. 23, No. 9, 09.2024, p. 8634-8648.
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review