NoteWordy: Investigating Touch and Speech Input on Smartphones for Personal Data Capture
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
LUO, Yuhan; LEE, Bongshin; KIM, Young-Ho et al.
Detail(s)
| Original language | English |
| --- | --- |
| Article number | 581 |
| Pages (from-to) | 568-591 |
| Journal / Publication | Proceedings of the ACM on Human-Computer Interaction |
| Volume | 6 |
| Issue number | ISS |
| Online published | 14 Nov 2022 |
| Publication status | Published - Dec 2022 |
Abstract
Speech as a natural and low-burden input modality has great potential to support personal data capture. However, little is known about how people use speech input, together with traditional touch input, to capture different types of data in self-tracking contexts. In this work, we designed and developed NoteWordy, a multimodal self-tracking application integrating touch and speech input, and deployed it in the context of productivity tracking for two weeks (N = 17). Our participants used the two input modalities differently, depending on the data type as well as personal preferences, error tolerance for speech recognition issues, and social surroundings. Additionally, we found speech input reduced participants' diary entry time and enhanced the data richness of the free-form text. Drawing from the findings, we discuss opportunities for supporting efficient personal data capture with multimodal input and implications for improving the user experience with natural language input to capture various self-tracking data.
Research Area(s)
- personal informatics, productivity, self-tracking, speech input, speech interface design
Citation Format(s)
NoteWordy: Investigating Touch and Speech Input on Smartphones for Personal Data Capture. / LUO, Yuhan; LEE, Bongshin; KIM, Young-Ho et al.
In: Proceedings of the ACM on Human-Computer Interaction, Vol. 6, No. ISS, 581, 12.2022, p. 568-591.