NoteWordy : Investigating Touch and Speech Input on Smartphones for Personal Data Capture

Research output: Journal Publications and Reviews > RGC 21 - Publication in refereed journal > peer-review

4 Scopus Citations

Author(s)

  • Yuhan LUO
  • Bongshin LEE
  • Young-Ho KIM
  • Eun Kyoung CHOE

Detail(s)

Original language: English
Article number: 581
Pages (from-to): 568-591
Journal / Publication: Proceedings of the ACM on Human-Computer Interaction
Volume: 6
Issue number: ISS
Online published: 14 Nov 2022
Publication status: Published - Dec 2022

Abstract

Speech, as a natural and low-burden input modality, has great potential to support personal data capture. However, little is known about how people use speech input, together with traditional touch input, to capture different types of data in self-tracking contexts. In this work, we designed and developed NoteWordy, a multimodal self-tracking application integrating touch and speech input, and deployed it in the context of productivity tracking for two weeks (N = 17). Our participants used the two input modalities differently, depending on the data type as well as personal preferences, error tolerance for speech recognition issues, and social surroundings. Additionally, we found that speech input reduced participants' diary entry time and enhanced the data richness of the free-form text. Drawing from these findings, we discuss opportunities for supporting efficient personal data capture with multimodal input and implications for improving the user experience with natural language input to capture various self-tracking data.

Research Area(s)

  • personal informatics, productivity, self-tracking, speech input, speech interface design

Citation Format(s)

NoteWordy: Investigating Touch and Speech Input on Smartphones for Personal Data Capture. / LUO, Yuhan; LEE, Bongshin; KIM, Young-Ho et al.
In: Proceedings of the ACM on Human-Computer Interaction, Vol. 6, No. ISS, 581, 12.2022, p. 568-591.
