AmplitudeArrow: On-the-Go AR Menu Selection Using Consecutive Simple Head Gestures and Amplitude Visualization

Research output: Journal Publications and Reviews · RGC 21 - Publication in refereed journal · peer-review


Author(s)

  • Yang Tian
  • Youpeng Zhang
  • Yukang Yan
  • Xiaojuan Ma
  • Yuanchun Shi

Detail(s)

Original language: English
Journal / Publication: IEEE Transactions on Visualization and Computer Graphics
Publication status: Online published - 20 Jan 2025

Abstract

Heads-up computing aims to provide synergistic digital assistance that minimally interferes with users' on-the-go daily activities. Currently, the input modalities of heads-up computing are mainly voice and finger gestures. In this work, we propose and evaluate the AmplitudeArrow (AA) technique for on-the-go AR menu selection, demonstrating that consecutive simple head gestures can also serve as an effective input modality for heads-up computing. Specifically, AA arranges menu icons into one or two rows. To select a target icon, the user first performs a head yaw to pre-select the target icon or the column containing it, and then performs a head pitch that expands the arrow inside the target icon until the arrow covers the icon completely, i.e., until the pitch amplitude surpasses the selection confirmation threshold. User studies showed that AA is robust to walking-induced head perturbation and to external factors such as other people and obstacles, delivering high accuracy (error rate < 5%) and fast selection (< 1.5 s per selection) when no more than six icon columns (twelve icons) were distributed horizontally and evenly in a menu area spanning a 43° horizontal visual angle.
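For illustration only, the sketch below outlines the two-stage selection logic described in the abstract: a yaw gesture pre-selects a column, and the subsequent pitch amplitude fills the arrow until it surpasses a confirmation threshold. It assumes per-frame head yaw/pitch angles in degrees and a single row of evenly spaced icons; the function names, state fields, and the 20° threshold are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass
from typing import Optional

# The 43° menu width and six-column limit come from the abstract;
# the confirmation threshold value below is an assumption.
MENU_VISUAL_ANGLE_DEG = 43.0
NUM_COLUMNS = 6
CONFIRM_THRESHOLD_DEG = 20.0  # assumed pitch amplitude needed to confirm a selection


@dataclass
class SelectionState:
    preselected_column: Optional[int] = None   # column chosen by the yaw gesture
    pitch_origin_deg: Optional[float] = None   # pitch angle when the pitch gesture began
    arrow_fill: float = 0.0                    # 0.0 = empty arrow, 1.0 = icon fully covered


def preselect_by_yaw(yaw_deg: float, state: SelectionState) -> None:
    """Map the current head yaw onto one of the evenly spaced icon columns."""
    half = MENU_VISUAL_ANGLE_DEG / 2.0
    if -half <= yaw_deg <= half:
        column_width = MENU_VISUAL_ANGLE_DEG / NUM_COLUMNS
        state.preselected_column = min(int((yaw_deg + half) // column_width),
                                       NUM_COLUMNS - 1)
    else:
        state.preselected_column = None  # head yaw points outside the menu area


def update_pitch(pitch_deg: float, state: SelectionState) -> Optional[int]:
    """Grow the arrow with pitch amplitude; return the column index once confirmed."""
    if state.preselected_column is None:
        state.pitch_origin_deg = None
        state.arrow_fill = 0.0
        return None
    if state.pitch_origin_deg is None:
        state.pitch_origin_deg = pitch_deg      # start measuring amplitude here
    amplitude = abs(pitch_deg - state.pitch_origin_deg)
    state.arrow_fill = min(amplitude / CONFIRM_THRESHOLD_DEG, 1.0)
    if state.arrow_fill >= 1.0:                 # arrow fully covers the icon
        return state.preselected_column
    return None
```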

Research Area(s)

  • Amplitude visualization, augmented reality, consecutive simple head gestures, heads-up computing, on-the-go menu selection
