Eyes-Free Interaction on Wearables beyond the Touch Screen


Student thesis: Doctoral Thesis

Award date: 29 Jan 2020

Abstract

A touch screen is an integral part of daily mobile interaction. Touch gestures, such as tapping, swiping, dragging, and holding, are natural, intuitive, and fast to perform on a touch screen. However, on-screen interaction usually requires a graphical interface to guide its use, and that interface occupies the input space and may distract users' attention. The mobility and small input space of smartphones and wearable devices create interaction challenges. For example, it is difficult to operate a smartwatch with one hand when the other hand is busy with another task. Compared with on-screen input, interaction beyond the touch screen enlarges the input vocabulary and avoids the screen occlusion problem. It also supports eyes-free input, which can benefit various scenarios, e.g., during a meeting or while talking with friends.

In this thesis, I explore the design and implementation of eyes-free input approaches beyond the typical touch screen on wearable devices, mainly smartwatches. The three contributions address input challenges on wearables: limited input controls, the difficulty of one-handed input, and inaccuracy on a small input space. I present three techniques: extending eyes-free input with bezel-initiated interaction, enabling same-side-hand eyes-free typing with thumb-to-finger touch, and enhancing eyes-free typing with micro thumb-tip gestures.

I start by exploring the eyes-free bezel-initiated swipe (BIS) on round smartwatches, leveraging the commonly used swiping gesture. BIS supports natural, intuitive, and fast input for applications ranging from smartwatches to VR without additional hardware. It expands the input space of smartwatches and improves input accuracy on small screens. Unlike existing techniques for rectangular smartwatches, eyes-free BIS on round smartwatches is largely unexplored. To improve BIS interaction, I analyze user behavior and performance across different bezel button sizes. Supervised machine learning is adopted for BIS detection to improve accuracy for small buttons. I study the performance of personal and general supervised machine learning models and discuss potential smartwatch applications.
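To illustrate the core geometric idea behind bezel-initiated swipes, the sketch below maps a swipe's starting point on a round bezel to one of several equal-sized bezel buttons by angle. This is a hypothetical baseline, not the thesis's method: the thesis trains supervised machine learning models on real swipe traces precisely because such a naive geometric rule degrades for small buttons. The function name, coordinate convention, and button count are illustrative assumptions.

```python
import math

def bezel_button(x, y, n_buttons=8):
    """Map the starting point of a bezel swipe (screen-centred
    coordinates, +x right, +y up) to one of n_buttons equal arcs
    around a round bezel. Illustrative geometric baseline only."""
    angle = math.atan2(y, x) % (2 * math.pi)   # 0 rad = 3 o'clock, CCW
    arc = 2 * math.pi / n_buttons              # angular width of a button
    return int(angle // arc)

# A swipe starting at the top of the bezel (12 o'clock), 8 buttons:
print(bezel_button(0.0, 1.0, 8))  # 2
```

A supervised model replaces the hard arc boundaries with boundaries learned from each user's (or all users') actual swipe start points, which is what allows accuracy to hold up as the buttons shrink.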

BIS increases the input possibilities, but it is difficult to use when the other hand is unavailable. To address the challenge of one-handed input on smartwatches, I present FingerT9, an eyes-free typing method that enables same-side-hand input with the watch-wearing hand. Typing demands high input accuracy and is hard to perform with one hand; in some mobile scenarios the user's other hand may be occupied, for example by carrying a bag, making two-handed interaction impossible. FingerT9 maps a T9 keyboard onto the finger segments and leverages thumb-to-finger touch, so users can type on the watch-wearing hand without looking at the input space on the finger segments. Two user studies showed that FingerT9 outperformed tilt-based input in typing speed, error rate, and efficiency. It also avoids the screen occlusion problem and enables typing using only the watch-wearing hand.
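The T9 mapping described above can be sketched as follows: each tapped finger segment contributes one T9 key, and the resulting key sequence is disambiguated against a dictionary. This is a minimal illustration of T9 decoding in general, not FingerT9's actual implementation; the segment-to-key assignment and the tiny word list are assumptions for the example.

```python
# Standard T9 key -> letter groups; in FingerT9 each key would sit on
# one of the nine finger segments of the watch-wearing hand.
T9 = {2: "abc", 3: "def", 4: "ghi", 5: "jkl",
      6: "mno", 7: "pqrs", 8: "tuv", 9: "wxyz"}

DICTIONARY = ["hello", "help", "watch", "hand"]  # illustrative lexicon

def word_to_keys(word):
    """Translate a word into its T9 key sequence."""
    return [k for ch in word for k, letters in T9.items() if ch in letters]

def decode(keys):
    """Return all dictionary words whose T9 key sequence matches."""
    return [w for w in DICTIONARY if word_to_keys(w) == list(keys)]

print(decode([4, 3, 5, 7]))        # ['help']
print(decode([4, 3, 5, 5, 6]))     # ['hello']
```

Because several letters share a key, the same tap sequence can match multiple words; a real deployment would rank candidates by frequency and let the user confirm.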

Typing on finger segments requires relatively large finger movements. To improve one-handed typing performance, I then investigate typing with micro-gestures on the fingertip. Considering users' spatial awareness of the input space, I present TipText, a new text entry technique using micro thumb-tip gestures. With TipText, a miniature QWERTY keyboard resides invisibly on the first segment of the user's index finger, and text is entered by tapping the index fingertip with the thumb tip. I conducted a series of user studies and computer-simulated typing tasks to explore over 1,146,484 possible keyboard layouts. I then optimized the keyboard layout for eyes-free input using a spatial model that reflects users' natural spatial awareness of key locations on the index finger. The final keyboard layout achieved an average typing speed of 11.9 WPM in a one-day evaluation.
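The role of a spatial model in this kind of decoder can be sketched with a toy example: each tap on the invisible keyboard is noisy, so the decoder weighs how close each tap landed to each key (a Gaussian spatial likelihood) against a word prior, and picks the most probable word. The key positions, noise parameter, and two-word lexicon below are illustrative assumptions, not TipText's actual model or layout.

```python
import math

# Hypothetical 2x2 key grid on the fingertip and a two-word lexicon.
KEYS = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (0.0, 1.0), "d": (1.0, 1.0)}
LEXICON = {"ad": 0.6, "bc": 0.4}   # word -> prior probability
SIGMA = 0.4                        # assumed tap-noise spread

def tap_likelihood(tap, key):
    """Unnormalised Gaussian likelihood of a tap given an intended key."""
    dx, dy = tap[0] - KEYS[key][0], tap[1] - KEYS[key][1]
    return math.exp(-(dx * dx + dy * dy) / (2 * SIGMA ** 2))

def decode(taps):
    """Pick the lexicon word maximising prior * spatial likelihood."""
    def score(word):
        s = LEXICON[word]
        for tap, ch in zip(taps, word):
            s *= tap_likelihood(tap, ch)
        return s
    return max(LEXICON, key=score)

# Two sloppy taps, near 'a' then near 'd', still decode to "ad":
print(decode([(0.2, 0.1), (0.9, 0.8)]))  # ad
```

Layout optimization then amounts to searching over candidate key arrangements for the one that maximises decoding accuracy under such a spatial model, which is how the thesis narrows the 1,146,484 candidates down to a final eyes-free layout.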