Large-scale tactile sensing towards human-machine interactions

Student thesis: Doctoral Thesis

Abstract

The tactile interface is one of the most critical elements in intelligent human-machine interactions, allowing us to acquire large-scale tactile information from our surroundings and to provide feedback to machines. As one of the most direct physical extensions of our consciousness, our hands serve as the primary source of nuanced tactile sensations and function as intelligent tools for crafting and using objects in the physical realm. In addition, hand-centric interaction devices, such as joysticks, mice, keyboards, and touchpads, are a major means of bridging the gap between human beings and the virtual world. Regrettably, despite the critical role of the hand, decoding the strength and distribution of the forces generated by hands remains challenging, which has significantly impeded progress in various fields, such as precise medical treatment, efficient sports training, virtual reality (VR) manipulation, robotics, and more.

Recent advances in soft tactile skin provide opportunities for capturing force distribution. However, electrical wire array–based methods face reliability issues and cross-talk problems. Vision-based methods offer higher robustness by eliminating messy wires, but they suffer accuracy issues under contacts spread over multiple large areas and have a limited sensing range, often hindered by easily occluded markers and/or requiring massive datasets. Moreover, the information obtained from almost all these soft tactile sensors inherently consists of analog signals coupled from several unknown loading sources, which makes force decoding very complicated, especially for forces applied at multiple points over a large area, such as hand grip forces.

To overcome the large-scale contact problem and enhance human-machine interactions, this thesis proposes a digital channel–enabled hand force sensing and processing strategy and develops a phygital tactile sensing system (PhyTac).

Firstly, to tackle the marker occlusion problem in 3D space, this study arranges the receptors in a 3D spiral distribution, mimicking the leaf structure of Aloe polyphylla. By incorporating color distinction (two spirals colored blue and green, respectively), we achieve a remarkable marker density of up to 1.63/cm², the highest reported to date among vision-based sensors with parallel-type sensing surfaces. Furthermore, even at such a high density, we successfully avoid marker fusion at different depths during motion, achieving high-accuracy tracking with a root-mean-square error (RMSE) of only 0.18 mm.
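The dual-spiral layout described above can be sketched numerically. The snippet below is a minimal illustration, not the thesis's actual design: it places two offset phyllotactic (golden-angle) spirals of markers on a hypothetical cylindrical surface, with the marker count, radius, and height chosen arbitrarily for demonstration.

```python
import math

GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))  # ~137.5 deg, as in plant phyllotaxis

def spiral_markers(n, radius=20.0, height=30.0, phase=0.0):
    """Place n markers along a phyllotactic spiral on a cylinder.

    radius/height are in mm (illustrative values only); 'phase' offsets
    the second, differently colored spiral so markers interleave in depth.
    """
    pts = []
    for k in range(n):
        theta = k * GOLDEN_ANGLE + phase
        z = height * k / n                 # spread markers along the depth axis
        pts.append((radius * math.cos(theta), radius * math.sin(theta), z))
    return pts

blue = spiral_markers(60)                    # first spiral (blue markers)
green = spiral_markers(60, phase=math.pi)    # second spiral, offset half a turn
```

Offsetting the two spirals in angle and distinguishing them by color means that neighboring markers seen by the camera rarely belong to the same spiral, which is one plausible way such a layout avoids marker fusion at different depths.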

Secondly, inspired by the decoding principles of touch neuronal systems, this study deploys soft bowl-shaped threshold switches on the outer shell of PhyTac to output “on” and “off” digital signals, which not only encode threshold forces but also help localize the loading points. When a multiple-point load is applied to the PhyTac, only the switches at the loading points are activated, even though numerous receptors are deformed. The receptors corresponding to the activated switches are then selected as the key nodes of interest (KOI) from the coupled mass of signals, yielding physically meaningful, high-quality data.
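The KOI selection step can be sketched as a simple filter: only receptor signals whose covering switch is "on" are kept for decoding. This is an illustrative sketch with hypothetical identifiers and a dictionary-based switch-to-receptor mapping, not the system's actual data structures.

```python
def select_koi(switch_states, switch_to_receptors, receptor_signals):
    """Keep only receptor signals whose governing threshold switch is 'on'.

    switch_states: dict switch_id -> bool (threshold force exceeded?)
    switch_to_receptors: dict switch_id -> list of receptor ids under it
    receptor_signals: dict receptor_id -> measured marker displacement
    """
    koi = {}
    for switch_id, is_on in switch_states.items():
        if not is_on:
            continue  # receptors may still deform, but no switch event -> ignore
        for receptor_id in switch_to_receptors[switch_id]:
            koi[receptor_id] = receptor_signals[receptor_id]
    return koi

# Two-point scenario: switch s1 is pressed past its threshold, s2 is not.
koi = select_koi(
    switch_states={"s1": True, "s2": False},
    switch_to_receptors={"s1": ["r1", "r2"], "s2": ["r3"]},
    receptor_signals={"r1": 0.4, "r2": 0.1, "r3": 0.3},
)
# Only r1 and r2 survive; r3's deformation is discarded as a coupled artifact.
```

The digital on/off channel thus acts as a gate that converts a coupled analog field into a small, localized set of meaningful inputs.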

Thirdly, this study proposes an approach that integrates the mechanical model of the outer shell into neural network training, named the physical model-enhanced neural network (FEM-NN), aiming to circumvent the need for massive datasets and to mitigate inaccuracies arising from arbitrary extrapolation. The results demonstrate the ability to reconstruct the distributed force map with high accuracy (0.11 N, 97.7% accuracy) over a large sensing range (0.5 to 25 N for a single point) using only a small dataset (45 KB), outperforming both the traditional physical model-based method and the convolutional neural network-based method.
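One common way to fold a physical model into network training, shown here purely as a hedged illustration of the general idea (the thesis's actual FEM-NN architecture and shell model are not reproduced), is to let the network learn only the residual between a simple mechanical prediction and the true force, so the physics carries most of the signal and the data requirement shrinks. The linear stiffness, synthetic data, and tiny one-hidden-layer network below are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_phys(d, k=5.0):
    """Toy stand-in for the shell's mechanical model: linear stiffness (hypothetical)."""
    return k * d

# Synthetic ground truth: the physical model plus a mild nonlinearity
# that the network must learn as a residual correction.
d = rng.uniform(0.1, 5.0, size=(200, 1))   # marker displacements (mm, synthetic)
f_true = f_phys(d) + 0.3 * d**2            # contact force (N, synthetic)

# One-hidden-layer network trained on the residual f_true - f_phys(d).
W1 = rng.normal(0, 0.1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)
lr = 1e-2
for _ in range(3000):
    h = np.tanh(d @ W1 + b1)
    err = (f_phys(d) + h @ W2 + b2) - f_true   # hybrid prediction vs truth
    gW2 = h.T @ err / len(d); gb2 = err.mean(0)
    gh = (err @ W2.T) * (1 - h**2)
    gW1 = d.T @ gh / len(d); gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

Because the network only corrects the physics rather than replacing it, the hybrid predictor stays anchored to the mechanical model outside the training data, which is the intuition behind avoiding arbitrary extrapolation with a small dataset.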

Finally, this study demonstrates that PhyTac's ability to construct spatial-temporal hand force maps enables versatile applications, including finger and palm force evaluation, dynamic hand monitoring during tennis play, VR manipulation, and human-robot interaction.

In conclusion, this thesis innovates in the design, fabrication, and algorithms of large-scale tactile systems, standing out from previous approaches in terms of small datasets, large sensing range, large sensing area, and high accuracy. This research not only enhances our understanding of hand-centric actions but also highlights the convergence of the physical and digital realms, paving the way for advancements in AI-based sensor technologies.
Date of Award: 4 Aug 2025
Original language: English
Awarding Institution
  • City University of Hong Kong
Supervisors: Jiachen ZHANG (Supervisor) & Yajing SHEN (Supervisor)

Keywords

  • tactile sensing
  • vision technology
  • spiral arrangement
  • neural network
  • physical model
  • virtual reality
  • medical applications
  • human-robot interactions
