Embodied Interaction Techniques for Immersive Virtual Reality: From Lower Limb to Whole Body Experience in Virtual Reality


Student thesis: Doctoral Thesis


Award date: 6 Sept 2023


With the continuous advancement of sensor and computing technologies, embodied interaction (EI), that is, interaction between humans and digital systems involving the body and physical actions, has become an increasingly popular interaction paradigm. It has been widely applied in fields such as gaming, education, and healthcare. EI can enhance users' immersion and engagement in virtual environments, providing a more realistic interactive experience.

In virtual reality (VR), EI allows users to control virtual characters or objects through bodily movements and to perceive the corresponding feedback, making interaction feel more natural and enhancing the user experience. Current EI research in VR focuses mainly on the head and hands, while other body parts, such as the lower limbs, remain relatively underexplored. To address this gap, I first conducted two studies on lower limb-based EI in VR, covering input and output techniques respectively. I then proposed a haptic authoring toolkit to support a more comprehensive study of EI across different body parts in VR.

New techniques such as gesture recognition and artificial intelligence provide new methods and opportunities for embodied experiences in VR. The first work focuses on providing more natural EI through input techniques for moving and navigating in virtual scenes. Specifically, I present a series of studies investigating gesture-based Locomotion in Place (LIP) techniques for controlling virtual walking speed in VR, aiming to contribute an efficient and engaging LIP technique with a gesture-based speed-control mechanism. The first user study selected marching in place (MIP) as the gesture, based on user preference and its distinguishable motion features for gain control. Using the motion data recorded in that study, I experimented with data-driven techniques for gain classification and achieved an overall accuracy above 90% with a Support Vector Machine (SVM) classifier. The final user study demonstrated the effectiveness of MIP with gesture-based gain control for target reaching in VR: its accuracy was comparable to teleportation, and it received significantly higher user ratings on the VR experience.
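The abstract does not specify which motion features drive the speed gain; as a rough illustration, assuming step cadence derived from foot-contact timestamps is one such feature, the gain-control idea can be sketched with fixed bands (in the thesis, an SVM trained on recorded motion data replaces these hand-picked thresholds):

```python
# Hypothetical sketch of gesture-based gain control for marching in place (MIP).
# Feature choice (cadence) and band thresholds are illustrative assumptions;
# the thesis trains an SVM classifier on recorded motion data instead.

def step_cadence(contact_times):
    """Steps per second, from timestamps (s) of successive foot contacts."""
    if len(contact_times) < 2:
        return 0.0
    duration = contact_times[-1] - contact_times[0]
    return (len(contact_times) - 1) / duration

def classify_gain(cadence_hz):
    """Map marching cadence to a discrete speed-gain class (assumed bands)."""
    if cadence_hz < 1.2:
        return "low"      # slow virtual walking
    elif cadence_hz < 2.0:
        return "medium"
    return "high"         # fast virtual walking

contacts = [0.0, 0.55, 1.1, 1.65, 2.2]        # about 1.8 steps/s
print(classify_gain(step_cadence(contacts)))  # prints: medium
```

A learned classifier such as an SVM generalizes this idea to multiple features (e.g., knee-lift amplitude, stride timing variance) without manually chosen cutoffs.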

In terms of output techniques, although many studies have used haptic feedback to enhance EI, most have focused on the upper limbs, especially the hands, and few have examined the lower limbs. The second work therefore focuses on improving the user's embodied experience in virtual scenes through lower limb-based output techniques. I propose PropelWalker, a propeller-based lower-limb haptic device that simulates vertical force perception for walking in different virtual fluid materials. I first designed and built a PropelWalker prototype and conducted a pilot study to determine the optimal hardware setup. A technical evaluation then showed that the device can generate a vertical force of up to 27 N in two directions (i.e., up and down) within 0.85 seconds. I then conducted two user perception studies to understand PropelWalker's ability to generate discriminative force stimuli. First, a just-noticeable-difference (JND) experiment investigated the human perception threshold of leg airflow feedback. The second perception study showed that users could differentiate four force levels generated by PropelWalker to simulate different walking media (i.e., dry ground, water, dirt, and sand) with an average accuracy of 94.2%. Finally, a VR user experience study showed that PropelWalker significantly improves users' sense of presence in VR.
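The abstract does not detail the JND procedure; a common choice in psychophysics is an adaptive one-up-two-down staircase, which converges near the 70.7%-correct point. The sketch below runs such a staircase against a simulated observer whose detection threshold (5 N) and the step size are illustrative assumptions, not values from the thesis:

```python
# One-up-two-down adaptive staircase, a standard way to estimate a JND.
# The simulated observer's 5 N threshold and the 1 N step are assumptions
# for illustration only; a real study uses human responses.

def simulated_observer(delta_n, threshold=5.0):
    """Deterministic stand-in: detects the force increment iff at/above threshold."""
    return delta_n >= threshold

def staircase(start=12.0, step=1.0, max_reversals=8):
    level, correct_streak, direction, reversals = start, 0, -1, []
    while len(reversals) < max_reversals:
        if simulated_observer(level):
            correct_streak += 1
            if correct_streak == 2:        # two correct in a row -> decrease
                correct_streak = 0
                if direction == +1:        # direction changed: record reversal
                    reversals.append(level)
                direction = -1
                level -= step
        else:                              # one wrong -> increase intensity
            correct_streak = 0
            if direction == -1:
                reversals.append(level)
            direction = +1
            level += step
    return sum(reversals) / len(reversals) # JND estimate: mean reversal level

print(round(staircase(), 2))  # prints: 4.5 (oscillates around the 5 N threshold)
```

Averaging the reversal levels yields the threshold estimate; with human observers the responses are noisy, so more reversals and a randomized interleaving of staircases are typically used.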

In addition, little research has examined EI across different body parts in VR, owing to limitations in technology, cost, and application scenarios, and it remains challenging for designers to prototype, program, and deploy soft haptic actuators for VR. To explore the potential and application of EI on various body parts more comprehensively, I further propose FlexVibe, a skin-based haptic toolkit that supports rapid haptic authoring and immediate playback in immersive environments, enabling rapid iteration of haptic interaction designs. The system integrates a software interface for rapid haptic prototyping with a set of flexible vibration haptic modules and a control board on a flexible circuit. The authoring tool allows users to quickly author the haptic parameters of soft hardware modules that can be attached to different body parts. To motivate the design of the FlexVibe toolkit, I first conducted a pilot user interview to understand the process and issues of VR haptic prototyping. I then developed a series of applications to demonstrate the functionality of the system, and finally conducted a user experiment to evaluate its usability. With this toolkit, researchers can better explore and understand EI between the human body and VR.
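FlexVibe's actual data model is not described in the abstract; as an assumption, an authored pattern for one flexible vibration module might be represented as timed keyframes of amplitude and frequency, roughly as follows (all field names are hypothetical):

```python
# Hypothetical representation of an authored haptic pattern for one flexible
# vibration module; field names are illustrative, not FlexVibe's actual format.
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    time_ms: int       # onset relative to pattern start
    amplitude: float   # 0.0-1.0 drive level
    freq_hz: int       # vibration frequency

@dataclass
class ModulePattern:
    body_part: str                          # e.g. "chest", "calf"
    keyframes: list = field(default_factory=list)

    def duration_ms(self):
        """Pattern length = onset of the last keyframe."""
        return max(k.time_ms for k in self.keyframes) if self.keyframes else 0

heartbeat = ModulePattern("chest", [
    Keyframe(0, 0.8, 200),     # strong pulse
    Keyframe(120, 0.3, 150),   # softer echo
    Keyframe(800, 0.8, 200),   # next beat
])
print(heartbeat.duration_ms())  # prints: 800
```

A structure like this makes authored parameters easy to serialize, edit in a GUI, and stream to the control board for immediate playback, which is the iteration loop the toolkit aims to shorten.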

In summary, I first addressed the research gap in lower limb-based EI in VR through two studies, and then explored EI on different body parts more broadly by proposing a haptic authoring toolkit. The results show that exploring EI for the lower limbs and the whole body in VR can bring various benefits, including enhancing the interactive experience, improving accuracy and efficiency, reducing system cost and complexity, and opening new opportunities for research and innovation.

    Research areas

  • Human-computer interaction, Virtual reality, Embodied interaction, Virtual locomotion, Haptic interface, Prototyping toolkit