Event-Based 3D Human Pose Estimation and Tracking

Project: Research


Description

Estimating and tracking 3D human poses from visual sensors has a wide range of applications, including action recognition, sports analysis, video surveillance, and human-robot interaction. Despite advances in pose estimation and tracking, challenging scenes such as crowded backgrounds, poor illumination, and fast motion remain difficult.

In recent years, event cameras, a type of bio-inspired visual sensor, have attracted much attention from academia and industry. Compared with conventional cameras, event cameras offer very high temporal resolution, high dynamic range, low latency, and low power consumption. In addition, the redundancy reduction and data sparsity of event cameras lighten both computation and memory while preserving the key information. Event cameras are therefore attractive candidates for motion-related tasks. In particular, they hold high potential for real-time 3D human pose estimation and tracking under uncontrolled lighting conditions and in crowded backgrounds, while guaranteeing low latency and low power consumption. Since human pose tracking demands low-latency pose prediction, this motivates our study of event cameras for 3D human pose tracking in challenging scenes.

This project proposes a novel voxel-wise representation with a graph neural network to encode event signals and unlock the potential of event cameras, so that meaningful spatial-temporal features can be extracted for human pose estimation and tracking. For pose estimation, a temporally densely connected recurrent network will be proposed to model the temporal dependency and geometric consistency among a sequence of event streams at different time steps, helping to compensate for the information loss caused by the sparsity of event data.
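The project's graph-based voxel encoding is not specified here, but a common starting point for voxel-wise event representations is a spatio-temporal voxel grid in which each event's polarity is split between its two nearest temporal bins by linear interpolation. The sketch below illustrates that standard construction only; the function name and array layout are our own choices, not the project's.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate an event stream into a (num_bins, H, W) voxel grid.

    events: (N, 4) array of (x, y, t, polarity), polarity in {-1, +1}.
    Each event's polarity is distributed over the two nearest temporal
    bins by linear interpolation (a common event representation; the
    project's GNN-based voxel encoding may differ).
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return grid
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3]
    # Normalise timestamps to the continuous bin range [0, num_bins - 1].
    t0, t1 = t.min(), t.max()
    tn = (t - t0) / max(t1 - t0, 1e-9) * (num_bins - 1)
    left = np.floor(tn).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = tn - left          # fraction assigned to the later bin
    w_left = 1.0 - w_right       # fraction assigned to the earlier bin
    # Scatter-add polarities; np.add.at handles repeated pixel indices.
    np.add.at(grid, (left, y, x), p * w_left)
    np.add.at(grid, (right, y, x), p * w_right)
    return grid

# Example: three events over an 8x8 sensor, binned into 4 time slices.
events = np.array([[2.0, 3.0, 0.0, 1.0],
                   [5.0, 1.0, 0.5, -1.0],
                   [7.0, 4.0, 1.0, 1.0]])
grid = events_to_voxel_grid(events, num_bins=4, height=8, width=8)
```

Because the two interpolation weights always sum to one, the total polarity mass in the grid equals the sum of event polarities, so no signal is lost by the binning itself.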
Unlike most existing work, which treats pose estimation and tracking as two separate tasks, a novel method will be proposed to couple them in a unified framework. The two tasks can then benefit from each other: pose estimation initializes the tracking targets, while tracking results largely reduce the state space of pose estimation. In this project, a robust and fast event-camera-based 3D human pose estimation and tracking system will be built that can operate in uncontrolled environments, such as outdoor scenes and poor lighting conditions.
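The coupling described above can be illustrated with a deliberately simple loop: when no track exists, a full-frame estimate initializes the target; afterwards, a constant-velocity prediction restricts the estimator to a small search window, and each fresh estimate updates the track. This is an illustrative sketch under our own assumptions (the `estimate_fn` interface, the constant-velocity model, and the window size are all hypothetical), not the project's actual method.

```python
import numpy as np

class CoupledPoseTracker:
    """Toy coupling of pose estimation and tracking.

    estimate_fn(frame, center, window) -> (J, 3) pose array; when
    center/window are None it searches the whole frame (initialization),
    otherwise only a window around the predicted pose (reduced state space).
    """

    def __init__(self, estimate_fn, window=32):
        self.estimate_fn = estimate_fn  # hypothetical estimator interface
        self.window = window
        self.pose = None       # last (J, 3) joint positions
        self.velocity = None   # per-joint constant-velocity state

    def step(self, frame):
        if self.pose is None:
            # No track yet: estimation initializes the tracking target.
            pose = self.estimate_fn(frame, center=None, window=None)
            self.velocity = np.zeros_like(pose)
        else:
            # Tracking prediction narrows the estimator's search region.
            predicted = self.pose + self.velocity
            pose = self.estimate_fn(frame, center=predicted,
                                    window=self.window)
            self.velocity = pose - self.pose
        self.pose = pose
        return pose

# Example with a dummy estimator whose "true" pose drifts one unit per frame.
true_pose = lambda k: np.full((17, 3), float(k))
tracker = CoupledPoseTracker(lambda frame, center, window: true_pose(frame))
for k in range(3):
    tracker.step(k)
```

In a real system the estimator would of course be a learned network over the event representation, but the control flow (estimate to initialize, predict to restrict, update to continue) is the part the unified framework exploits.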

Detail(s)

Project number: 9043323
Grant type: GRF
Status: Not started
Effective start/end date: 1/01/23 → …