
Texture Affinity Cue-Aware Relationship Representation via Transformers for Facial Expression Recognition in Affective Robots

Hai Liu (Co-first Author), Zhibing Liu (Co-first Author), Feifei Li, Tingting Liu*, Zhaoli Zhang, Neal N. Xiong, You-Fu Li*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

Automatic facial expression recognition (FER) from facial videos is a key component in enabling machines to understand human emotional states, which is crucial for affective robots designed to serve as interactive companions and to be deployed in smart healthcare. However, FER is susceptible to challenges such as occlusion, arbitrary head orientations, and illumination variation, making it difficult to deploy precise FER models on robots. To address these issues, we propose a texture affinity cue-aware relationship representation method (FTATrans), which learns to associate facial texture with facial expressions in videos. The research reveals two key findings: 1) the interaction of facial textures, and 2) texture affinity effects. On this basis, FTATrans consists of two key networks: semantic-information feature generation (SFG) and texture-affinity relationship mining (TAR). In particular, the semantic relationships between different facial regions are learned through SFG, while TAR captures the texture affinity relationships and integrates them with the overall facial expression information. Additionally, a loss function focused on expression-specific texture variations is proposed to guide the model toward learning discriminative expression information. Experiments conducted on five video-based FER datasets demonstrate that the FTATrans model achieves state-of-the-art performance. © 2026 IEEE.
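The abstract describes a two-branch design in which texture-affinity features (TAR) are integrated with region-level semantic features (SFG). The paper's actual fusion mechanism is not specified here; as a purely illustrative sketch, one common transformer-style way to integrate two token streams is cross-attention, where semantic-region tokens attend over texture tokens. All names and shapes below are hypothetical assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention_fuse(semantic, texture):
    """Fuse texture-branch tokens into semantic-branch tokens.

    semantic: (N, d) region tokens used as queries (hypothetical SFG output)
    texture:  (M, d) texture tokens used as keys/values (hypothetical TAR output)
    Returns (N, d) fused tokens with a residual connection.
    """
    d = semantic.shape[-1]
    affinity = semantic @ texture.T / np.sqrt(d)   # (N, M) scaled dot-product affinity
    weights = softmax(affinity, axis=-1)           # each row sums to 1
    return semantic + weights @ texture            # residual fusion

rng = np.random.default_rng(0)
sem = rng.standard_normal((4, 8))   # e.g. 4 facial-region tokens, dim 8
tex = rng.standard_normal((6, 8))   # e.g. 6 texture-patch tokens, dim 8
fused = cross_attention_fuse(sem, tex)
print(fused.shape)  # (4, 8)
```

The residual term keeps the semantic representation intact while the attention term injects texture-affinity context, mirroring the abstract's idea of integrating texture relationships with overall expression information.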
Original language: English
Number of pages: 12
Journal: IEEE Transactions on Industrial Informatics
Online published: 20 Feb 2026
Publication status: Online published - 20 Feb 2026

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant 62577020, Grant 62573369, Grant 62477024, Grant 62377037, and Grant 62277041, in part by the Jiangxi Provincial Natural Science Foundation under Grant 20252BAC220007, Grant 20252BAC240201, Grant 20242BAB2S107, and Grant 20232BAB212026, in part by the National Natural Science Foundation of Hubei Province under Grant 2025AFD621, in part by the Shenzhen Science and Technology Program under Grant JCYJ20250604185710014 and Grant JCYJ20230807152900001, and in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2025A1515010266.

Research Keywords

  • Affective robot
  • facial affinity field
  • facial expression recognition (FER)
  • human–robot interaction
  • relationship-driven
