Abstract
Session-based recommender systems (SBRs) are critically impaired by exposure bias in observational training logs, which causes models to overfit to the logging policy rather than to true user preferences. This bias distorts offline evaluation and harms generalization, particularly for long-tail items. To address it, we propose the Propensity- and Temporal-consistency Enhanced Graph Transformer (PTE-GT), a principled framework that augments a recent interval-aware graph transformer backbone with two synergistic training-time modules. The GNN-based backbone is well suited to the graph-structured nature of session data, capturing intricate item transitions that purely sequential models can miss. First, we introduce a propensity-aware (PA) optimization objective based on the self-normalized inverse propensity scoring (SNIPS) estimator; it leverages logs containing randomized exposure or logged behavior-policy propensities to form an unbiased risk estimate, correcting for the biased data distribution. Second, we design a lightweight, view-free temporal-consistency (TC) contrastive regularizer that aligns session prefixes with suffixes, improving representation robustness without the computationally expensive graph augmentations that are often a bottleneck for graph-based contrastive methods. We conduct comprehensive evaluations on three public session-based benchmarks: KuaiRand, the OTTO e-commerce challenge dataset (OTTO), and the YOOCHOOSE-1/64 split (YOOCHOOSE); we additionally evaluate on the publicly available Open Bandit Dataset (OBD), which contains logged bandit propensities. Our results demonstrate that PTE-GT significantly outperforms strong baselines. Critically, on datasets with randomized exposure or logged propensities, our unbiased evaluation protocol with SNIPS-weighted metrics reveals a substantial performance gap that standard, biased metrics mask.
Our method also shows marked improvements in model calibration and long-tail item recommendation. © 2025 by the authors.
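The two training-time modules described in the abstract can be sketched minimally as follows. This is an illustrative sketch under our own assumptions (plain NumPy, cosine-similarity InfoNCE with a hypothetical temperature `tau=0.1`), not the paper's actual implementation; function names and signatures are invented for illustration.

```python
# Illustrative sketch only -- not the paper's implementation.
import numpy as np

def snips_loss(per_example_loss, propensities):
    """Self-normalized IPS (SNIPS): weight each example's loss by the
    inverse of its logged exposure propensity, then normalize by the sum
    of weights (reduces variance versus plain IPS)."""
    w = 1.0 / np.asarray(propensities, dtype=float)
    return float(np.sum(w * np.asarray(per_example_loss)) / np.sum(w))

def temporal_consistency_loss(prefix_emb, suffix_emb, tau=0.1):
    """View-free temporal-consistency regularizer sketched as InfoNCE:
    each session's prefix embedding is pulled toward its own suffix
    embedding and pushed from other sessions in the batch -- no graph
    augmentation is needed to form the two views."""
    p = prefix_emb / np.linalg.norm(prefix_emb, axis=1, keepdims=True)
    s = suffix_emb / np.linalg.norm(suffix_emb, axis=1, keepdims=True)
    logits = p @ s.T / tau                          # (B, B) similarities
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))      # positives on diagonal
```

With uniform propensities, `snips_loss` reduces to the ordinary mean loss; non-uniform propensities up-weight under-exposed (e.g. long-tail) items, which is the debiasing effect the abstract describes.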
| Original language | English |
|---|---|
| Article number | 84 |
| Number of pages | 26 |
| Journal | Electronics |
| Volume | 15 |
| Issue number | 1 |
| Online published | 24 Dec 2025 |
| DOIs | |
| Publication status | Published - Jan 2026 |
Funding
This research received no external funding.
Research Keywords
- graph transformer
- session-based recommendation
- debiasing
- GNNs
- inverse propensity scoring
- contrastive learning
- randomized exposure
- digital economy
Publisher's Copyright Statement
- This full text is made available under CC-BY 4.0. https://creativecommons.org/licenses/by/4.0/
Fingerprint
Dive into the research topics of 'Debiasing Session-Based Recommendation for the Digital Economy: Propensity-Aware Training and Temporal Contrast on Graph Transformers'. Together they form a unique fingerprint.