TY - CONF
T1 - FinTransformer
T2 - 4th International Conference on Artificial Intelligence, Internet and Digital Economy (ICAID 2025)
AU - Yan, Yuxuan
AU - Bu, William
AU - Yao, Zhongyu
PY - 2025
Y1 - 2025
N2 - Fund return prediction is critical for investment decisions, but traditional methods struggle to capture the complex nonlinear relationships and long-term dependencies in financial data. In this paper, we propose FinTransformer, a deep learning model designed for fund return forecasting. Although the Transformer performs well in sequence modelling, its direct application to financial forecasting faces three major challenges: an inability to distinguish the importance of near- and far-term data, a lack of feature-type differentiation, and insufficient quantification of forecast uncertainty. To address these challenges, we design three key innovations: (1) a finance-specific attention mechanism, which enables the model to adaptively adjust the importance of data at different time points through a temporal bias; (2) a feature fusion layer, which intelligently integrates features of different types and abstraction levels; and (3) a multi-task learning framework, which simultaneously forecasts returns, market states, and uncertainty estimates. Experiments on a real dataset containing 26,093 funds show that FinTransformer significantly outperforms existing methods, with a 15.3% reduction in RMSE and a 12.0% improvement in R². Attention-weight analysis reveals the contribution of different features to the predictions and enhances the model's interpretability. This study not only advances the application of deep learning in financial forecasting but also provides a more accurate tool for investment decision-making. © 2025 IEEE.
KW - deep learning
KW - feature fusion
KW - financial time series
KW - forecast uncertainty
KW - fund return forecasting
KW - interpretability
KW - investment decision support
KW - multi-task learning
KW - time-biased attention mechanism
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=105011095835&partnerID=8YFLogxK
U2 - 10.1109/ICAID65275.2025.11034383
DO - 10.1109/ICAID65275.2025.11034383
M3 - RGC 32 - Refereed conference paper (with host publication)
SN - 979-8-3315-1067-1
T3 - International Conference on Artificial Intelligence, Internet and Digital Economy, ICAID
SP - 302
EP - 306
BT - 2025 4th International Conference on Artificial Intelligence, Internet and Digital Economy (ICAID)
PB - IEEE
Y2 - 25 April 2025 through 27 April 2025
ER -