Provable In-Context Vector Arithmetic via Retrieving Task Concepts

Dake Bu, Wei Huang, Andi Han, Atsushi Nitanda, Taiji Suzuki, Qingfu Zhang, Hau-San Wong*

*Corresponding author for this work

Research output: Conference Papers › Poster › peer-review

Abstract

In-context learning (ICL) has garnered significant attention for its ability to grasp functions and tasks from demonstrations. Recent empirical studies suggest the presence of a Task/Function Vector in the latent geometry of large language models (LLMs) during ICL. Merullo et al. (2024) showed that LLMs leverage this vector alongside the residual stream for Word2Vec-like vector arithmetic, solving factual-recall ICL tasks. Additionally, recent work empirically highlighted the key role of Question-Answer data in enhancing factual-recall capabilities. Despite these insights, a theoretical explanation remains elusive. As a step toward closing this gap, this work provides a theoretical framework built on empirically grounded hierarchical concept modeling. We develop an optimization theory showing how nonlinear residual transformers trained via gradient descent on cross-entropy loss perform factual-recall ICL tasks via task vector arithmetic. We prove 0-1 loss convergence and highlight their superior generalization capabilities, adeptly handling concept recombinations and data shifts. These findings underscore the advantages of transformers over static word embedding methods. Empirical simulations corroborate our theoretical insights.
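The Word2Vec-like task-vector arithmetic referenced in the abstract can be illustrated with a toy sketch. Everything below is a hypothetical construction for intuition only (random embeddings, invented country-capital pairs), not the paper's model or data: a shared task direction is estimated from demonstration pairs and added to a held-out query embedding, and the answer is retrieved by nearest neighbor.

```python
import numpy as np

# Toy vocabulary embeddings (random, fixed seed); purely illustrative.
rng = np.random.default_rng(0)
words = ["France", "Paris", "Japan", "Tokyo", "Italy", "Rome"]
emb = {w: rng.normal(size=8) for w in words}

# Impose a shared offset so "capital ~= country + v" roughly holds,
# mimicking a latent task direction in embedding space (an assumption
# of this sketch, not a claim about real LLM geometry).
task_direction = rng.normal(size=8)
for country, capital in [("France", "Paris"), ("Japan", "Tokyo"), ("Italy", "Rome")]:
    emb[capital] = emb[country] + task_direction + 0.05 * rng.normal(size=8)

# Estimate the task vector from two in-context demonstration pairs.
demos = [("France", "Paris"), ("Japan", "Tokyo")]
task_vec = np.mean([emb[b] - emb[a] for a, b in demos], axis=0)

# Apply it to a held-out query and retrieve the nearest vocabulary item.
query = emb["Italy"] + task_vec
answer = min(words, key=lambda w: np.linalg.norm(emb[w] - query))
print(answer)  # → Rome
```

The retrieval succeeds because the demonstration offsets all approximate the same latent direction, so their mean cancels the pairwise noise; this is the intuition behind solving factual-recall queries by adding a single task vector to the residual stream.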
Original language: English
Publication status: Accepted/In press - 1 May 2025
Event: 42nd International Conference on Machine Learning, ICML 2025 - Vancouver Convention Center, Vancouver, Canada
Duration: 13 Jul 2025 – 19 Jul 2025
https://icml.cc/Conferences/2025

Conference

Conference: 42nd International Conference on Machine Learning, ICML 2025
Abbreviated title: ICML 2025
Country/Territory: Canada
City: Vancouver
Period: 13/07/25 – 19/07/25
Internet address: https://icml.cc/Conferences/2025

Bibliographical note

Information for this record is supplemented by the author(s) concerned. Since this conference has not yet commenced, the information in this record is subject to revision.
