Pareto Multi-Task Learning

Research output: Chapters, Conference Papers, Creative and Literary Works › Refereed conference paper (with ISBN/ISSN)

Detail(s)

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 32 (NIPS 2019)
Editors: H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, R. Garnett
Number of pages: 11
Publication status: Published - Dec 2019

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 32
ISSN (Print): 1049-5258

Conference

Title: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)
Location: Vancouver Convention Center
Place: Canada
City: Vancouver
Period: 8 - 14 December 2019

Abstract

Multi-task learning is a powerful method for solving multiple correlated tasks simultaneously. However, it is often impossible to find a single solution that optimizes all the tasks, since different tasks may conflict with each other. Recently, a novel method was proposed to find one single Pareto optimal solution with a good trade-off among different tasks by casting multi-task learning as multiobjective optimization. In this paper, we generalize this idea and propose a novel Pareto multi-task learning algorithm (Pareto MTL) to find a set of well-distributed Pareto solutions that represent different trade-offs among the tasks. The proposed algorithm first formulates a multi-task learning problem as a multiobjective optimization problem, and then decomposes the multiobjective optimization problem into a set of constrained subproblems with different trade-off preferences. By solving these subproblems in parallel, Pareto MTL can find a set of well-representative Pareto optimal solutions with different trade-offs among all tasks. Practitioners can easily select their preferred solution from these Pareto solutions, or use different trade-off solutions for different situations. Experimental results confirm that the proposed algorithm can generate well-representative solutions and outperforms some state-of-the-art algorithms on many multi-task learning applications.
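The decomposition step described in the abstract can be sketched in a few lines: spread a set of unit preference vectors over the objective space, and restrict each subproblem to the region where its own preference vector scores the loss vector at least as highly as every other preference. This is a minimal illustrative sketch for the two-task case, not the authors' released implementation; the function names and the `eps` tolerance are assumptions introduced here.

```python
import numpy as np

def preference_vectors(n_prefs):
    """n_prefs evenly spread unit preference vectors in the
    positive quadrant, for a two-task problem."""
    angles = np.linspace(0.0, np.pi / 2, n_prefs)
    return np.stack([np.cos(angles), np.sin(angles)], axis=1)

def violated_constraints(losses, prefs, k, eps=0.0):
    """Subproblem k keeps solutions whose loss vector lies in the
    cone of preference u_k, i.e. u_k . l >= u_j . l for all j.
    Returns the indices j whose constraint is currently violated,
    i.e. where (u_j - u_k) . l > 0."""
    gaps = prefs @ losses - prefs[k] @ losses
    return [j for j in range(len(prefs)) if j != k and gaps[j] > eps]
```

For example, a loss vector dominated by the first task, such as `[1.0, 0.1]`, satisfies all constraints of the subproblem whose preference points along the first axis, but violates those of the subproblem preferring the second task; a constrained solver would then descend along the violated constraints to pull the solution into its assigned region.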

Research Area(s)

  • Multi-task learning, Transfer learning

Bibliographic Note

Information for this record is supplemented by the author(s) concerned.

Citation Format(s)

Pareto Multi-Task Learning. / Lin, Xi; Zhen, Hui-Ling; Li, Zhenhua; Zhang, Qingfu; Kwong, Sam.

Advances in Neural Information Processing Systems 32 (NIPS 2019). ed. / H. Wallach; H. Larochelle; A. Beygelzimer; F. d'Alché-Buc; E. Fox; R. Garnett. 2019. (Advances in Neural Information Processing Systems; Vol. 32).