Encoding sparse and competitive structures among tasks in multi-task learning
| | |
|---|---|
| Journal / Publication | Pattern Recognition |
| Online published | 18 Dec 2018 |
| Publication status | Published - Apr 2019 |
| Link to Scopus | https://www.scopus.com/record/display.uri?eid=2-s2.0-85058960250&origin=recordpage |
Multi-task learning (MTL) aims to enhance generalization performance by exploiting the inherent structures across tasks. Most existing MTL methods assume that the tasks are positively correlated and utilize the shared structures among tasks to improve learning performance. By contrast, competitive structures (negative relationships) among tasks also arise in some real-world applications, and conventional MTL methods that exploit shared structures across tasks may perform poorly in this setting. Another challenge, especially in high-dimensional settings, is to exclude irrelevant features (sparse structure) from the final model. To this end, this work proposes a new method, referred to as Sparse Exclusive Lasso (SpEL), for multi-task learning. The proposed SpEL is able to capture the competitive relationships among tasks (competitive structure) while removing unimportant features that are common across tasks from the final model (sparse structure). Experimental studies on synthetic and real data indicate that the proposed method can significantly improve learning performance by identifying sparse and task-competitive structures simultaneously.
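The record does not give the paper's exact formulation. As a rough, illustrative sketch of the idea only, the standard exclusive-lasso penalty sums, per feature, the squared ℓ1 norm of that feature's weights across tasks, which makes tasks "compete" for each feature; adding a plain ℓ1 term drops features that are unimportant everywhere. The function names (`fit`, `sparse_exclusive_objective`), the subgradient-plus-soft-thresholding solver, and all hyperparameters below are assumptions, not the authors' SpEL algorithm:

```python
import numpy as np

def exclusive_lasso_penalty(W):
    # W has shape (features, tasks). Per feature row, take the l1 norm
    # across tasks and square it: tasks compete within each feature.
    return np.sum(np.sum(np.abs(W), axis=1) ** 2)

def sparse_exclusive_objective(W, Xs, ys, lam_ex, lam_sp):
    # Squared-error loss per task + exclusive-lasso + l1 sparsity terms.
    loss = sum(0.5 * np.mean((X @ W[:, t] - y) ** 2)
               for t, (X, y) in enumerate(zip(Xs, ys)))
    return loss + lam_ex * exclusive_lasso_penalty(W) + lam_sp * np.sum(np.abs(W))

def fit(Xs, ys, lam_ex=0.1, lam_sp=0.01, lr=0.01, iters=500):
    # Illustrative solver: subgradient step for the smooth-ish parts,
    # then a proximal (soft-thresholding) step for the l1 term.
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    for _ in range(iters):
        G = np.zeros_like(W)
        for t, (X, y) in enumerate(zip(Xs, ys)):
            G[:, t] = X.T @ (X @ W[:, t] - y) / len(y)
        # Subgradient of the exclusive-lasso term: 2 * (row l1 norm) * sign(W).
        row_l1 = np.sum(np.abs(W), axis=1, keepdims=True)
        G += lam_ex * 2.0 * row_l1 * np.sign(W)
        W -= lr * G
        # Soft thresholding enforces elementwise sparsity.
        W = np.sign(W) * np.maximum(np.abs(W) - lr * lam_sp, 0.0)
    return W
```

On toy data where each of two tasks depends on a different feature, the exclusive-lasso term tolerates this disjoint (competitive) support, while the ℓ1 term zeroes out the features neither task uses.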
- Multi-task learning, Sparse exclusive lasso, Task-competitive
Pattern Recognition, Vol. 88, 04.2019, p. 689-701.
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review