Encoding sparse and competitive structures among tasks in multi-task learning

Research output: Journal Publications and Reviews > RGC 21 - Publication in refereed journal (peer-reviewed)

4 Scopus Citations

Original language: English
Pages (from-to): 689-701
Journal / Publication: Pattern Recognition
Online published: 18 Dec 2018
Publication status: Published - Apr 2019


Multi-task learning (MTL) aims to enhance generalization performance by exploring the inherent structures across tasks. Most existing MTL methods are based on the assumption that the tasks are positively correlated, and they utilize the shared structures among tasks to improve learning performance. By contrast, competitive structures (negative relationships) among tasks also exist in some real-world applications, and conventional MTL methods that explore shared structures across tasks may yield unsatisfactory performance in this setting. Another challenge, especially in high-dimensional settings, is to exclude irrelevant features (sparse structure) from the final model. To this end, this work proposes a new method, referred to as Sparse Exclusive Lasso (SpEL), for multi-task learning. The proposed SpEL is able to capture the competitive relationships among tasks (competitive structure), while removing unimportant features that are common across tasks from the final model (sparse structure). Experimental studies on synthetic and real data indicate that the proposed method can significantly improve learning performance by identifying sparse and task-competitive structures simultaneously.
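To give a sense of the two structures the abstract describes, the sketch below combines a standard exclusive-lasso term (the squared L1 norm of each feature's coefficients across tasks, which makes tasks compete for each feature) with a plain L1 term (which drops irrelevant features). The exact grouping, weighting, and optimization used in the paper are not given here; the function names and regularization weights `lam1`/`lam2` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def spel_penalty(W, lam1=1.0, lam2=1.0):
    """Illustrative SpEL-style penalty on a weight matrix W (features x tasks).

    Exclusive-lasso term: for each feature (row), the squared L1 norm of its
    coefficients across tasks; this penalizes features shared by many tasks,
    encouraging a competitive structure among tasks.
    L1 term: encourages overall sparsity, excluding irrelevant features.
    This is a sketch of the general penalty form, not the paper's exact model.
    """
    exclusive = np.sum(np.sum(np.abs(W), axis=1) ** 2)  # competitive structure
    sparse = np.sum(np.abs(W))                          # sparse structure
    return lam1 * exclusive + lam2 * sparse

# Example: 3 features, 2 tasks
W = np.array([[1.0, -1.0],
              [0.0,  2.0],
              [0.0,  0.0]])
# exclusive = 2^2 + 2^2 + 0^2 = 8;  sparse = 1 + 1 + 2 = 4
print(spel_penalty(W))  # 12.0
```

A feature used by several tasks at once (row 1) incurs a quadratic cost, while a feature claimed by a single task (row 2) is cheaper, which is what drives the task-competitive behavior.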

Research Area(s)

  • Multi-task learning, Sparse exclusive lasso, Task-competitive