Robust Online Multi-Task Learning with Correlative and Personalized Structures

Peng Yang, Peilin Zhao, Xin Gao

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


Multi-Task Learning (MTL) can enhance a classifier's generalization performance by learning multiple related tasks simultaneously. Conventional MTL operates in an offline setting and suffers from high training cost and poor scalability. To address these issues, online learning techniques have been applied to MTL problems. However, most existing online MTL algorithms constrain task relatedness into a presumed structure via a single weight matrix, a strict restriction that does not always hold in practice. In this paper, we propose a robust online MTL framework that overcomes this restriction by decomposing the weight matrix into two components: the first captures the low-rank common structure among tasks via a nuclear norm; the second identifies the personalized patterns of outlier tasks via a group lasso. Theoretical analysis shows the proposed algorithm achieves sub-linear regret with respect to the best linear model in hindsight. However, the nuclear norm, which simply sums all nonzero singular values, may not be a good low-rank approximation. To improve the results, we use a log-determinant function as a non-convex rank approximation. Experimental results on a number of real-world applications also verify the efficacy of our approaches.
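The two regularizers described in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: the function names, the regularization weights `lam1`/`lam2`, and the smoothing constant `delta` in the log-determinant surrogate are all hypothetical choices for exposition.

```python
import numpy as np

def nuclear_norm(U):
    # Sum of singular values: a convex surrogate for rank that
    # encourages a low-rank structure shared across tasks.
    return np.linalg.svd(U, compute_uv=False).sum()

def group_lasso(V):
    # Sum of column-wise L2 norms (one column per task): drives whole
    # task columns of V to zero, so only outlier tasks retain a
    # personalized component.
    return np.linalg.norm(V, axis=0).sum()

def logdet_rank_surrogate(U, delta=1e-3):
    # Non-convex rank approximation via log-determinant, computed on
    # singular values: sum_i log(sigma_i^2 + delta). Unlike the nuclear
    # norm, large singular values are penalized only logarithmically.
    s = np.linalg.svd(U, compute_uv=False)
    return np.sum(np.log(s**2 + delta))

def regularizer(W_corr, W_pers, lam1=0.1, lam2=0.1):
    # Composite penalty on the decomposition W = W_corr + W_pers:
    # low-rank correlative part plus group-sparse personalized part.
    return lam1 * nuclear_norm(W_corr) + lam2 * group_lasso(W_pers)
```

Note how the log-determinant surrogate distinguishes rank more sharply than the nuclear norm: a rank-1 matrix and a full-rank matrix with the same nuclear norm receive very different log-determinant penalties.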
Original language: English (US)
Pages (from-to): 2510-2521
Number of pages: 12
Journal: IEEE Transactions on Knowledge and Data Engineering
Issue number: 11
State: Published - Jun 29 2017

Bibliographical note

KAUST Repository Item: Exported on 2020-10-01

