Robust Cost-Sensitive Learning for Recommendation with Implicit Feedback

Peng Yang, Peilin Zhao, Yong Liu, Xin Gao

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    13 Scopus citations


    This paper aims to improve the effectiveness of matrix decomposition (MD) methods for implicit feedback. We highlight two critical limitations of existing work. First, because of the large number of unlabeled feedback entries, most existing methods assign a uniform weight to the missing data to reduce computational complexity; however, this uniformity assumption rarely holds in real-world scenarios. Second, the commonly used bilateral loss function can grow without bound when a data point is misclassified, so outliers can misguide the learning process. We address both issues by learning a robust asymmetric model. By leveraging cost-sensitive learning and a capped unilateral loss function, our robust MD objective integrates them into a joint formulation, in which the low-rank bases for user/item profiles can be modeled effectively and robustly. In particular, a novel log-determinant function is employed to refine the nuclear norm in the low-rank approximation. We derive an iterative re-weighted algorithm that efficiently minimizes this MD objective, and rigorously prove a lower error bound for the proposed algorithm compared with the 1-bit matrix completion method. Finally, we report promising experimental results for our algorithm on benchmark recommendation datasets.
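    The abstract's core ingredients (cost-sensitive weights for observed vs. missing feedback, a capped loss that zeroes out outliers, a log-determinant low-rank regularizer, and iterative re-weighted minimization) can be illustrated with a minimal sketch. This is not the authors' algorithm; it is a hedged toy implementation in which all hyperparameters (`w_pos`, `w_neg`, `cap`, `delta`, `lam`, `lr`) and the gradient-descent solver are assumptions for illustration only.

    ```python
    import numpy as np

    def robust_mf(R, rank=5, w_pos=1.0, w_neg=0.1, cap=1.0,
                  delta=1e-3, lam=0.1, lr=0.01, iters=200, seed=0):
        """Toy iteratively re-weighted matrix factorization sketch:
        asymmetric cost-sensitive weights, a capped squared loss, and a
        log-det surrogate for the nuclear norm (all settings hypothetical)."""
        rng = np.random.default_rng(seed)
        m, n = R.shape
        U = 0.1 * rng.standard_normal((m, rank))
        V = 0.1 * rng.standard_normal((n, rank))
        # Cost-sensitive weighting: observed entries vs. missing entries.
        W = np.where(R > 0, w_pos, w_neg)
        for _ in range(iters):
            E = U @ V.T - R
            # Capped loss via re-weighting: entries whose squared error
            # exceeds the cap are treated as outliers and get zero weight.
            active = (E**2 <= cap).astype(float)
            G = W * active * E
            # Gradient of log det(U^T U + delta*I) w.r.t. U is
            # 2 U (U^T U + delta*I)^{-1}; the factor 2 is folded into lam.
            gu = G @ V + lam * U @ np.linalg.inv(U.T @ U + delta * np.eye(rank))
            gv = G.T @ U + lam * V @ np.linalg.inv(V.T @ V + delta * np.eye(rank))
            U -= lr * gu
            V -= lr * gv
        return U, V
    ```

    The re-weighting step is what makes the capped loss tractable: at each iteration, points whose residual exceeds the cap contribute nothing to the gradient, so a few badly misclassified outliers cannot dominate the update.
    
    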
    Original language: English (US)
    Title of host publication: Proceedings of the 2018 SIAM International Conference on Data Mining
    Publisher: Society for Industrial & Applied Mathematics (SIAM)
    Number of pages: 9
    ISBN (Print): 9781611975321
    State: Published - May 7 2018

    Bibliographical note

    KAUST Repository Item: Exported on 2020-10-01
