Abstract
Nonnegative Matrix Factorization (NMF) is a popular representation method for pattern classification problems. It decomposes a nonnegative matrix of data samples into the product of a nonnegative basis matrix and a nonnegative coefficient matrix, and the columns of the coefficient matrix serve as new representations of the data samples. However, traditional NMF methods ignore the class labels of the data samples. In this paper, we propose a novel supervised NMF algorithm that uses the class labels to improve the discriminative ability of the new representation. Based on the labels, we separate all data sample pairs into within-class pairs and between-class pairs. To make the new NMF representations more discriminative, we propose to minimize the maximum distance among the within-class pairs in the new NMF space while simultaneously maximizing the minimum distance among the between-class pairs. With this criterion, we construct an objective function and optimize it alternately with respect to the basis matrix, the coefficient matrix, and the slack variables, which yields an iterative algorithm. The proposed algorithm is evaluated on three pattern classification problems, and the experimental results show that it outperforms state-of-the-art supervised NMF methods.
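The abstract describes an alternating optimization over the basis matrix, the coefficient matrix, and slack variables. The Python sketch below only illustrates the underlying idea under simplifying assumptions: standard multiplicative NMF updates combined with gradient-style steps that shrink the largest within-class distance and enlarge the smallest between-class distance among the coefficient columns. It is not the paper's algorithm (the slack-variable formulation and its exact solver are not reproduced), and all names and parameters here (`supervised_nmf_sketch`, `penalty`, etc.) are hypothetical.

```python
import numpy as np


def supervised_nmf_sketch(X, labels, rank, n_iter=200, penalty=0.1, seed=0):
    """Illustrative sketch only: NMF updates plus a max-min class-pair heuristic.

    X      : (d, n) nonnegative data matrix, one sample per column.
    labels : (n,) class labels of the samples.
    rank   : number of basis vectors.
    """
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.random((d, rank)) + 1e-3   # nonnegative basis matrix
    H = rng.random((rank, n)) + 1e-3   # nonnegative coefficient matrix
    eps = 1e-9

    for _ in range(n_iter):
        # Standard multiplicative updates for the Frobenius reconstruction term.
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        H *= (W.T @ X) / (W.T @ W @ H + eps)

        # Pairwise distances between coefficient columns (the new representations).
        D = np.linalg.norm(H[:, :, None] - H[:, None, :], axis=0)
        same = labels[:, None] == labels[None, :]
        diff = ~same
        np.fill_diagonal(same, False)  # exclude self-pairs from within-class pairs

        if same.any():
            # Pull together the farthest within-class pair.
            i, j = np.unravel_index(np.argmax(np.where(same, D, -np.inf)), D.shape)
            step = penalty * (H[:, i] - H[:, j])
            H[:, i] -= step
            H[:, j] += step
        if diff.any():
            # Push apart the closest between-class pair.
            p, q = np.unravel_index(np.argmin(np.where(diff, D, np.inf)), D.shape)
            step = penalty * (H[:, p] - H[:, q])
            H[:, p] += step
            H[:, q] -= step

        # Keep the coefficient matrix nonnegative after the pairwise adjustments.
        np.clip(H, 0.0, None, out=H)

    return W, H
```

As a usage note, one would call `W, H = supervised_nmf_sketch(X, labels, rank=20)` and then use the columns of `H` as the supervised low-dimensional representations fed to a downstream classifier.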
Original language | English (US)
---|---
Pages (from-to) | 75-84
Number of pages | 10
Journal | Neural Networks
Volume | 61
DOIs |
State | Published - Oct 26 2014
Bibliographical note
KAUST Repository Item: Exported on 2020-10-01

ASJC Scopus subject areas
- Artificial Intelligence
- Cognitive Neuroscience