Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants

Lama B. Niyazi, Abla Kammoun, Hayssam Dahrouj, Mohamed-Slim Alouini, Tareq Y. Al-Naffouri

Research output: Contribution to journal › Article › peer-review


Datasets from the fields of bioinformatics, chemometrics, and face recognition are typically characterized by small samples of high-dimensional data. Among the many variants of linear discriminant analysis proposed to rectify the issues associated with classification in such a setting, the classifier of Durrant and Kabán (2013), composed of an ensemble of randomly projected linear discriminants, seems especially promising; it is computationally efficient and, with the optimal projection dimension parameter setting, is competitive with the state-of-the-art. In this work, we seek to further understand the behavior of this classifier through asymptotic analysis. Under the assumption of a growth regime in which the dataset and projection dimensions grow at constant rates relative to each other, we use random matrix theory to derive asymptotic misclassification probabilities showing the effect of the ensemble as a regularization of the data sample covariance matrix. The asymptotic errors further help to identify situations in which the ensemble offers a performance advantage. We also develop a consistent estimator of the misclassification probability as an alternative to the computationally costly cross-validation estimator, which is conventionally used for parameter tuning. Finally, we demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
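The ensemble classifier described above can be illustrated with a minimal sketch: each ensemble member projects the data to a low dimension with a random matrix, fits a linear discriminant in the projected space, and the members' discriminant scores are averaged. This is an illustrative reconstruction, not the authors' exact algorithm; the Gaussian projection, the function name, and the parameter choices (`d`, `n_members`) are assumptions for the example.

```python
import numpy as np

def rp_lda_ensemble_predict(X_train, y_train, X_test, d=5, n_members=50, seed=0):
    """Binary classification via an ensemble of randomly projected LDA discriminants.

    Illustrative sketch: Gaussian random projections to dimension d, an LDA
    discriminant fit in each projected space, and scores averaged over members.
    """
    rng = np.random.default_rng(seed)
    X0, X1 = X_train[y_train == 0], X_train[y_train == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled sample covariance (may be singular when dimension exceeds sample size,
    # which is exactly the regime that motivates random projection).
    S = ((len(X0) - 1) * np.cov(X0.T) + (len(X1) - 1) * np.cov(X1.T)) \
        / (len(X0) + len(X1) - 2)
    p = X_train.shape[1]
    scores = np.zeros(len(X_test))
    for _ in range(n_members):
        R = rng.standard_normal((d, p)) / np.sqrt(d)  # random projection to d dims
        # LDA discriminant computed entirely in the projected space: the d x d
        # projected covariance R S R' is well conditioned even when S is not.
        w = np.linalg.solve(R @ S @ R.T, R @ (mu1 - mu0))
        scores += (X_test - (mu0 + mu1) / 2) @ R.T @ w
    return (scores / n_members > 0).astype(int)
```

Averaging the members' scores acts like a regularized inverse of the sample covariance, which is the effect the paper's asymptotic analysis makes precise.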
Original language: English (US)
Pages (from-to): 1-1
Number of pages: 1
Journal: IEEE Journal on Selected Areas in Information Theory
State: Published - 2020

Bibliographical note

KAUST Repository Item: Exported on 2020-12-07

