Regularized Discriminant Analysis: A Large Dimensional Study

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper studies the performance of general regularized discriminant analysis (RDA) classifiers based on the Gaussian mixture model with different class means and covariances. RDA offers a rich class of regularization options, covering the regularized linear discriminant analysis (RLDA) and the regularized quadratic discriminant analysis (RQDA) classifiers as special cases. Building on fundamental results from random matrix theory, we analyze RDA under the double asymptotic regime in which the data dimension and the training size grow large at a proportional rate. Under this regime and some mild assumptions, we show that the classification error converges to a deterministic quantity that depends only on the data statistical parameters and dimensions. This result can be leveraged to select the regularization parameters that minimize the classification error, thus yielding the optimal classifier. Numerical results on synthetic data validate our theoretical findings and show the high accuracy of our derivations.
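
To make the classifier family concrete, below is a minimal Python sketch of an RDA rule for a Gaussian mixture with class-dependent means and covariances. The function names (rda_fit, rda_predict) and the two regularization parameters (lam, gamma) are illustrative assumptions following a Friedman-style regularization scheme; the paper's general RDA parameterization, which contains RLDA and RQDA as special cases, may be defined differently.

```python
import numpy as np

def rda_fit(X, y, lam=0.5, gamma=0.5):
    """Fit an illustrative RDA classifier (assumed Friedman-style regularization).

    lam   blends each class covariance with the pooled covariance
          (lam=1 gives an LDA-like rule, lam=0 a QDA-like rule).
    gamma shrinks the blended covariance toward a scaled identity.
    """
    classes = np.unique(y)
    n, p = X.shape
    means, covs, priors = {}, {}, {}
    pooled = np.zeros((p, p))
    for c in classes:
        Xc = X[y == c]
        means[c] = Xc.mean(axis=0)
        covs[c] = np.cov(Xc, rowvar=False)
        priors[c] = len(Xc) / n
        pooled += priors[c] * covs[c]
    reg_covs = {}
    for c in classes:
        S = (1 - lam) * covs[c] + lam * pooled            # blend toward pooled covariance
        S = (1 - gamma) * S + gamma * (np.trace(S) / p) * np.eye(p)  # shrink toward identity
        reg_covs[c] = S
    return classes, means, reg_covs, priors

def rda_predict(model, X):
    """Assign each sample to the class with the largest Gaussian discriminant score."""
    classes, means, covs, priors = model
    scores = []
    for c in classes:
        diff = X - means[c]
        inv = np.linalg.inv(covs[c])
        _, logdet = np.linalg.slogdet(covs[c])
        maha = np.einsum('ij,jk,ik->i', diff, inv, diff)  # per-sample quadratic form
        scores.append(-0.5 * maha - 0.5 * logdet + np.log(priors[c]))
    return classes[np.argmax(np.vstack(scores), axis=0)]
```

In the setting analyzed in the paper, the deterministic limit of the classification error would guide the choice of the regularization parameters; in this simplified sketch, (lam, gamma) would instead have to be tuned empirically, for example by validation on held-out synthetic data.
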
Original language: English (US)
Title of host publication: 2018 IEEE International Symposium on Information Theory (ISIT)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 536-540
Number of pages: 5
ISBN (Print): 9781538647806
DOIs
State: Published - Aug 16 2018

