Abstract
This paper studies the performance of general regularized discriminant analysis (RDA) classifiers based on a Gaussian mixture model with distinct class means and covariances. RDA offers a rich class of regularization options, covering the regularized linear discriminant analysis (RLDA) and regularized quadratic discriminant analysis (RQDA) classifiers as special cases. Using fundamental results from random matrix theory, we analyze RDA under the double-asymptotic regime in which the data dimension and the training size grow at a proportional rate. Under this regime and some mild assumptions, we show that the classification error converges to a deterministic quantity that depends only on the statistical parameters and dimensions of the data. This result can be leveraged to select the regularization parameters that minimize the classification error, thus yielding the optimal classifier. Numerical results on synthetic data validate our theoretical findings and show the high accuracy of our derivations.
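The RDA family described in the abstract can be sketched as follows. This is an illustrative implementation, not the paper's exact estimator or notation: the parameter names `alpha` (blending per-class and pooled covariances, so that the extremes recover QDA-like and LDA-like behavior) and `gamma` (ridge shrinkage toward a scaled identity) are generic choices made here for the sketch.

```python
import numpy as np

def fit_rda(X, y, alpha=0.5, gamma=0.1):
    """Fit a generic RDA classifier (illustrative sketch).

    alpha in [0, 1]: 1 -> per-class covariances (QDA-like),
                     0 -> pooled covariance (LDA-like).
    gamma in [0, 1]: ridge shrinkage toward a scaled identity.
    """
    classes = np.unique(y)
    n, p = X.shape
    means, covs, priors = {}, {}, {}
    pooled = np.zeros((p, p))  # pooled within-class covariance
    for c in classes:
        Xc = X[y == c]
        means[c] = Xc.mean(axis=0)
        covs[c] = np.cov(Xc, rowvar=False)
        pooled += (len(Xc) - 1) * covs[c]
        priors[c] = len(Xc) / n
    pooled /= (n - len(classes))

    params = {}
    for c in classes:
        # Blend class and pooled covariance, then shrink toward identity.
        S = alpha * covs[c] + (1 - alpha) * pooled
        S = (1 - gamma) * S + gamma * (np.trace(S) / p) * np.eye(p)
        params[c] = (means[c], np.linalg.inv(S),
                     np.linalg.slogdet(S)[1], np.log(priors[c]))
    return params

def predict_rda(params, X):
    """Assign each row of X to the class with the largest discriminant score."""
    scores = []
    for mu, Sinv, logdet, logprior in params.values():
        d = X - mu
        # Quadratic score: -0.5 (x-mu)^T S^{-1} (x-mu) - 0.5 log|S| + log prior
        scores.append(-0.5 * np.einsum('ij,jk,ik->i', d, Sinv, d)
                      - 0.5 * logdet + logprior)
    classes = list(params.keys())
    return np.array([classes[i] for i in np.argmax(np.vstack(scores), axis=0)])
```

On synthetic Gaussian data of the kind used for validation in the paper, the two regularization parameters can then be tuned (e.g., by a grid search over `alpha` and `gamma`) to minimize the empirical classification error.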
Original language | English (US) |
---|---|
Title of host publication | 2018 IEEE International Symposium on Information Theory, ISIT 2018 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 536-540 |
Number of pages | 5 |
ISBN (Print) | 9781538647806 |
State | Published - Aug 15 2018 |
Event | 2018 IEEE International Symposium on Information Theory, ISIT 2018 - Vail, United States. Duration: Jun 17 2018 → Jun 22 2018 |
Publication series
Name | IEEE International Symposium on Information Theory - Proceedings |
---|---|
Volume | 2018-June |
ISSN (Print) | 2157-8095 |
Conference
Conference | 2018 IEEE International Symposium on Information Theory, ISIT 2018 |
---|---|
Country/Territory | United States |
City | Vail |
Period | 06/17/18 → 06/22/18 |
Bibliographical note
Publisher Copyright: © 2018 IEEE.
ASJC Scopus subject areas
- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics