We investigate random projections in the context of randomly projected linear discriminant analysis (LDA). We consider the case in which data of dimension p are randomly projected onto a lower-dimensional space before being fed to the classifier. Using fundamental results from random matrix theory and relying on some mild assumptions, we show that the asymptotic performance in terms of probability of misclassification approaches a deterministic quantity that depends only on the data statistics and the dimensions involved. These results make it possible to reliably predict the performance of projected LDA as a function of the reduced dimension d < p and thus help determine the minimum d needed to achieve a desired performance. Finally, we validate our results in finite-sample settings drawn from both synthetic data and the popular MNIST dataset.
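The pipeline the abstract describes can be sketched numerically: project p-dimensional data through a random Gaussian matrix down to dimension d < p, train LDA in the projected space, and estimate the misclassification probability. The sketch below is illustrative only; the two-class Gaussian model, the specific dimensions, and the Gaussian projection matrix are assumptions for the example, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-class Gaussian data in dimension p (assumed setup).
p, d, n = 100, 20, 2000
mu0, mu1 = np.zeros(p), np.full(p, 0.5)
X0 = rng.normal(size=(n, p)) + mu0
X1 = rng.normal(size=(n, p)) + mu1

# Random projection: Gaussian matrix mapping R^p -> R^d, with d < p.
R = rng.normal(size=(p, d)) / np.sqrt(d)
Y0, Y1 = X0 @ R, X1 @ R

# Fisher LDA in the projected space: pooled covariance, linear score.
m0, m1 = Y0.mean(axis=0), Y1.mean(axis=0)
S = 0.5 * (np.cov(Y0, rowvar=False) + np.cov(Y1, rowvar=False))
w = np.linalg.solve(S, m1 - m0)
b = -0.5 * w @ (m0 + m1)

# Empirical misclassification rate on fresh test samples:
# class 1 is declared when the linear score w.y + b is positive.
T0 = (rng.normal(size=(n, p)) + mu0) @ R
T1 = (rng.normal(size=(n, p)) + mu1) @ R
err = 0.5 * ((T0 @ w + b > 0).mean() + (T1 @ w + b < 0).mean())
print(round(err, 3))
```

Repeating this over a grid of values of d gives an empirical error curve that can be compared against the deterministic limit characterized in the paper.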
|Title of host publication
|ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
|Publisher
|Institute of Electrical and Electronics Engineers (IEEE)
|Published - May 2019
Bibliographical note: KAUST Repository Item: Exported on 2020-10-01
Acknowledgements: The authors thank Vahid Tarokh for valuable discussions.