Abstract
The cost of computing the spectrum of Laplacian matrices hinders the application of spectral clustering to large data sets. While approximations restore computational tractability, they can degrade clustering performance. This paper proposes a practical approach to learning spectral clustering in which the spectrum of the Laplacian is recovered by solving a constrained optimization problem with adaptive mini-batch stochastic gradient optimization on Stiefel manifolds. Crucially, the proposed approach is formulated so that the memory footprint of the algorithm is low, the cost of each iteration is linear in the number of samples, and convergence to critical points of the objective function is guaranteed. Extensive experimental validation on data sets with up to half a million samples demonstrates its scalability and its ability to outperform state-of-the-art approximate spectral clustering methods for a given computational budget.
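To make the problem described in the abstract concrete, the snippet below is a minimal sketch of the underlying constrained eigenproblem: minimizing tr(UᵀLU) over the Stiefel manifold {U : UᵀU = I} with gradient steps and a QR retraction, followed by k-means on the resulting embedding. It is not the paper's algorithm — the function names, the dense RBF-affinity Laplacian, and the use of exact full gradients (where the paper uses adaptive mini-batch stochastic gradients to keep the per-iteration cost linear in the number of samples) are illustrative assumptions.

```python
# Toy sketch (assumptions noted above, not the paper's method): spectral embedding
# via gradient descent on the Stiefel manifold, then k-means on the embedding rows.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def rbf_laplacian(X, sigma=1.0):
    """Dense normalized Laplacian L = I - D^{-1/2} W D^{-1/2} from an RBF affinity."""
    W = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    return np.eye(len(X)) - D_inv_sqrt @ W @ D_inv_sqrt

def stiefel_spectral_embedding(L, k, lr=0.1, iters=500, seed=0):
    """min_U tr(U^T L U) s.t. U^T U = I_k, via Riemannian gradient steps + QR retraction."""
    rng = np.random.default_rng(seed)
    U, _ = np.linalg.qr(rng.standard_normal((L.shape[0], k)))  # random point on the Stiefel manifold
    for _ in range(iters):
        G = 2.0 * (L @ U)                 # Euclidean gradient of tr(U^T L U)
        R = G - U @ (U.T @ G)             # project onto the tangent space at U (L symmetric)
        U, _ = np.linalg.qr(U - lr * R)   # QR retraction back onto the manifold
    return U

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in ((0, 0), (3, 3), (0, 3))])
    L = rbf_laplacian(X, sigma=0.7)
    U = stiefel_spectral_embedding(L, k=3)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(U)
```

In this sketch the gradient step costs O(n²k) because the affinity matrix is dense; the abstract's claim of linear per-iteration cost and low memory footprint comes from replacing this full gradient with adaptive mini-batch stochastic gradients, which the toy version omits for clarity.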
Original language | English (US) |
---|---|
Title of host publication | 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 3888-3895 |
Number of pages | 8 |
ISBN (Electronic) | 9781509061815 |
DOIs | |
State | Published - Jun 30 2017 |
Event | 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States (Duration: May 14 2017 → May 19 2017) |
Publication series
Name | Proceedings of the International Joint Conference on Neural Networks |
---|---|
Volume | 2017-May |
Conference
Conference | 2017 International Joint Conference on Neural Networks, IJCNN 2017 |
---|---|
Country/Territory | United States |
City | Anchorage |
Period | 05/14/17 → 05/19/17 |
Bibliographical note
Publisher Copyright: © 2017 IEEE.
ASJC Scopus subject areas
- Software
- Artificial Intelligence