Abstract
Two-way functional data consist of a data matrix whose row and column domains are both structured, for example, temporally or spatially, as when the data are time series collected at different locations in space. We extend one-way functional principal component analysis (PCA) to two-way functional data by introducing regularization of both left and right singular vectors in the singular value decomposition (SVD) of the data matrix. We focus on a penalization approach and solve the nontrivial problem of constructing proper two-way penalties from one-way regression penalties. We introduce conditional cross-validated smoothing parameter selection, whereby left singular vectors are cross-validated conditional on right singular vectors, and vice versa. The concept can be realized as part of an alternating optimization algorithm. In addition to the penalization approach, we briefly consider two-way regularization with basis expansion. The proposed methods are illustrated with one simulated and two real data examples. Supplemental materials available online show that several "natural" approaches to penalized SVDs are flawed and explain why. © 2009 American Statistical Association.
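To make the alternating-optimization idea concrete, here is a minimal sketch of a penalized rank-one SVD fit: holding one singular vector fixed, the other is obtained from a ridge-type penalized least squares solve with a roughness penalty, and the two updates alternate. This is a generic illustration under assumed second-difference roughness penalties, not the exact two-way penalty construction of the paper; the function names and smoothing parameters `lam_u`, `lam_v` are placeholders for exposition.

```python
import numpy as np

def second_diff_penalty(n):
    # Second-order difference roughness penalty D'D (an assumed,
    # standard choice for smoothing over an ordered domain).
    D = np.diff(np.eye(n), n=2, axis=0)
    return D.T @ D

def penalized_rank_one(X, lam_u=1.0, lam_v=1.0, n_iter=50):
    """Alternating penalized least squares for a smoothed rank-one SVD.

    Generic sketch: minimizes ||X - u v'||_F^2 plus separate roughness
    penalties on u and v by alternating ridge-type solves. The paper's
    actual two-way penalty differs in its construction.
    """
    n, m = X.shape
    Ou, Ov = second_diff_penalty(n), second_diff_penalty(m)
    # Initialize from the leading unpenalized singular pair.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    u, v = U[:, 0] * s[0], Vt[0].copy()
    for _ in range(n_iter):
        # Fix v: penalized least squares update for u.
        u = np.linalg.solve((v @ v) * np.eye(n) + lam_u * Ou, X @ v)
        # Fix u: penalized least squares update for v.
        v = np.linalg.solve((u @ u) * np.eye(m) + lam_v * Ov, X.T @ u)
    # Normalize v so the scale is carried by u.
    scale = np.linalg.norm(v)
    return u * scale, v / scale
```

In the paper, the smoothing parameters playing the role of `lam_u` and `lam_v` are chosen by conditional cross-validation: each side is cross-validated with the other side's singular vector held fixed.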
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1609-1620 |
| Number of pages | 12 |
| Journal | Journal of the American Statistical Association |
| Volume | 104 |
| Issue number | 488 |
| DOIs | |
| State | Published - Dec 2009 |
| Externally published | Yes |
Bibliographical note
KAUST Repository Item: Exported on 2020-10-01
Acknowledged KAUST grant number(s): KUS-CI-016-04
Acknowledgements: Jianhua Z. Huang’s work was partially supported by NSF grant DMS-0606580, NCI grant CA57030, and Award Number KUS-CI-016-04, made by King Abdullah University of Science and Technology (KAUST). Haipeng Shen’s work was partially supported by NSF grant DMS-0606577, CMMI-0800575, and UNC-CH R. J. Reynolds Fund Award for Junior Faculty Development.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.