Abstract
We introduce an incremental version of slow feature analysis (IncSFA), combining candid covariance-free incremental principal component analysis (CCIPCA) and covariance-free incremental minor component analysis (CIMCA). IncSFA's feature-updating complexity is linear in the input dimensionality, while batch SFA's (BSFA) is cubic. IncSFA does not need to store, or even compute, any covariance matrices. The drawback of IncSFA is data efficiency: it does not use each data point as effectively as BSFA. But IncSFA allows SFA to be tractably applied, with just a few parameters, directly to high-dimensional input streams (e.g., visual input of an autonomous agent), whereas BSFA has to resort to hierarchical receptive-field-based architectures when the input dimension is too high. Further, IncSFA's updates have simple Hebbian and anti-Hebbian forms, extending the biological plausibility of SFA. Experimental results show that IncSFA learns the same set of features as BSFA and can handle a few cases where BSFA fails. © 2012 Massachusetts Institute of Technology.
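To make the structure of these updates concrete, here is a minimal Python/NumPy sketch, not the paper's reference implementation: a CCIPCA-style Hebbian update estimates and whitens the principal subspace, and a CIMCA-style anti-Hebbian update extracts minor components of the derivative signal, which correspond to the slowest features. All names (`IncSFASketch`, `eta_pca`, `eta_mca`, `gamma`) are ours, and the learning-rate schedules and initialization are simplified placeholders rather than the paper's amnesic schedules.

```python
import numpy as np

class IncSFASketch:
    """Toy incremental SFA: CCIPCA whitening followed by anti-Hebbian
    minor-component extraction on the derivative signal. Hyperparameters
    and update forms are simplified placeholders."""

    def __init__(self, input_dim, num_pcs, num_slow,
                 eta_pca=0.01, eta_mca=0.01, gamma=1.0):
        rng = np.random.default_rng(0)
        self.mean = np.zeros(input_dim)
        self.V = rng.standard_normal((num_pcs, input_dim)) * 0.1   # unnormalized PC estimates
        self.W = rng.standard_normal((num_slow, num_pcs)) * 0.1    # slow-feature directions
        self.W /= np.linalg.norm(self.W, axis=1, keepdims=True)
        self.prev_z = None
        self.eta_pca, self.eta_mca, self.gamma = eta_pca, eta_mca, gamma

    def _ccipca_step(self, x):
        # Update the running mean, then each principal component in turn,
        # deflating the residual so later components see what is left.
        self.mean += self.eta_pca * (x - self.mean)
        u = x - self.mean          # residual used for training
        x_c = u.copy()             # centered input used for the output
        z = np.empty(len(self.V))
        for i in range(len(self.V)):
            v = self.V[i]
            n = np.linalg.norm(v) + 1e-12
            # Hebbian CCIPCA update: v absorbs the projection of u onto it.
            self.V[i] = (1 - self.eta_pca) * v + self.eta_pca * (u @ v / n) * u
            n = np.linalg.norm(self.V[i]) + 1e-12
            e = self.V[i] / n                   # eigenvector estimate
            z[i] = (x_c @ e) / np.sqrt(n)       # whitened coordinate (eigenvalue ~ n)
            u = u - (u @ e) * e                 # deflate the residual
        return z

    def _cimca_step(self, zdot):
        # Anti-Hebbian minor-component updates on the derivative signal:
        # directions of *least* variance of zdot are the slowest features.
        for i in range(len(self.W)):
            w = self.W[i]
            lateral = sum((self.W[j] @ w) * self.W[j] for j in range(i))
            w = (1 - self.eta_mca) * w \
                - self.eta_mca * ((zdot @ w) * zdot + self.gamma * lateral)
            self.W[i] = w / (np.linalg.norm(w) + 1e-12)

    def update(self, x):
        z = self._ccipca_step(np.asarray(x, dtype=float))
        if self.prev_z is not None:
            self._cimca_step(z - self.prev_z)   # discrete-time derivative
        self.prev_z = z
        return self.W @ z                       # current slow-feature outputs

# Toy demo: a slow sinusoid hidden in a fast mixture.
t = np.linspace(0, 50, 5000)
slow, fast = np.sin(0.2 * t), np.sin(7.0 * t)
X = np.stack([slow + 0.1 * fast, fast - 0.1 * slow, 0.5 * slow * fast], axis=1)
sfa = IncSFASketch(input_dim=3, num_pcs=3, num_slow=2)
for x in X:
    y = sfa.update(x)   # y[0] should come to track the slow component
```

Note that each `update` call touches only vectors of the input dimension, which is the linear per-step complexity claimed above; no covariance matrix is ever formed.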
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 2994-3024 |
| Number of pages | 31 |
| Journal | Neural Computation |
| Volume | 24 |
| Issue number | 11 |
| DOIs | |
| State | Published - Jan 1 2012 |
| Externally published | Yes |
Bibliographical note
Generated from Scopus record by KAUST IRTS on 2022-09-14

ASJC Scopus subject areas
- Cognitive Neuroscience