Abstract
This paper introduces a temporal version of Probabilistic Kernel Principal Component Analysis by using a hidden Markov model in order to obtain optimized representations of observed data through time. Recently introduced, Probabilistic Kernel Principal Component Analysis overcomes the two main disadvantages of standard Principal Component Analysis, namely, the absence of a probability density model and the lack of high-order statistical information due to its linear structure. We extend this probabilistic approach to KPCA with mixture models in time, enhancing its ability to transform and reduce time-series vectors. Results on voice-disorder databases show improved classification accuracy even with highly reduced representations. © Springer-Verlag Berlin Heidelberg 2006.
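To make the overall pipeline concrete, the sketch below is a minimal, hypothetical illustration only, not the authors' probabilistic or temporal formulation: it chains an ordinary Kernel PCA reduction (scikit-learn's `KernelPCA`) with a Gaussian hidden Markov model over the reduced sequence (hmmlearn's `GaussianHMM`), on synthetic stand-in features; all parameter values are placeholders.

```python
# Minimal sketch (assumed pipeline, not the paper's method):
# Kernel PCA reduction of frame-wise features, then an HMM over time.
import numpy as np
from sklearn.decomposition import KernelPCA
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Synthetic stand-in for one recording: T frames of D-dimensional acoustic features.
T, D = 200, 20
X = rng.normal(size=(T, D))

# Nonlinear reduction with an RBF kernel captures higher-order structure
# that linear PCA misses (though plain KernelPCA has no density model).
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1)
Z = kpca.fit_transform(X)  # (T, 3) reduced representation

# Model the reduced sequence through time with a Gaussian HMM.
hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
hmm.fit(Z)
print("per-frame log-likelihood:", hmm.score(Z) / T)
```

In a classification setting along these lines, one such HMM could be trained per class (e.g., normal vs. disordered voice) and a new reduced sequence assigned to the class whose model gives the highest likelihood.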
Original language | English (US) |
---|---|
Title of host publication | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
Publisher | Springer Verlag |
Pages | 747-754 |
Number of pages | 8 |
ISBN (Print) | 3540464794 |
DOIs | |
State | Published - Jan 1 2006 |
Externally published | Yes |