Improving long-term online prediction with decoupled extended Kalman filters

Juan A. Pérez-Ortiz, Jürgen Schmidhuber, Felix A. Gers, Douglas Eck

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations


Long Short-Term Memory (LSTM) recurrent neural networks (RNNs) outperform traditional RNNs when dealing with sequences involving not only short-term but also long-term dependencies. The decoupled extended Kalman filter learning algorithm (DEKF) works well in online environments and significantly reduces the number of training steps compared to standard gradient-descent algorithms. Previous work on LSTM, however, has always used a form of gradient descent and has not focused on true online situations. Here we combine LSTM with DEKF and show that this new hybrid improves upon the original learning algorithm when applied to online processing. © Springer-Verlag Berlin Heidelberg 2002.
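The decoupled EKF mentioned in the abstract partitions a network's weights into independent groups, each with its own error covariance matrix, so that only block-diagonal covariance terms are tracked. A minimal sketch of one such online update step is shown below on a toy linear model rather than an LSTM; the group partition, hyperparameter values, and function names are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dekf_step(groups, grads, err, R=1.0):
    """One decoupled-EKF update for a single scalar output.

    groups: list of (w, P) pairs -- weights and covariance per group
    grads:  dy/dw for each group (the EKF's measurement Jacobian H_i)
    err:    innovation, desired output minus predicted output
    R:      assumed measurement-noise variance (illustrative value)
    """
    # Global scaling shared by all groups: s = R + sum_i H_i^T P_i H_i
    s = R + sum(float(H @ P @ H) for (w, P), H in zip(groups, grads))
    for (w, P), H in zip(groups, grads):
        K = (P @ H) / s            # Kalman gain for this group only
        w += K * err               # weight update driven by the innovation
        P -= np.outer(K, H @ P)    # covariance update, decoupled per group

# Toy online regression: learn true_w from a stream of examples,
# with the 4 weights split into two decoupled groups of 2.
true_w = rng.normal(size=4)
groups = [(np.zeros(2), np.eye(2) * 100.0),
          (np.zeros(2), np.eye(2) * 100.0)]

for _ in range(500):
    x = rng.normal(size=4)
    y = np.concatenate([w for w, _ in groups]) @ x   # current prediction
    d = true_w @ x                                   # desired output
    grads = [x[:2], x[2:]]                           # dy/dw per group (linear model)
    dekf_step(groups, grads, d - y)

w_est = np.concatenate([w for w, _ in groups])
print(np.linalg.norm(w_est - true_w))  # residual error after online training
```

In the full DEKF+LSTM hybrid the Jacobians `H_i` would come from backpropagated LSTM gradients rather than the raw inputs used here; the decoupling trades the exactness of a global covariance for per-step cost that scales with group size instead of total weight count.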
Original language: English (US)
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer Verlag
Number of pages: 6
ISBN (Print): 9783540440741
State: Published - Jan 1 2002
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2022-09-14
