Abstract
This paper considers the linear-Gaussian filtering problem in large dimensions, a framework in which the Kalman filter (KF) can be computationally prohibitive. As a remedy, a hybrid scheme combining the KF with variational Bayes (VB), an approach that designs an approximation of the filtering probability density function (pdf) based on a separable form under the Kullback-Leibler divergence minimization criterion, has been proposed. Despite its approximate and cyclic-iterative character, this VBKF algorithm can provide performance comparable to that of the KF, at a reduced computational cost. Here, we resort to the memory gradient subspace (MGS) optimization in the space of separable pdfs to derive a new variant of VBKF endowed with a parallel-iterative structure. At a given iteration of the proposed MGS-VBKF, all separate marginals of the approximate filtering pdf are updated in parallel, which can lead to significant savings in computational time, especially when the state is split into sufficiently small partitions. We establish the connection between VBKF and MGS-VBKF, showing that the latter can be derived from the former by iterating its equations in parallel instead and inserting a correction term representing the contribution of the gradient and memory directions. We further extend these algorithms to the case of unsupervised systems with unknown observation noise uncertainties. The performance of the proposed filters is finally studied and compared to that of the KF and a deterministic ensemble Kalman filter through numerical experiments.
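For background, the cost the paper seeks to avoid comes from the exact KF recursion, whose update step requires dense matrix products and the inversion of the innovation covariance, operations that scale cubically with dimension. The sketch below is the standard KF predict/update cycle, not the paper's VBKF or MGS-VBKF algorithms; all dimensions and matrices are illustrative placeholders.

```python
import numpy as np

def kf_step(x, P, F, Q, H, R, y):
    """One predict/update cycle of the standard Kalman filter.

    The O(n^3) covariance products and the inversion of the innovation
    covariance S are what make the exact KF prohibitive for large state
    dimensions, motivating separable (mean-field) VB approximations.
    """
    # Predict: propagate mean and covariance through the dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fold in the observation y.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Tiny illustrative dimensions (the paper targets much larger states).
rng = np.random.default_rng(0)
n, m = 4, 2
F, H = np.eye(n), rng.standard_normal((m, n))
Q, R = 0.1 * np.eye(n), 0.5 * np.eye(m)
x, P = np.zeros(n), np.eye(n)
x, P = kf_step(x, P, F, Q, H, R, y=rng.standard_normal(m))
```

A separable VB approximation replaces the joint update above with per-partition updates of much smaller marginals, which is what the paper's parallel MGS-VBKF iteration exploits.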
Original language | English (US)
---|---
Pages (from-to) | 1-14
Number of pages | 14
Journal | IEEE Transactions on Signal Processing
State | Published - Dec 15 2022
Bibliographical note
KAUST Repository Item: Exported on 2022-12-19

ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering