Sparse Polynomial Chaos expansions using variational relevance vector machines

Panagiotis Tsilifis, Iason Papaioannou, Daniel Straub, Fabio Nobile

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


The challenges for non-intrusive Polynomial Chaos modeling lie in achieving computational efficiency and accuracy under a limited number of model simulations. These challenges can be addressed by enforcing sparsity in the series representation, retaining only the most important basis terms. In this work, we present a novel sparse Bayesian learning technique for obtaining sparse Polynomial Chaos expansions, which is based on a Relevance Vector Machine model and is trained using Variational Inference. The methodology shows great potential in high-dimensional, data-driven settings with relatively few data points and achieves user-controlled sparsity levels comparable to other methods such as compressive sensing. The proposed approach is illustrated on two numerical examples: a synthetic response function used for validation purposes, and a low-carbon steel plate with random Young's modulus and random loading, modeled by a stochastic finite element model with 38 input random variables.
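The core idea, fitting a Polynomial Chaos basis with a sparsity-inducing Bayesian prior on each coefficient, can be sketched as follows. This is a minimal illustration, not the paper's variational RVM implementation: it uses scikit-learn's `ARDRegression` (a related sparse Bayesian learning model with per-coefficient Gaussian precisions) on a one-dimensional probabilists' Hermite basis, and the synthetic response function is invented for the example.

```python
# Hedged sketch: sparse PC expansion via sparse Bayesian learning (ARD).
# NOT the authors' variational RVM; ARDRegression is a stand-in that also
# places an individual prior precision on each basis coefficient and
# prunes terms whose precisions diverge.
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)

# Synthetic data: standard normal input xi, smooth response plus noise
# (illustrative response function, not from the paper).
n = 200
xi = rng.standard_normal(n)
y = np.sin(xi) + 0.1 * xi**3 + 0.01 * rng.standard_normal(n)

# Design matrix of probabilists' Hermite polynomials He_0 ... He_p,
# the natural PC basis for Gaussian inputs.
p = 8
Psi = np.column_stack(
    [hermeval(xi, np.eye(p + 1)[k]) for k in range(p + 1)]
)

# ARD fit: coefficients with diverging precision are effectively zeroed,
# leaving a sparse set of retained basis terms.
model = ARDRegression(threshold_lambda=1e4).fit(Psi, y)
coeffs = model.coef_
retained = np.flatnonzero(np.abs(coeffs) > 1e-8)
print("retained basis terms:", retained)
```

In higher dimensions the design matrix would be built from tensorized multivariate Hermite polynomials, and the paper's variational scheme would replace the type-II maximum likelihood updates used by `ARDRegression`.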
Original language: English (US)
Article number: 109498
Journal: Journal of Computational Physics
State: Published - Sep 2020
Externally published: Yes

Bibliographical note

KAUST Repository Item: Exported on 2021-02-11
Acknowledged KAUST grant number(s): OSR-2015-CRG4-2585-01
Acknowledgements: P.T. and F.N. acknowledge the support from the King Abdullah University of Science and Technology (KAUST) Grant OSR-2015-CRG4-2585-01: “Advanced Multi-Level sampling techniques for Bayesian Inverse Problems with applications to subsurface”.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.

