HLIBCov: Parallel hierarchical matrix approximation of large covariance matrices and likelihoods with applications in parameter identification

Alexander Litvinenko, Ronald Kriemann, Marc G. Genton, Ying Sun, David E. Keyes

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

We provide more technical details about the HLIBCov package, which uses parallel hierarchical (H-) matrices to:
• approximate large dense inhomogeneous covariance matrices with log-linear computational cost and storage requirements;
• compute the matrix-vector product, Cholesky factorization, and inverse with log-linear complexity;
• identify unknown parameters of the covariance function (variance, smoothness, and covariance length).
These unknown parameters are estimated by maximizing the joint Gaussian log-likelihood function. To demonstrate the numerical performance, we identify three unknown parameters in an example with 2,000,000 locations on a desktop PC.
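To make the estimation workflow described in the abstract concrete, the following is a minimal dense NumPy/SciPy sketch for a small synthetic problem: build a Matérn covariance matrix, evaluate the joint Gaussian log-likelihood through a Cholesky factorization, and maximize it over the three parameters (variance, covariance length, smoothness). This is an illustration only, not the HLIBCov implementation: the dense factorization here costs O(n^3), whereas HLIBCov replaces it with H-matrix arithmetic to reach the log-linear cost stated in the abstract. All function names and the synthetic data below are assumptions made for the example.

import numpy as np
from scipy.spatial.distance import cdist
from scipy.special import kv, gamma
from scipy.optimize import minimize

def matern_cov(X, sigma2, ell, nu, nugget=1e-8):
    # Dense Matern covariance matrix for n locations X (shape n x d); illustrative only.
    r = cdist(X, X)
    scaled = np.sqrt(2.0 * nu) * r / ell
    C = np.full_like(r, sigma2)                      # value at distance 0 is the variance
    mask = scaled > 0
    C[mask] = (sigma2 * 2.0 ** (1.0 - nu) / gamma(nu)
               * scaled[mask] ** nu * kv(nu, scaled[mask]))
    return C + nugget * np.eye(len(X))               # small nugget for numerical stability

def neg_log_likelihood(log_theta, X, z):
    # Negative joint Gaussian log-likelihood; log-parameters keep (sigma2, ell, nu) positive.
    sigma2, ell, nu = np.exp(log_theta)
    C = matern_cov(X, sigma2, ell, nu)
    L = np.linalg.cholesky(C)                        # dense O(n^3); H-matrices make this log-linear
    alpha = np.linalg.solve(L, z)                    # forward solve, so z^T C^{-1} z = alpha^T alpha
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    n = len(z)
    return 0.5 * (log_det + alpha @ alpha + n * np.log(2.0 * np.pi))

# Synthetic example: recover (variance, covariance length, smoothness) from data z at locations X.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
z = np.linalg.cholesky(matern_cov(X, 1.0, 0.2, 1.5)) @ rng.standard_normal(500)

res = minimize(neg_log_likelihood, x0=np.log([0.5, 0.1, 1.0]), args=(X, z),
               method="Nelder-Mead")
print("estimated (sigma2, ell, nu):", np.exp(res.x))

The log-parameterization is a convenience to keep the optimizer inside the positive parameter domain; the package itself is free to use a different optimizer and parameterization.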
Original language: English (US)
Pages (from-to): 100600
Journal: MethodsX
Volume: 7
DOIs
State: Published - Jul 12 2019

Bibliographical note

KAUST Repository Item: Exported on 2020-10-01
Acknowledgements: The research reported in this publication was supported by funding from the Alexander von Humboldt Foundation (Chair of Mathematics for Uncertainty Quantification at RWTH Aachen) and the Extreme Computing Research Center (ECRC) at King Abdullah University of Science and Technology (KAUST).
