Risk Convergence of Centered Kernel Ridge Regression with Large Dimensional Data

Khalil Elkhalil, Abla Kammoun, Xiangliang Zhang, Mohamed-Slim Alouini, Tareq Y. Al-Naffouri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper carries out a large dimensional analysis of a variant of kernel ridge regression that we call centered kernel ridge regression (CKRR), also known in the literature as kernel ridge regression with offset. This modified technique is obtained by accounting for the bias in the regression problem, resulting in ordinary kernel ridge regression but with centered kernels. The analysis is carried out under the assumption that the data are drawn from a Gaussian distribution and relies heavily on tools from random matrix theory (RMT). In the regime where the data dimension and the training size grow infinitely large at a fixed ratio, and under mild assumptions controlling the data statistics, we show that both the empirical and the prediction risks converge to deterministic quantities that describe, in closed form, the performance of CKRR in terms of the data statistics and dimensions. A key insight of the proposed analysis is that, asymptotically, a large class of kernels achieves the same minimum prediction risk. This insight is validated with synthetic data.
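The centering step the abstract refers to can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the training kernel matrix is doubly centered via K_c = H K H with H = I - (1/n)11^T, the targets are centered, and the offset is recovered as the target mean. The RBF kernel and all parameter values below are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and Z (illustrative choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def ckrr_fit(X, y, lam=1e-3, gamma=1.0):
    # Centered kernel ridge regression: ridge regression on the doubly
    # centered kernel, with the intercept handled by the target mean.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = H @ K @ H                               # doubly centered kernel
    yc = y - y.mean()                            # centered targets
    alpha = np.linalg.solve(Kc + n * lam * np.eye(n), yc)
    return {"X": X, "alpha": alpha, "b": y.mean(), "gamma": gamma,
            "K_col_mean": K.mean(axis=0), "K_mean": K.mean()}

def ckrr_predict(model, Xnew):
    k = rbf_kernel(Xnew, model["X"], model["gamma"])
    # Center the test kernel consistently with the training centering.
    kc = (k - k.mean(axis=1, keepdims=True)
            - model["K_col_mean"] + model["K_mean"])
    return kc @ model["alpha"] + model["b"]
```

One consequence of the centering is that CKRR is equivariant to constant shifts of the targets: fitting on y + c shifts every prediction by exactly c, since the centered targets (and hence the dual coefficients) are unchanged.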
Original language: English (US)
Title of host publication: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Publisher: IEEE
Pages: 8763-8767
Number of pages: 5
ISBN (Print): 978-1-5090-6632-2
DOIs
State: Published - 2020

Bibliographical note

KAUST Repository Item: Exported on 2021-03-25

