Bayesian nonlinear support vector machines and discriminative factor modeling

Ricardo Henao, Xin Yuan, Lawrence Carin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

24 Scopus citations

Abstract

A new Bayesian formulation is developed for nonlinear support vector machines (SVMs), based on a Gaussian process and with the SVM hinge loss expressed as a scaled mixture of normals. We then integrate the Bayesian SVM into a factor model, in which feature learning and nonlinear classifier design are performed jointly; almost all previous work on such discriminative feature learning has assumed a linear classifier. Inference is performed with expectation conditional maximization (ECM) and Markov chain Monte Carlo (MCMC). An extensive set of experiments demonstrates the utility of using a nonlinear Bayesian SVM within discriminative feature learning and factor modeling, from the standpoints of accuracy and interpretability.
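For context, the "scaled mixture of normals" representation of the hinge loss referred to above is presumably the data-augmentation identity of Polson and Scott (2011), in which the pseudo-likelihood induced by the hinge loss for a label y_n in {-1, +1} and a latent classifier value f_n is written as a Gaussian integral over a latent scale lambda_n > 0. The following is a sketch of that identity in generic notation, not necessarily the paper's exact formulation:

\[
\exp\{-2\max(1 - y_n f_n,\, 0)\}
= \int_0^\infty \frac{1}{\sqrt{2\pi\lambda_n}}
\exp\!\left\{-\frac{(1 + \lambda_n - y_n f_n)^2}{2\lambda_n}\right\} d\lambda_n .
\]

Conditioned on the augmentation variable lambda_n, the contribution of each observation is Gaussian in f_n; in the nonlinear case f_n is a Gaussian-process evaluation, which is what makes conjugate-style ECM and MCMC updates tractable.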
Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
Pages: 1754-1762
Number of pages: 9
State: Published - Jan 1 2014
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2021-02-09

