TY - CONF
T1 - Quadratically gated mixture of experts for incomplete data classification
AU - Liao, Xuejun
AU - Li, Hui
AU - Carin, Lawrence
N1 - Generated from Scopus record by KAUST IRTS on 2021-02-09
PY - 2007/8/23
Y1 - 2007/8/23
N2 - We introduce quadratically gated mixture of experts (QGME), a statistical model for multi-class nonlinear classification. The QGME is formulated in the setting of incomplete data, where the data values are partially observed. We show that the missing values entail joint estimation of the data manifold and the classifier, which allows adaptive imputation during classifier learning. The expectation maximization (EM) algorithm is derived for joint likelihood maximization, with adaptive imputation performed analytically in the E-step. The performance of QGME is evaluated on three benchmark data sets and the results show that the QGME yields significant improvements over competing methods.
AB - We introduce quadratically gated mixture of experts (QGME), a statistical model for multi-class nonlinear classification. The QGME is formulated in the setting of incomplete data, where the data values are partially observed. We show that the missing values entail joint estimation of the data manifold and the classifier, which allows adaptive imputation during classifier learning. The expectation maximization (EM) algorithm is derived for joint likelihood maximization, with adaptive imputation performed analytically in the E-step. The performance of QGME is evaluated on three benchmark data sets and the results show that the QGME yields significant improvements over competing methods.
UR - http://portal.acm.org/citation.cfm?doid=1273496.1273566
UR - http://www.scopus.com/inward/record.url?scp=34547995476&partnerID=8YFLogxK
U2 - 10.1145/1273496.1273566
DO - 10.1145/1273496.1273566
M3 - Conference contribution
SP - 553
EP - 560
BT - Proceedings of the 24th International Conference on Machine Learning (ICML 2007), ACM International Conference Proceeding Series
ER -