Dynamic clustering via asymptotics of the dependent Dirichlet process mixture

Trevor Campbell, Miao Liu, Brian Kulis, Jonathan P. How, Lawrence Carin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

30 Scopus citations

Abstract

This paper presents a novel algorithm, based upon the dependent Dirichlet process mixture model (DDPMM), for clustering batch-sequential data containing an unknown number of evolving clusters. The algorithm is derived via a low-variance asymptotic analysis of the Gibbs sampling algorithm for the DDPMM, and provides a hard clustering with convergence guarantees similar to those of the k-means algorithm. Empirical results from a synthetic test with moving Gaussian clusters and a test with real ADS-B aircraft trajectory data demonstrate that the algorithm requires orders of magnitude less computational time than contemporary probabilistic and hard clustering algorithms, while providing higher accuracy on the examined datasets.
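For intuition only: low-variance asymptotics of the Gibbs sampler for an ordinary Dirichlet process mixture is known to yield a DP-means-style hard clustering objective, and the abstract describes an extension of this idea to the dependent Dirichlet process for batch-sequential data. The sketch below is a minimal single-batch DP-means-style assignment/update loop in Python; the new-cluster penalty lam, the function name dp_means, and the stopping rule are illustrative assumptions, not the paper's full algorithm, which additionally tracks how clusters evolve from batch to batch.

    # Minimal sketch of a DP-means-style hard clustering step on ONE batch.
    # Illustrative simplification only: `lam` (new-cluster penalty) and the
    # interface are assumptions; the paper's algorithm also handles cluster
    # evolution across batches, which is omitted here.
    import numpy as np

    def dp_means(X, lam, max_iter=100):
        """Hard clustering with penalty `lam` for opening a new cluster."""
        centers = [X[0].copy()]              # seed with the first point
        labels = np.zeros(len(X), dtype=int)
        for _ in range(max_iter):
            changed = False
            # Assignment step: nearest center, or a new cluster if too far.
            for i, x in enumerate(X):
                d2 = [np.sum((x - c) ** 2) for c in centers]
                k = int(np.argmin(d2))
                if d2[k] > lam:              # opening a cluster is cheaper
                    centers.append(x.copy())
                    k = len(centers) - 1
                if labels[i] != k:
                    labels[i] = k
                    changed = True
            # Update step: each center becomes the mean of its members.
            for k in range(len(centers)):
                members = X[labels == k]
                if len(members) > 0:
                    centers[k] = members.mean(axis=0)
            if not changed:                  # assignments stable: local optimum
                break
        return np.array(centers), labels

    # Example: two well-separated Gaussian blobs.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(3, 0.1, (50, 2))])
    centers, labels = dp_means(X, lam=1.0)
    print(len(centers), "clusters found")

In this simplified view, the penalty lam plays the role that the DP concentration parameter plays in the mixture model: larger values discourage the creation of new clusters.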
Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
State: Published - Jan 1 2013
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2021-02-09

