Sequence Alignment with Dirichlet Process Mixtures

Ieva Kazlauskaite, Ivan Ustyuzhaninov, Carl Henrik Ek, Neill Campbell

Research output: Contribution to conference › Paper › peer-review



We present a probabilistic model for unsupervised alignment of high-dimensional time-warped sequences based on the Dirichlet Process Mixture Model (DPMM). We follow the approach introduced in [Kazlauskaite, 2018] of simultaneously representing each data sequence as a composition of a true underlying function and a time-warp, both of which are modelled using Gaussian processes (GPs), and aligning the underlying functions using an unsupervised alignment method. In [Kazlauskaite, 2018] the alignment is performed using the GP latent variable model (GP-LVM) as a model of sequences; our main contribution is extending this approach to a DPMM, which allows us to align the sequences temporally and cluster them at the same time. We show that the DPMM achieves competitive results in comparison to the GP-LVM on synthetic and real-world data sets, and discuss the different properties of the estimated underlying functions and the time-warps favoured by these models.
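The generative assumption in the abstract (each observed sequence is a true underlying function composed with a monotonic time-warp, each drawn from a GP prior) can be illustrated with a minimal NumPy sketch. The kernel choices, the exp-cumsum construction of the monotonic warp, and all parameters below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.2, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)

# Sample a monotonic time-warp g(x): draw from a GP prior, make the values
# positive with exp(), cumulatively sum them, and rescale the result to [0, 1].
K_warp = rbf_kernel(x, x, lengthscale=0.3) + 1e-8 * np.eye(len(x))
u = rng.multivariate_normal(np.zeros(len(x)), K_warp)
g = np.cumsum(np.exp(u))
g = (g - g[0]) / (g[-1] - g[0])

# Sample the underlying function f from a second GP prior, evaluated at the
# warped inputs g(x); the observed sequence is f(g(x)) plus Gaussian noise.
K_f = rbf_kernel(g, g, lengthscale=0.15) + 1e-8 * np.eye(len(g))
f_of_g = rng.multivariate_normal(np.zeros(len(g)), K_f)
y = f_of_g + 0.05 * rng.standard_normal(len(g))

assert np.all(np.diff(g) >= 0.0)  # the warp is monotonically non-decreasing
```

Repeating this for several sequences with a shared f but different warps g yields a toy data set of the kind the model is designed to align; the paper's contribution is inferring f and g jointly while clustering sequences with a DPMM, which this sketch does not attempt.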
Original language: English
Number of pages: 6
Publication status: Acceptance date - 12 Nov 2018
Event: All of Bayesian Nonparametrics (Especially the Useful Bits): A workshop at the Thirty-Second Annual Conference on Neural Information Processing Systems (NeurIPS 2018), Palais des Congrès de Montréal, Montréal, Canada
Duration: 7 Dec 2018


Workshop: All of Bayesian Nonparametrics (Especially the Useful Bits)
Abbreviated title: BNP@NeurIPS 2018


Keywords

  • Gaussian processes
  • Dirichlet process

ASJC Scopus subject areas

  • Artificial Intelligence


