
Abstract

We present a probabilistic model for unsupervised alignment of high-dimensional time-warped sequences based on the Dirichlet Process Mixture Model (DPMM). We follow the approach introduced in [Kazlauskaite, 2018] of simultaneously representing each data sequence as a composition of a true underlying function and a time-warp, both of which are modelled using Gaussian processes (GPs), and aligning the underlying functions using an unsupervised alignment method. In [Kazlauskaite, 2018] the alignment is performed using the GP latent variable model (GP-LVM) as a model of sequences, while our main contribution is extending this approach to using a DPMM, which allows us to align the sequences temporally and cluster them at the same time. We show that the DPMM achieves competitive results in comparison to the GP-LVM on synthetic and real-world data sets, and discuss the different properties of the estimated underlying functions and the time-warps favoured by these models.
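To make the structure described in the abstract concrete, below is a minimal, self-contained sketch (not the authors' implementation): it generates synthetic sequences as a shared underlying function composed with a random monotone time-warp, and uses scikit-learn's truncated Dirichlet-process mixture (BayesianGaussianMixture with a Dirichlet-process weight prior) to cluster the underlying functions. The underlying functions, the gamma-increment warp, and the synthetic data are illustrative assumptions; the joint GP-based inference of warps, functions and alignment described in the abstract is not reproduced here.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)

def random_monotone_warp(t, rng):
    # Toy stand-in for a GP-distributed time-warp: a random, strictly
    # increasing map of [0, 1] onto itself built from gamma increments.
    increments = rng.gamma(shape=2.0, scale=1.0, size=t.size)
    warp = np.cumsum(increments)
    return (warp - warp[0]) / (warp[-1] - warp[0])

# Two hypothetical "true" underlying functions; each cluster of observed
# sequences shares one of them.
underlying = [lambda x: np.sin(2.0 * np.pi * x),
              lambda x: np.cos(3.0 * np.pi * x)]

# Observed data: each sequence is its cluster's underlying function composed
# with its own random time-warp, plus observation noise. The full model in
# the paper would infer the warps and the grouping jointly from `observed`.
observed = []
for f in underlying:
    for _ in range(20):
        w = random_monotone_warp(t, rng)
        observed.append(f(w) + 0.05 * rng.standard_normal(t.size))
observed = np.asarray(observed)
print("observed (warped) sequences:", observed.shape)

# This sketch skips the joint GP inference and simply clusters noisy copies
# of the *unwarped* underlying functions to illustrate the DPMM component.
aligned = np.repeat(np.asarray([f(t) for f in underlying]), 20, axis=0)
aligned = aligned + 0.05 * rng.standard_normal(aligned.shape)

dpmm = BayesianGaussianMixture(
    n_components=10,                                    # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="diag",
    max_iter=500,
    random_state=0,
)
assignments = dpmm.fit_predict(aligned)
print("inferred cluster assignments:", assignments)

Note that scikit-learn realises the DP prior through a truncated stick-breaking approximation (here 10 components), so the number of occupied clusters is inferred from the data rather than fixed in advance.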
Original language: English
Pages: 1-6
Number of pages: 6
Publication status: Accepted/In press - 12 Nov 2018
Event: All of Bayesian Nonparametrics (Especially the Useful Bits): A workshop at the Thirty-Second Annual Conference on Neural Information Processing Systems (NeurIPS 2018), Palais des Congrès de Montréal, Montréal, Canada
Duration: 7 Dec 2018 – 7 Dec 2018
https://sites.google.com/view/nipsbnp2018/home

Workshop

Workshop: All of Bayesian Nonparametrics (Especially the Useful Bits)
Abbreviated title: BNP@NeurIPS 2018
Country: Canada
City: Montréal
Period: 7/12/18 – 7/12/18
Internet address: https://sites.google.com/view/nipsbnp2018/home

Keywords

  • Gaussian processes
  • Dirichlet process

ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Kazlauskaite, I., Ustyuzhaninov, I., Ek, C. H., & Campbell, N. (Accepted/In press). Sequence Alignment with Dirichlet Process Mixtures. 1-6. Paper presented at All of Bayesian Nonparametrics (Especially the Useful Bits), Montréal, Canada.

@conference{46cd3042228e4b33a6c205764c108e05,
title = "Sequence Alignment with Dirichlet Process Mixtures",
abstract = "We present a probabilistic model for unsupervised alignment of high-dimensional time-warped sequences based on the Dirichlet Process Mixture Model (DPMM). We follow the approach introduced in [Kazlauskaite,2018] of simultaneously representing each data sequence as a composition of a true underlying function and a time-warping, both of which are modelled using Gaussian processes (GPs), and aligning the underlying functions using an unsupervised alignment method. In [Kazlauskaite,2018] the alignment is performed using the GP latent variable model (GP-LVM) as a model of sequences, while our main contribution is extending this approach to using DPMM, which allows us to align the sequences temporally and cluster them at the same time. We show that the DPMM achieves competitive results in comparison to the GP-LVM on synthetic and real-world data sets, and discuss the different properties of the estimated underlying functions and the time-warps favoured by these models.",
keywords = "Gaussian processes, Dirichlet process",
author = "Ieva Kazlauskaite and Ivan Ustyuzhaninov and Ek, {Carl Henrik} and Neill Campbell",
year = "2018",
month = "11",
day = "12",
language = "English",
pages = "1--6",
note = "All of Bayesian Nonparametrics (Especially the Useful Bits) : A workshop at the Thirty-Second Annual Conference on Neural Information Processing Systems (NeurIPS 2018)., BNP@NeurIPS 2018 ; Conference date: 07-12-2018 Through 07-12-2018",
url = "https://sites.google.com/view/nipsbnp2018/home",

}
