DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.
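The Dirichlet process prior mentioned in the abstract can be illustrated with a truncated stick-breaking construction: mixture weights are drawn from Beta sticks, and each output dimension is assigned to a latent group under those weights. This is a generic, hedged sketch of the DP mechanism only (function names, the truncation level, and the concentration parameter are illustrative assumptions, not taken from the paper's implementation):

```python
import numpy as np

def stick_breaking_weights(alpha, truncation, rng):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Draws beta_k ~ Beta(1, alpha) sticks and converts them to mixture
    weights pi_k = beta_k * prod_{j<k} (1 - beta_j).
    """
    betas = rng.beta(1.0, alpha, size=truncation)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
    weights = betas * remaining
    weights[-1] = 1.0 - weights[:-1].sum()  # absorb leftover stick mass
    return weights

def assign_dimensions(num_dims, weights, rng):
    """Assign each output dimension to a latent group under the DP prior."""
    return rng.choice(len(weights), size=num_dims, p=weights)

rng = np.random.default_rng(0)
pi = stick_breaking_weights(alpha=1.0, truncation=20, rng=rng)
groups = assign_dimensions(num_dims=12, weights=pi, rng=rng)
```

In the model described above, each such group of dimensions would share a Gaussian process mapping from the latent space; the sketch shows only how the DP induces the partition.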
Language: English
Title of host publication: Proceedings of the 36th International Conference on Machine Learning
Editors: Kamalika Chaudhuri, Ruslan Salakhutdinov
Place of Publication: Long Beach, California, USA
Publisher: PMLR
Pages: 3682-3691
Number of pages: 10
Volume: 97
Status: Published - 9 Jun 2019
Event: Thirty-sixth International Conference on Machine Learning - Long Beach Convention Center, Long Beach, United States
Duration: 9 Jun 2019 - 15 Jun 2019
Conference number: 36
https://icml.cc/

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
ISSN (Electronic): 2368-5417

Conference

Conference: Thirty-sixth International Conference on Machine Learning
Abbreviated title: ICML
Country: United States
City: Long Beach
Period: 9/06/19 - 15/06/19
Internet address: https://icml.cc/

Cite this

Lawrence, A., Ek, C. H., & Campbell, N. (2019). DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures. In K. Chaudhuri, & R. Salakhutdinov (Eds.), Proceedings of the 36th International Conference on Machine Learning (Vol. 97, pp. 3682-3691). (Proceedings of Machine Learning Research). Long Beach, California, USA: PMLR.


Lawrence, A, Ek, CH & Campbell, N 2019, DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures. in K Chaudhuri & R Salakhutdinov (eds), Proceedings of the 36th International Conference on Machine Learning. vol. 97, Proceedings of Machine Learning Research, PMLR, Long Beach, California, USA, pp. 3682-3691, Thirty-sixth International Conference on Machine Learning, Long Beach, United States, 9/06/19.
Lawrence A, Ek CH, Campbell N. DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures. In Chaudhuri K, Salakhutdinov R, editors, Proceedings of the 36th International Conference on Machine Learning. Vol. 97. Long Beach, California, USA: PMLR. 2019. p. 3682-3691. (Proceedings of Machine Learning Research).
@inproceedings{3843e10d7a2e4769ae499f10b0013cba,
title = "DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures",
abstract = "We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.",
author = "Andrew Lawrence and Ek, {Carl Henrik} and Neill Campbell",
year = "2019",
month = "6",
day = "9",
language = "English",
volume = "97",
series = "Proceedings of Machine Learning Research",
publisher = "PMLR",
pages = "3682--3691",
editor = "Kamalika Chaudhuri and Ruslan Salakhutdinov",
booktitle = "Proceedings of the 36th International Conference on Machine Learning",
}

TY  - GEN
T1  - DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures
AU  - Lawrence, Andrew
AU  - Ek, Carl Henrik
AU  - Campbell, Neill
PY  - 2019/6/9
Y1  - 2019/6/9
N2  - We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.
AB  - We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model. We demonstrate the efficacy of our approach via analysis of discovered structure and superior quantitative performance on missing data imputation.
UR  - http://proceedings.mlr.press/v97/lawrence19a.html
UR  - http://proceedings.mlr.press/v97/lawrence19a/lawrence19a-supp.pdf
M3  - Conference contribution
VL  - 97
T3  - Proceedings of Machine Learning Research
SP  - 3682
EP  - 3691
BT  - Proceedings of the 36th International Conference on Machine Learning
A2  - Chaudhuri, Kamalika
A2  - Salakhutdinov, Ruslan
PB  - PMLR
CY  - Long Beach, California, USA
ER  -