Distributed Gaussian Processes
File(s): icml2015.pdf (626.03 KB)
Accepted version
Author(s)
Deisenroth, MP
Ng, JW
Type
Conference Paper
Abstract
To scale Gaussian processes (GPs) to large data sets we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.
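The product-of-experts idea in the abstract — train independent GP experts on data shards, then recombine their closed-form predictions — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the kernel, hyperparameters, and shard assignment are assumptions, and only the rBCM combination rule (a precision-weighted sum with differential-entropy weights beta_k and a prior-precision correction) follows the model described in the paper.

```python
import numpy as np

def rbf(X1, X2, ls=1.0, sf=1.0):
    # Squared-exponential kernel (lengthscale ls, signal std sf) -- an assumed choice.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return sf**2 * np.exp(-0.5 * d2 / ls**2)

def gp_expert_predict(Xk, yk, Xs, noise=0.1, sf=1.0):
    # Exact GP posterior for one expert trained only on its shard (Xk, yk).
    K = rbf(Xk, Xk, sf=sf) + noise**2 * np.eye(len(Xk))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, yk))
    Ks = rbf(Xk, Xs, sf=sf)
    v = np.linalg.solve(L, Ks)
    mu = Ks.T @ alpha
    var = sf**2 - np.sum(v**2, axis=0)  # latent predictive variance
    return mu, var

def rbcm_predict(shards, Xs, noise=0.1, sf=1.0):
    # rBCM combination: each expert contributes beta_k / var_k to the precision,
    # with beta_k = 0.5 * (log prior_var - log var_k), and the prior precision
    # is added with weight (1 - sum_k beta_k). Experts are independent, so this
    # loop parallelises/distributes trivially.
    prior_var = sf**2
    prec = np.zeros(len(Xs))
    weighted_mu = np.zeros(len(Xs))
    beta_sum = np.zeros(len(Xs))
    for Xk, yk in shards:
        mu_k, var_k = gp_expert_predict(Xk, yk, Xs, noise, sf)
        beta_k = 0.5 * (np.log(prior_var) - np.log(var_k))
        prec += beta_k / var_k
        weighted_mu += beta_k * mu_k / var_k
        beta_sum += beta_k
    prec += (1.0 - beta_sum) / prior_var  # prior correction term
    var = 1.0 / prec
    return var * weighted_mu, var
```

Each expert only ever factorises its own shard's kernel matrix, which is where the small memory footprint and straightforward parallelisation come from; the final recombination is a cheap elementwise reduction over expert outputs.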
Date Issued
2015-06-01
Date Acceptance
2015-04-26
Citation
JMLR: Workshop and Conference Proceedings, 2015, 37
Publisher
Journal of Machine Learning Research
Journal / Book Title
JMLR: Workshop and Conference Proceedings
Volume
37
Copyright Statement
© Copyright 2015 by the author(s).
Identifier
http://jmlr.org/proceedings/papers/v37/
Source
2015 International Conference on Machine Learning (ICML)
Publication Status
Published
Publisher URL
Start Date
2015-07-06
Finish Date
2015-07-11
Coverage Spatial
Lille, France