Distributed Gaussian Processes

File: icml2015.pdf (Accepted version, 626.03 kB, Adobe PDF)
Title: Distributed Gaussian Processes
Author(s): Deisenroth, MP
Ng, JW
Item Type: Conference Paper
Abstract: To scale Gaussian processes (GPs) to large data sets, we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.
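The closed-form recombination of independent experts described in the abstract can be sketched as follows. This is an illustrative implementation, not code from the paper: the differential-entropy weights follow the rBCM construction, but the function name is invented and a zero-mean GP prior is assumed.

```python
import numpy as np

def rbcm_combine(means, variances, prior_var):
    """Combine independent GP experts' predictions at one test input
    using a robust-BCM-style weighted product of experts.

    means, variances: shape-(K,) arrays holding each expert's
    predictive mean and variance at the test input.
    prior_var: the GP prior variance k(x*, x*) at that input.
    Assumes a zero-mean prior (illustrative simplification).
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Differential-entropy weights: an expert whose posterior variance
    # shrinks further below the prior variance gets more influence.
    beta = 0.5 * (np.log(prior_var) - np.log(variances))

    # Combined precision: weighted sum of expert precisions plus a
    # prior correction, so the model falls back to the prior when all
    # experts are uninformative (all beta -> 0).
    precision = np.sum(beta / variances) + (1.0 - np.sum(beta)) / prior_var
    var = 1.0 / precision
    mean = var * np.sum(beta * means / variances)
    return mean, var
```

Because each expert only contributes a mean, a variance, and a scalar weight, the combination is embarrassingly parallel: experts can run on separate machines and ship back two numbers per test input.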
Publication Date: 1-Jun-2015
Date of Acceptance: 26-Apr-2015
URI: http://hdl.handle.net/10044/1/22230
Publisher: Journal of Machine Learning Research
Journal / Book Title: JMLR: Workshop and Conference Proceedings
Volume: 37
Copyright Statement: © Copyright 2015 by the author(s).
Conference Name: 2015 International Conference on Machine Learning (ICML)
Publication Status: Published
Publisher URL: http://jmlr.org/proceedings/papers/v37/
Start Date: 2015-07-06
Finish Date: 2015-07-11
Conference Place: Lille, France
Appears in Collections: Computing
Items in Spiral are protected by copyright, with all rights reserved, unless otherwise indicated.