On the Sampling Problem for Kernel Quadrature
File(s)
1706.03369v1.pdf (1.25 MB)
Accepted version
Author(s)
Briol, F-X
Oates, CJ
Cockayne, J
Chen, WY
Girolami, M
Type
Conference Paper
Abstract
The standard Kernel Quadrature method for numerical integration with random point sets (also called Bayesian Monte Carlo) is known to converge in root mean square error at a rate determined by the ratio $s/d$, where $s$ and $d$ encode the smoothness and dimension of the integrand. However, an empirical investigation reveals that the rate constant $C$ is highly sensitive to the distribution of the random points. In contrast to standard Monte Carlo integration, for which optimal importance sampling is well-understood, the sampling distribution that minimises $C$ for Kernel Quadrature does not admit a closed form. This paper argues that the practical choice of sampling distribution is an important open problem. One solution is considered: a novel automatic approach based on adaptive tempering and sequential Monte Carlo.
Empirical results demonstrate that a dramatic reduction in integration error, of up to 4 orders of magnitude, can be achieved with the proposed method.
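For reference, the sketch below illustrates the standard kernel quadrature (Bayesian Monte Carlo) estimator that the abstract refers to, not the paper's proposed sampling scheme or its experimental setup. It assumes, purely for illustration, a Gaussian kernel with length-scale ell, a standard Gaussian target measure (so the kernel mean embedding has a closed form), points drawn i.i.d. from the target, and an arbitrary test integrand.

```python
# Minimal sketch of kernel quadrature / Bayesian Monte Carlo against a
# standard Gaussian target N(0, I), with points drawn i.i.d. from the target.
# The Gaussian kernel, its closed-form mean embedding, and the test integrand
# below are illustrative assumptions, not the paper's setup.
import numpy as np

def gaussian_kernel(X, Y, ell=1.0):
    """k(x, y) = exp(-||x - y||^2 / (2 ell^2)) for rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * ell**2))

def kernel_mean(Y, ell=1.0):
    """z(y) = int k(x, y) N(x; 0, I) dx, closed form for the Gaussian kernel."""
    d = Y.shape[1]
    return (ell**2 / (ell**2 + 1.0)) ** (d / 2.0) * np.exp(
        -np.sum(Y**2, 1) / (2.0 * (ell**2 + 1.0)))

def kernel_quadrature(f, X, ell=1.0, jitter=1e-8):
    """Estimate int f(x) N(x; 0, I) dx as z^T K^{-1} f(X) given points X."""
    n = len(X)
    K = gaussian_kernel(X, X, ell) + jitter * np.eye(n)  # Gram matrix (regularised)
    z = kernel_mean(X, ell)                              # kernel mean embedding at X
    weights = np.linalg.solve(K, z)                      # quadrature weights w = K^{-1} z
    return weights @ f(X)

rng = np.random.default_rng(0)
d, n = 2, 100
X = rng.standard_normal((n, d))          # points sampled from the target N(0, I)
f = lambda X: np.exp(-np.sum(X**2, 1))   # test integrand; exact value is (1/3)^(d/2)
print(kernel_quadrature(f, X), (1.0 / 3.0) ** (d / 2))
```

The abstract's point is that the accuracy of this estimator depends strongly on the distribution from which X is drawn; the paper proposes choosing that distribution automatically via adaptive tempering and sequential Monte Carlo rather than sampling directly from the target as in this sketch.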
Date Issued
2017-01-01
Date Acceptance
2017-04-14
Citation
Proceedings of the 34th International Conference on Machine Learning, 70, pp.586-595
Publisher
PMLR
Start Page
586
End Page
595
Journal / Book Title
Proceedings of the 34th International Conference on Machine Learning
Volume
70
Copyright Statement
Copyright 2017 by the author(s).
Identifier
http://arxiv.org/abs/1706.03369v1
Source
International Conference on Machine Learning (ICML)
Subjects
stat.ML
cs.LG
math.NA
stat.CO
Publication Status
Published
Coverage Spatial
Sydney, Australia