Stein Points

File: 1803.10161v4.pdf (Accepted version, 7.03 MB, Adobe PDF)
Title: Stein Points
Authors: Chen, WY
Mackey, L
Gorham, J
Briol, F-X
Oates, CJ
Item Type: Conference Paper
Abstract: An important task in computational statistics and machine learning is to approximate a posterior distribution $p(x)$ with an empirical measure supported on a set of representative points $\{x_i\}_{i=1}^n$. This paper focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when $n$ is small. To this end, we present 'Stein Points'. The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and $p(x)$. Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
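
The greedy construction mentioned in the abstract can be illustrated with a minimal sketch: each new point is chosen, over a fixed candidate grid, to minimise a running kernel Stein discrepancy objective. The one-dimensional standard-normal target, the inverse-multiquadric base kernel, the candidate grid and all parameter values below are illustrative assumptions, not the settings used in the paper.

import numpy as np

def score(x):
    # Score function d/dx log p(x); here the target p is N(0, 1) (illustrative choice).
    return -x

def stein_kernel(x, y):
    # Langevin Stein kernel k0 built from the IMQ base kernel k(x, y) = (1 + (x - y)^2)^(-1/2).
    r = x - y
    u = 1.0 + r ** 2
    k = u ** -0.5
    dkx = -r * u ** -1.5                            # d/dx k
    dky = r * u ** -1.5                             # d/dy k
    dkxy = u ** -1.5 - 3.0 * r ** 2 * u ** -2.5     # d^2/(dx dy) k
    return dkxy + score(x) * dky + score(y) * dkx + score(x) * score(y) * k

def greedy_stein_points(n, candidates):
    # Greedy rule: pick the candidate minimising k0(x, x)/2 + sum_j k0(x_j, x),
    # which greedily reduces the kernel Stein discrepancy of the point set.
    points = []
    for _ in range(n):
        objective = 0.5 * stein_kernel(candidates, candidates)
        for xj in points:
            objective = objective + stein_kernel(xj, candidates)
        points.append(candidates[np.argmin(objective)])
    return np.array(points)

if __name__ == "__main__":
    grid = np.linspace(-4.0, 4.0, 801)   # illustrative candidate set
    x = greedy_stein_points(10, grid)
    print(np.sort(x))                    # representative points for N(0, 1)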
Issue Date: 10-Jul-2018
Date of Acceptance: 11-May-2018
URI: http://hdl.handle.net/10044/1/63009
Start Page: 844
End Page: 853
Journal / Book Title: Proceedings of the 35th International Conference on Machine Learning (ICML)
Volume: PMLR 80
Copyright Statement: © 2018 by the author(s)
Conference Name: International Conference on Machine Learning
Keywords: stat.CO
cs.LG
stat.ML
Publication Status: Published
Start Date: 2018-07-10
Finish Date: 2018-07-15
Conference Place: Stockholm, Sweden
Appears in Collections: Mathematics
Statistics
Faculty of Natural Sciences


