Nonasymptotic analysis of Stochastic Gradient Hamiltonian Monte Carlo under local conditions for nonconvex optimization
File(s): 21-1423.pdf (510.05 KB)
Published version
Author(s)
Akyildiz, Omer Deniz
Sabanis, Sotirios
Type
Journal Article
Abstract
We provide a nonasymptotic analysis of the convergence of the stochastic gradient Hamiltonian Monte Carlo (SGHMC) to a target measure in Wasserstein-2 distance without assuming log-concavity. Our analysis quantifies key theoretical properties of the SGHMC as a sampler under local conditions, which significantly improves upon previous results. In particular, we prove that the Wasserstein-2 distance between the target and the law of the SGHMC is uniformly controlled by the step-size of the algorithm, thereby demonstrating that the SGHMC can provide high-precision results uniformly in the number of iterations. The analysis also allows us to obtain nonasymptotic bounds for nonconvex optimization problems under local conditions and implies that the SGHMC, when viewed as a nonconvex optimizer, converges to a global minimum with the best known rates. We apply our results to obtain nonasymptotic bounds for scalable Bayesian inference and nonasymptotic generalization bounds.
Date Issued
2024-01-01
Date Acceptance
2024-01-28
Citation
Journal of Machine Learning Research, 2024, 25 (113), pp.1-34
ISSN
1532-4435
Publisher
Microtome Publishing
Start Page
1
End Page
34
Journal / Book Title
Journal of Machine Learning Research
Volume
25
Issue
113
Copyright Statement
©2024 Ömer Deniz Akyildiz and Sotirios Sabanis.
License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided
at http://jmlr.org/papers/v25/21-1423.html.
License URL
https://creativecommons.org/licenses/by/4.0/
Identifier
https://jmlr.org/papers/v25/21-1423.html
Publication Status
Published