Global convergence of optimized adaptive importance samplers
File(s): 2201.00409v2.pdf (876.37 KB)
Accepted version
Author(s)
Akyildiz, Omer Deniz
Type
Journal Article
Abstract
We analyze the optimized adaptive importance sampler (OAIS) for performing Monte Carlo integration with general proposals. We leverage a classical result which shows that the bias and the mean-squared error (MSE) of importance sampling scale with the χ²-divergence between the target and the proposal, and develop a scheme which performs global optimization of the χ²-divergence. While it is known that this quantity is convex for exponential-family proposals, the case of general proposals has been an open problem. We close this gap by utilizing nonasymptotic bounds for stochastic gradient Langevin dynamics (SGLD) for the global optimization of the χ²-divergence, and derive nonasymptotic bounds for the MSE by leveraging recent results from the non-convex optimization literature. The resulting AIS schemes have explicit theoretical guarantees that are uniform-in-time.
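The scheme the abstract describes (run SGLD on the proposal parameters to drive down the χ²-divergence, then importance-sample with the optimized proposal) can be sketched in a toy setting. This is an illustrative sketch only, not the paper's algorithm or experiment: the 1-D unit-variance Gaussian target and proposal, the step size, the inverse temperature, and the gradient clipping are all our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_star = 2.0  # toy target is N(mu_star, 1) (our choice, not from the paper)

def log_target(x):
    return -0.5 * (x - mu_star) ** 2   # unnormalized log pi

def log_proposal(x, theta):
    return -0.5 * (x - theta) ** 2     # unnormalized log q_theta

def grad_rho_hat(theta, n=200):
    """Monte Carlo gradient of rho(theta) = E_{q_theta}[w(x)^2], w = pi/q_theta.

    Uses grad rho = -E_{q_theta}[w^2 * grad_theta log q_theta], with
    grad_theta log q_theta(x) = (x - theta) for a unit-variance Gaussian.
    """
    x = rng.normal(theta, 1.0, n)
    w2 = np.exp(2.0 * (log_target(x) - log_proposal(x, theta)))
    return -np.mean(w2 * (x - theta))

# SGLD on theta: a gradient step plus injected Gaussian noise scaled by the
# inverse temperature beta. Clipping the noisy gradient is a practical
# stabilizer we add for this sketch (chi^2 weights are heavy-tailed).
theta, gamma, beta = 0.0, 0.02, 1e4
for _ in range(2000):
    g = np.clip(grad_rho_hat(theta), -10.0, 10.0)
    theta += -gamma * g + np.sqrt(2.0 * gamma / beta) * rng.normal()

# Self-normalized importance sampling with the optimized proposal.
x = rng.normal(theta, 1.0, 2000)
w = np.exp(log_target(x) - log_proposal(x, theta))
is_mean = np.sum(w * x) / np.sum(w)  # estimates E_pi[X] = mu_star
```

With the seed above, theta drifts toward mu_star and the self-normalized estimate lands close to it; the paper's contribution is precisely that such SGLD-driven optimization of the χ²-divergence yields uniform-in-time MSE bounds without assuming an exponential-family proposal.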
Date Issued
2025-12-01
Date Acceptance
2024-01-26
Citation
Foundations of Data Science, 2025, 7 (4), pp.944-962
ISSN
2639-8001
Publisher
American Institute of Mathematical Sciences
Start Page
944
End Page
962
Journal / Book Title
Foundations of Data Science
Volume
7
Issue
4
Copyright Statement
Copyright © 2024 American Institute of Mathematical Sciences. This is the author’s accepted manuscript made available under a CC-BY licence in accordance with Imperial’s Research Publications Open Access policy (www.imperial.ac.uk/oa-policy)
License URL
Identifier
https://www.aimsciences.org/article/doi/10.3934/fods.2024003
Publication Status
Published
Date Publish Online
2024-02-01