AdaGeo: adaptive geometric learning for optimization and sampling

File: abbati18a.pdf (Published version, Adobe PDF, 870.83 kB)
Title: AdaGeo: adaptive geometric learning for optimization and sampling
Authors: Abbati, G
Tosi, A
Osborne, M
Flaxman, SR
Item Type: Conference Paper
Abstract: Gradient-based optimization and Markov Chain Monte Carlo sampling can be found at the heart of several machine learning methods. In high-dimensional settings, well-known issues such as slow mixing, non-convexity and correlations can hinder the algorithms’ efficiency. In order to overcome these difficulties, we propose AdaGeo, a preconditioning framework for adaptively learning the geometry of the parameter space during optimization or sampling. In particular, we use the Gaussian process latent variable model (GP-LVM) to represent a lower-dimensional embedding of the parameters, identifying the underlying Riemannian manifold on which the optimization or sampling is taking place. Samples or optimization steps are consequently proposed based on the geometry of the manifold. We apply our framework to stochastic gradient descent, stochastic gradient Langevin dynamics, and stochastic gradient Riemannian Langevin dynamics, and show performance improvements for both optimization and sampling.
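To make the preconditioning idea in the abstract concrete, the following is a minimal NumPy sketch of a single stochastic gradient Langevin dynamics (SGLD) step with an optional preconditioning matrix. This is not the authors' code: the function name, signature, and toy target are illustrative assumptions, and the preconditioner is supplied by hand rather than learned from a GP-LVM embedding as AdaGeo proposes.

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng, precond=None):
    """One (optionally preconditioned) SGLD update.

    theta         : current parameter vector
    grad_log_post : stochastic estimate of the gradient of the log posterior
    step_size     : the step size epsilon_t
    precond       : optional positive-definite matrix G; AdaGeo would derive
                    such a metric from a learned GP-LVM embedding, but here
                    it is simply user-supplied (an illustrative assumption).
    """
    if precond is None:
        precond = np.eye(theta.size)
    # Drift toward higher posterior density, scaled by the metric G.
    drift = 0.5 * step_size * precond @ grad_log_post
    # Gaussian noise with covariance step_size * G keeps the chain sampling
    # rather than merely optimizing.
    noise = rng.multivariate_normal(np.zeros(theta.size), step_size * precond)
    return theta + drift + noise

# Toy usage: sample from a 2-D standard Gaussian posterior.
rng = np.random.default_rng(0)
theta = np.array([3.0, -2.0])
for _ in range(1000):
    grad = -theta  # gradient of log N(0, I) at theta
    theta = sgld_step(theta, grad, step_size=0.01, rng=rng)
```

With `precond=None` this reduces to plain SGLD; replacing it with a learned Riemannian metric is the change the paper studies.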
Issue Date: 9-Apr-2018
Date of Acceptance: 22-Dec-2017
Publisher: PMLR
Start Page: 226
End Page: 234
Journal / Book Title: Proceedings of Machine Learning Research
Volume: 84
Copyright Statement: © 2018 by the author(s). Available under a CC BY Attribution Licence.
Conference Name: 21st International Conference on Artificial Intelligence and Statistics (AISTATS)
Publication Status: Published
Start Date: 2018-04-09
Finish Date: 2018-04-11
Conference Place: Lanzarote, Spain
Open Access location:
Appears in Collections:Mathematics

