Improving sample average approximation using distributional robustness
File(s)
SubmittedManuscriptApril15.pdf (694.95 KB)
Accepted version
Author(s)
Anderson, Edward
Philpott, Andrew
Type
Journal Article
Abstract
We consider stochastic optimization problems in which we aim to minimize the expected value of an objective function with respect to an unknown distribution of random parameters. We analyse the out-of-sample
performance of solutions obtained by solving a distributionally robust version of the sample average approximation problem for unconstrained quadratic problems, and derive conditions under which these solutions
are improved in comparison with those of the sample average approximation. We compare different mechanisms for constructing a robust solution: φ-divergence using both total variation and standard smooth φ
functions; a CVaR-based risk measure; and a Wasserstein metric.
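The setting in the abstract can be illustrated with a minimal sketch for a one-dimensional unconstrained quadratic problem. Here the SAA solution of min E[(x − ξ)²] is simply the sample mean, and a CVaR-style robust variant averages the worst α-fraction of scenario costs instead. All names, the sample distribution, and the grid-search approach are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
xi = rng.normal(loc=2.0, scale=1.0, size=30)  # samples of the random parameter

# SAA: minimize (1/N) * sum_i (x - xi_i)^2, whose minimizer is the sample mean.
x_saa = xi.mean()

def cvar_objective(x, xi, alpha=0.3):
    """CVaR-style robust objective (illustrative): average of the worst
    alpha-fraction of the scenario costs f_i(x) = (x - xi_i)^2."""
    costs = (x - xi) ** 2
    k = max(1, int(np.ceil(alpha * len(xi))))
    return np.sort(costs)[-k:].mean()

# Crude grid search, since the robust objective is piecewise quadratic in x.
grid = np.linspace(xi.min(), xi.max(), 2001)
x_dro = grid[np.argmin([cvar_objective(x, xi) for x in grid])]
```

Comparing the out-of-sample cost of `x_saa` and `x_dro` over repeated draws is the kind of experiment the paper analyses; whether the robust solution improves on SAA depends on the problem and the radius of the ambiguity set.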
Date Issued
2021-12-30
Date Acceptance
2021-05-05
Citation
INFORMS Journal on Optimization, 2021, 4 (1), pp.90-124
ISSN
2575-1484
Publisher
INFORMS
Start Page
90
End Page
124
Journal / Book Title
INFORMS Journal on Optimization
Volume
4
Issue
1
Copyright Statement
© 2021, INFORMS
Identifier
https://pubsonline.informs.org/doi/abs/10.1287/ijoo.2021.0061
Publication Status
Published
Date Publish Online
2021-12-30