Volume 14, Issue 2
Discrete Approximation Scheme in Distributionally Robust Optimization

Yongchao Liu, Xiaoming Yuan & Jin Zhang

Numer. Math. Theor. Meth. Appl., 14 (2021), pp. 285-320.

Published online: 2021-01

  • Abstract

Discrete approximation, the prevailing scheme in stochastic programming over the past decade, has recently been extended to distributionally robust optimization (DRO). In this paper, we conduct a rigorous quantitative stability analysis of discrete approximation schemes for DRO, measuring the approximation error in terms of the discretization sample size. For the ambiguity set defined through equality and inequality moment conditions, we quantify the discrepancy between the discretized ambiguity sets and the original set with respect to the Wasserstein metric. To establish quantitative convergence, we develop a Hoffman error bound theory, with criteria for calculating the Hoffman constant, in an infinite-dimensional space; this can be regarded as a byproduct of independent interest. For the ambiguity set defined by a Wasserstein ball, and for moment conditions combined with a Wasserstein ball, we present a similar quantitative stability analysis by taking full advantage of the convexity inherent in the Wasserstein metric. We also design efficient numerical methods for solving discrete approximation DRO problems with thousands of samples. In particular, we reformulate the different types of discrete approximation problems as a class of saddle point problems with completely separable structures. The stochastic primal-dual hybrid gradient (PDHG) algorithm, which in each iteration updates a random subset of the sampled variables, is then well suited to solving the reformulated saddle point problems. Some preliminary numerical tests are reported.
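The stochastic PDHG iteration described above can be sketched on a toy problem. The snippet below is a minimal, illustrative sketch only: it applies a generic stochastic PDHG template to the separable saddle-point problem min_x max_y y^T(Ax − b) − (1/2)‖y‖², whose saddle point recovers the least-squares minimizer of (1/2)‖Ax − b‖². The step sizes, batch size, and 1/p extrapolation are standard textbook choices, not the authors' exact formulation.

```python
import numpy as np

# Illustrative stochastic primal-dual hybrid gradient (SPDHG) sketch,
# assuming the toy separable saddle-point problem
#   min_x max_y  y^T (A x - b) - (1/2)||y||^2.
# Each iteration updates only a random subset of the dual (per-sample)
# variables, mirroring the per-sample separability described in the abstract.

rng = np.random.default_rng(0)
m, n = 200, 20                      # m sampled dual blocks, n primal variables
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ rng.standard_normal(n)      # consistent system: zero residual at optimum

x, y = np.zeros(n), np.zeros(m)
tau = sigma = 0.1                   # primal/dual steps (must satisfy a bound
                                    # involving ||A|| and the sampling probability)
batch = 20
p = batch / m                       # probability a given dual block is sampled

z = A.T @ y                         # running value of A^T y, kept in sync with y
for _ in range(3000):
    S = rng.choice(m, size=batch, replace=False)
    y_old = y[S].copy()
    # dual prox step for g(y) = (1/2)||y||^2, on the sampled blocks only
    y[S] = (y[S] + sigma * (A[S] @ x - b[S])) / (1.0 + sigma)
    dz = A[S].T @ (y[S] - y_old)
    z += dz
    # primal step with 1/p-weighted extrapolation, which keeps the
    # stochastic estimate of the extrapolated dual point unbiased
    x -= tau * (z + dz / p)

residual = np.linalg.norm(A @ x - b)
```

With each iteration touching only `batch` of the `m` dual blocks, the per-iteration cost scales with the batch size rather than the full sample count, which is the point of the stochastic scheme for problems with thousands of samples.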

  • Keywords

Quantitative stability analysis, Hoffman lemma, discrete approximation method, distributionally robust optimization, stochastic primal-dual hybrid gradient.

  • AMS Subject Headings

90C15, 90C46, 90C47

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex

@Article{NMTMA-14-285,
  author  = {Liu, Yongchao and Yuan, Xiaoming and Zhang, Jin},
  title   = {Discrete Approximation Scheme in Distributionally Robust Optimization},
  journal = {Numerical Mathematics: Theory, Methods and Applications},
  year    = {2021},
  volume  = {14},
  number  = {2},
  pages   = {285--320},
  issn    = {2079-7338},
  doi     = {10.4208/nmtma.OA-2020-0125},
  url     = {http://global-sci.org/intro/article_detail/nmtma/18601.html}
}
