TY  - JOUR
T1  - Modified Stochastic Extragradient Methods for Stochastic Variational Inequality
AU  - Zhang, Ling
AU  - Xu, Lingling
JO  - Journal of Computational Mathematics
VL  - 2
SP  - 390
EP  - 414
PY  - 2024
DA  - 2024/01
SN  - 42
DO  - http://doi.org/10.4208/jcm.2206-m2021-0195
UR  - https://global-sci.org/intro/article_detail/jcm/22886.html
KW  - Stochastic variational inequality
KW  - Pseudo-monotone
KW  - Modified stochastic extragradient methods
KW  - Adaptive step-size
AB  -
In this paper, we consider two kinds of extragradient methods for solving the pseudo-monotone stochastic variational inequality problem. First, we present a modified stochastic extragradient method with constant step-size (MSEGMC) and prove its convergence. Under strong pseudo-monotonicity of the operator and exponentially growing sample sequences, we establish an $R$-linear convergence rate in terms of the mean natural residual, together with an oracle complexity of $O(1/\epsilon)$. Second, we propose a modified stochastic extragradient method with adaptive step-size (MSEGMA), whose step-size depends neither on the Lipschitz constant nor on any line-search procedure. Finally, we use numerical experiments to verify the effectiveness of the two algorithms.
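To make the abstract's setting concrete, the following is a minimal sketch of a generic stochastic extragradient loop with a constant step-size: extrapolate with one stochastic oracle estimate, then update with a second estimate taken at the extrapolated point. This is only an illustration of the basic scheme, not the paper's MSEGMC/MSEGMA algorithms; the function names (`F_sample`, `project`) and all parameter values are assumptions for the example.

```python
import numpy as np

def stochastic_extragradient(F_sample, project, x0, gamma=0.1,
                             n_iters=300, batch=8, seed=0):
    """Illustrative stochastic extragradient loop (not the paper's method).

    F_sample(x, rng, batch) -- averaged stochastic estimate of the operator F at x
    project(z)              -- Euclidean projection onto the feasible set
    gamma                   -- constant step-size (assumed small enough)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = F_sample(x, rng, batch)        # first oracle call, at the current point
        y = project(x - gamma * g)         # extrapolation step
        g_bar = F_sample(y, rng, batch)    # second oracle call, at the extrapolated point
        x = project(x - gamma * g_bar)     # update step
    return x
```

As a usage example, take the strongly (pseudo-)monotone operator $F(x) = A(x - x^*)$ with $A$ having a positive-definite symmetric part, a noisy mini-batch oracle, and projection onto a box containing $x^*$; the iterates then approach the solution $x^*$ up to a noise floor governed by the step-size and batch size.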