TY - JOUR
T1 - Convergence Rate of Gradient Descent Method for Multi-Objective Optimization
AU - Zeng, Liaoyuan
AU - Dai, Yuhong
AU - Huang, Yakui
JO - Journal of Computational Mathematics
VL - 37
IS - 5
SP - 689
EP - 703
PY - 2019
DA - 2019/03
DO - http://doi.org/10.4208/jcm.1808-m2017-0214
UR - https://global-sci.org/intro/article_detail/jcm/13041.html
KW - Multi-objective optimization, Gradient descent, Convergence rate.
AB - The convergence rate of the gradient descent method is considered for unconstrained multi-objective optimization problems (MOP). Under standard assumptions, we prove that the gradient descent method with constant step sizes converges sublinearly when the objective functions are convex, and that the rate can be strengthened to linear when the objective functions are strongly convex. The results are also extended to the gradient descent method with the Armijo line search. Hence, the gradient descent method for MOP enjoys the same convergence properties as its counterpart for scalar optimization.
ER -
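
For concreteness, the iteration studied in analyses of this kind can be sketched as follows. This is a minimal illustrative Python sketch, not code from the paper: it assumes the common descent direction is the negative of the minimum-norm convex combination of the objective gradients (the usual steepest-descent subproblem for MOP, written in closed form for two objectives), and the quadratic test objectives, the step size 0.1, and the function names are hypothetical.

import numpy as np

def mo_gradient_descent(grad_f1, grad_f2, x0, step=0.1, max_iter=1000, tol=1e-8):
    """Bi-objective gradient descent with a constant step size (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g1, g2 = grad_f1(x), grad_f2(x)
        diff = g2 - g1
        denom = float(diff @ diff)
        # lam minimizes ||lam*g1 + (1-lam)*g2||^2 over [0, 1]; closed form for two objectives.
        lam = 0.5 if denom < 1e-16 else float(np.clip((g2 @ diff) / denom, 0.0, 1.0))
        v = lam * g1 + (1.0 - lam) * g2   # minimum-norm convex combination of the gradients
        if np.linalg.norm(v) < tol:       # Pareto-critical point: no common descent direction
            break
        x = x - step * v                  # constant-step update along the descent direction -v
    return x

# Hypothetical test problem: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x_star = mo_gradient_descent(lambda x: 2.0 * (x - a), lambda x: 2.0 * (x - b),
                             x0=np.array([2.0, 2.0]), step=0.1)
print(x_star)  # approaches a point on the segment between a and b (the Pareto set)

In the Armijo variant mentioned in the abstract, the constant step would instead be chosen by backtracking until every objective satisfies a sufficient-decrease condition along -v.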