TY - JOUR

T1 - New error bounds and their applications to convergence analysis of iterative algorithms

AU - Luo, Zhi Quan

PY - 2000/8

Y1 - 2000/8

N2 - We present two new error bounds for optimization problems over a convex set whose objective function f is either semianalytic or γ-strictly convex, with γ ≥ 1. We then apply these error bounds to analyze the rate of convergence of a wide class of iterative descent algorithms for the aforementioned optimization problem. Our analysis shows that the function sequence {f(x^k)} converges at least at the sublinear rate of k^(-ε) for some positive constant ε, where k is the iteration index. Moreover, the distances from the iterate sequence {x^k} to the set of stationary points of the optimization problem converge to zero at least sublinearly.

AB - We present two new error bounds for optimization problems over a convex set whose objective function f is either semianalytic or γ-strictly convex, with γ ≥ 1. We then apply these error bounds to analyze the rate of convergence of a wide class of iterative descent algorithms for the aforementioned optimization problem. Our analysis shows that the function sequence {f(x^k)} converges at least at the sublinear rate of k^(-ε) for some positive constant ε, where k is the iteration index. Moreover, the distances from the iterate sequence {x^k} to the set of stationary points of the optimization problem converge to zero at least sublinearly.

UR - http://www.scopus.com/inward/record.url?scp=0141603517&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0141603517&partnerID=8YFLogxK

U2 - 10.1007/s101070050020

DO - 10.1007/s101070050020

M3 - Article

AN - SCOPUS:0141603517

SN - 0025-5610

VL - 88

SP - 341

EP - 355

JO - Mathematical Programming, Series B

JF - Mathematical Programming, Series B

IS - 2

ER -