[Journal Article]


On the Linear Convergence of a Proximal Gradient Method for a Class of Nonsmooth Convex Minimization Problems

Authors:
Haibin Zhang; Jiaojiao Jiang; Zhi-Quan Luo

Year: 2013

Pages: 163–186
Publisher: Springer Nature


Abstract:

We consider a class of nonsmooth convex optimization problems in which the objective function is the composition of a strongly convex differentiable function with a linear mapping, regularized by the sum of an ℓ1-norm and a group ℓ2-norm of the optimization variables. This class of problems arises naturally from applications in sparse group Lasso, a popular technique for variable selection. An effective approach to solving such problems is the Proximal Gradient Method (PGM). In this paper we prove a local error bound around the optimal solution set for this problem and use it to establish the linear convergence of PGM without assuming strong convexity of the overall objective function.
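To make the model and the iteration described above concrete, the following is a minimal NumPy sketch of a proximal gradient step for a sparse-group-Lasso-type problem, assuming a least-squares loss as the strongly convex function composed with the linear map and using the standard closed-form proximal operator (componentwise soft-thresholding followed by blockwise shrinkage). All names, parameters, and values (lam1, lam2, groups, the step size rule) are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Componentwise soft-thresholding: prox of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def group_shrink(v, tau):
    # Block soft-thresholding: prox of tau * ||.||_2 on one group
    norm = np.linalg.norm(v)
    if norm <= tau:
        return np.zeros_like(v)
    return (1.0 - tau / norm) * v

def prox_sparse_group_lasso(v, groups, lam1, lam2, t):
    # Prox of t * (lam1 * ||x||_1 + lam2 * sum_g ||x_g||_2):
    # soft-threshold first, then shrink each group (closed form).
    x = soft_threshold(v, t * lam1)
    out = np.empty_like(x)
    for g in groups:
        out[g] = group_shrink(x[g], t * lam2)
    return out

def proximal_gradient(A, b, groups, lam1, lam2, n_iter=500):
    # Minimize 0.5 * ||A x - b||^2 + lam1 * ||x||_1 + lam2 * sum_g ||x_g||_2
    # (least-squares loss chosen only to illustrate the composite structure).
    t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = prox_sparse_group_lasso(x - t * grad, groups, lam1, lam2, t)
    return x

# Small usage example with two coordinate groups
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
groups = [np.arange(0, 5), np.arange(5, 10)]
x_hat = proximal_gradient(A, b, groups, lam1=0.1, lam2=0.1)
print(x_hat)
```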



Keywords:

Proximal gradient method; Error bound; Linear convergence; Sparse group Lasso


Journal:
Journal of the Operations Research Society of China
ISSN: 2194-668X
Source: Springer Nature