Sparse regularization: a Gauss-Seidel iterative thresholding algorithm
  • English title: Sparse regularized optimization via Gauss-Seidel iterative thresholding
  • Authors: Jinshan Zeng; Tao He; Shikang Ouyang
  • Keywords: sparse regularization; ℓ_q (0 < q < 1) regularization; iterative thresholding algorithm; Gauss-Seidel; Kurdyka-Łojasiewicz inequality
  • Affiliation: School of Computer and Information Engineering, Jiangxi Normal University
  • Journal: Scientia Sinica (Mathematica) (中国科学:数学; CNKI journal code JAXK)
  • Publication date: 2018-07-20
  • Year: 2018; Volume: 48; Issue: 07
  • Pages: 93-110 (18 pages)
  • CNKI record ID: JAXK201807007
  • CN: 11-5836/O1
  • Funding: National Natural Science Foundation of China (Grant Nos. 61603162, 11501440, 61772246 and 61603163); Jiangxi Normal University doctoral start-up fund; graduate innovation project of the Jiangxi Provincial Department of Education (Grant No. YC2016-S171)
  • Language: Chinese
Abstract
        This paper considers a class of sparse regularized optimization problems that are widely studied in machine learning, signal processing, image processing, and many other fields. A typical feature of this class of regularizers is that their induced thresholding functions have jump discontinuities. We develop a Gauss-Seidel based algorithm, called the Gauss-Seidel iterative jumping thresholding algorithm (GSIJT), to solve these problems quickly. We first establish the finite convergence of the support and sign of any sequence generated by GSIJT. This convergence property, together with the restricted Kurdyka-Łojasiewicz (rKL) property, yields the global convergence of GSIJT. Moreover, we give the convergence rate of GSIJT and show that any limit point is a stationary point. A series of numerical experiments verifies the effectiveness of the proposed algorithm. In particular, comparisons with related iterative thresholding algorithms show that the proposed algorithm not only converges faster but also admits a wider range of step sizes.
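The full text is not included in this record, so as context for the abstract, the ℓ_q-regularized least-squares model it most likely refers to (an assumption based on the keywords, with A, b, and λ generic) can be written as:

```latex
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\lVert Ax - b \rVert_2^2
  \;+\; \lambda \sum_{i=1}^{n} \lvert x_i \rvert^{q},
  \qquad 0 < q < 1 .
```

For 0 < q < 1 the proximal (thresholding) operator of |t|^q jumps from a nonzero value to 0 at a critical threshold, which is why the abstract describes the induced operator as a "jumping" thresholding function.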
References
    1) "Global convergence" in this paper means that the sequence generated by the algorithm converges regardless of the choice of initial point.
    2) Asymptotic linear convergence is also called eventual or local linear convergence in many other papers.
    3) g is said to be block multiconvex if, for each i, g is a convex function of x_i when all other coordinate blocks are held fixed.
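Since only the abstract is available here, the following is a minimal sketch of the Gauss-Seidel jumping-thresholding idea it describes, not the paper's actual GSIJT: it uses the ℓ_0 (hard) threshold as the simplest thresholding function with a jump, whereas the paper treats general ℓ_q (0 < q < 1) penalties. The function name, step-size rule, and parameters are all hypothetical.

```python
import numpy as np

def gs_jumping_threshold(A, b, lam, n_iter=100):
    """Sketch of Gauss-Seidel iterative jumping thresholding for
    min_x 0.5*||Ax - b||^2 + lam*||x||_0 (hard-threshold surrogate
    for the l_q penalties of the paper; not the authors' code)."""
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                    # running residual, kept in sync below
    col_sq = (A ** 2).sum(axis=0)    # per-column squared norms
    for _ in range(n_iter):
        for i in range(n):           # Gauss-Seidel sweep: freshest values
            step = 1.0 / col_sq[i]           # coordinate-wise step size
            z = x[i] - step * (A[:, i] @ r)  # gradient step on coord i
            thr = np.sqrt(2.0 * lam * step)  # l_0 jump threshold
            xi_new = z if abs(z) > thr else 0.0  # jumping threshold
            if xi_new != x[i]:
                r += A[:, i] * (xi_new - x[i])   # update residual
                x[i] = xi_new
    return x
```

On a small noiseless sparse-recovery instance (e.g. a 30×10 Gaussian matrix and a 2-sparse signal), the sweeps typically recover the true support in a few iterations; each coordinate update reuses the values just computed in the same sweep, which is the Gauss-Seidel ingredient the title refers to.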
