Practical inexact proximal quasi-Newton method with global complexity analysis
Abstract
Several methods have recently been proposed for sparse optimization that make careful use of second-order information (Hsieh et al. in Sparse inverse covariance matrix estimation using quadratic approximation. In: NIPS, 2011; Yuan et al. in An improved GLMNET for l1-regularized logistic regression and support vector machines. National Taiwan University, Taipei City, 2011; Olsen et al. in Newton-like methods for sparse inverse covariance estimation. In: NIPS, 2012; Byrd et al. in A family of second-order methods for convex l1-regularized optimization. Technical report, 2012) to improve local convergence rates. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation with a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here we propose a general framework that includes slightly modified versions of the existing algorithms as well as a new algorithm, which uses limited-memory BFGS Hessian approximations, and we provide a novel global convergence rate analysis that covers methods solving subproblems via coordinate descent.
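The three ingredients named in the abstract (a composite quadratic model built from Hessian information, an inexact inner solve by coordinate descent, and a sufficient-descent line search) can be sketched as follows for an l1-regularized problem min f(x) + λ‖x‖₁. This is a minimal illustration, not the paper's algorithm: the Hessian approximation here is a plain identity placeholder (the paper uses limited-memory BFGS), the coordinate-descent sweep count and line-search constants are arbitrary choices, and all function names are invented for this sketch.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*|.|: shrink v toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def solve_subproblem_cd(x, g, H, lam, sweeps=10):
    """Inexactly minimize the composite quadratic model
        q(d) = g@d + 0.5*d@H@d + lam*||x + d||_1
    by cyclic coordinate descent (the inexact inner solve)."""
    n = x.size
    d = np.zeros(n)
    Hd = np.zeros(n)  # maintain H @ d incrementally
    for _ in range(sweeps):
        for i in range(n):
            a = H[i, i]
            b = g[i] + Hd[i] - a * d[i]  # gradient of smooth part w.r.t. d_i, minus the a*d_i term
            u = soft_threshold((a * x[i] - b) / a, lam / a)  # closed-form 1-D minimizer for x_i + d_i
            new_di = u - x[i]
            Hd += H[:, i] * (new_di - d[i])
            d[i] = new_di
    return d

def prox_quasi_newton(f, grad, x0, lam, iters=50, sigma=1e-4):
    """Sketch of the framework: quadratic model + CD subproblem + backtracking
    line search ensuring sufficient descent on F(x) = f(x) + lam*||x||_1."""
    x = x0.copy()
    H = np.eye(x.size)  # placeholder Hessian approximation (paper: limited-memory BFGS)
    for _ in range(iters):
        g = grad(x)
        d = solve_subproblem_cd(x, g, H, lam)
        # predicted decrease of the composite model (non-positive)
        delta = g @ d + lam * (np.abs(x + d).sum() - np.abs(x).sum())
        F = f(x) + lam * np.abs(x).sum()
        t = 1.0
        while f(x + t * d) + lam * np.abs(x + t * d).sum() > F + sigma * t * delta:
            t *= 0.5
            if t < 1e-10:
                break
        x = x + t * d
    return x
```

With an identity Hessian this reduces to a proximal gradient step; the point of the framework is that the same outer loop and line search apply unchanged when H comes from a quasi-Newton update, with the coordinate-descent solve remaining inexact.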