A secant-based Nesterov method for convex functions
  • Authors: Razak O. Alli-Oke; William P. Heath
  • Keywords: Convex optimization; Fast gradient methods; Nesterov gradient method
  • Journal: Optimization Letters
  • Publication date: January 2017
  • Volume: 11
  • Issue: 1
  • Pages: 81-105
  • Journal category: Mathematics and Statistics
  • Journal subjects: Optimization; Operation Research/Decision Theory; Computational Intelligence; Numerical and Computational Physics, Simulation
  • Publisher: Springer Berlin Heidelberg
  • ISSN: 1862-4480
Abstract
A simple secant-based fast gradient method is developed for problems whose objective function is convex and well-defined. The proposed algorithm extends the classical Nesterov gradient method by updating the estimate-sequence parameter with secant information whenever possible. This is achieved by imposing a secant condition on the choice of search point. Furthermore, the proposed algorithm embodies an "update rule with reset" that parallels the restart rule recently suggested in O’Donoghue and Candes (Found Comput Math, 2013). The proposed algorithm applies to a large class of problems including logistic and least-squares losses commonly found in the machine learning literature. Numerical results demonstrating the efficiency of the proposed algorithm are analyzed with the aid of performance profiles.
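The abstract does not reproduce the secant-based estimate-sequence update itself, but the baseline it extends — the classical Nesterov accelerated gradient method with the adaptive restart of O’Donoghue and Candes — is standard and can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the function name, the gradient-based restart test, the step size 1/L, and the least-squares example are assumptions for illustration.

```python
import numpy as np

def nesterov_restart(grad, x0, L, n_iter=500):
    """Classical Nesterov accelerated gradient method with the
    gradient-based adaptive restart of O'Donoghue and Candes.
    (Illustrative sketch; NOT the secant-based variant of the paper.)

    grad : callable returning the gradient of the convex objective
    x0   : starting point (1-D numpy array)
    L    : Lipschitz constant of the gradient (step size 1/L)
    """
    x = y = x0.copy()
    theta = 1.0
    for _ in range(n_iter):
        g = grad(y)
        x_next = y - g / L                       # gradient step from the search point
        # Restart test: if the momentum direction opposes the gradient,
        # reset the momentum parameter (O'Donoghue & Candes restart rule).
        if g @ (x_next - x) > 0:
            theta = 1.0
            y = x_next                           # discard momentum this round
        else:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
            beta = (theta - 1) / theta_next      # Nesterov momentum weight
            y = x_next + beta * (x_next - x)     # extrapolated search point
            theta = theta_next
        x = x_next
    return x

# Example on a least-squares loss f(x) = 0.5 * ||A x - b||^2,
# one of the problem classes mentioned in the abstract.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 20))
b = rng.standard_normal(60)
L = np.linalg.norm(A, 2) ** 2                    # largest eigenvalue of A^T A
x_star = nesterov_restart(lambda x: A.T @ (A @ x - b), np.zeros(20), L)
```

The restart test discards the accumulated momentum whenever the next step would move against the current gradient, which is the behavior the abstract's "update rule with reset" is said to parallel; the paper's contribution is to drive the estimate-sequence parameter with secant information instead of the fixed theta recursion above.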
