Online optimization for max-norm regularization
  • Authors: Jie Shen; Huan Xu; Ping Li
  • Keywords: Low-rank matrix; Max-norm; Stochastic optimization; Matrix factorization
  • Journal: Machine Learning
  • Year: 2017
  • Published: March 2017
  • Volume: 106
  • Issue: 3
  • Pages: 419-457
  • Journal category: Computer Science
  • Journal subjects: Artificial Intelligence (incl. Robotics); Control, Robotics, Mechatronics; Computing Methodologies; Simulation and Modeling; Language Translation and Linguistics
  • Publisher: Springer US
  • ISSN: 1573-0565
Abstract
The max-norm regularizer has been extensively studied in the last decade, as it promotes an effective low-rank estimation for the underlying data. However, such max-norm regularized problems are typically formulated and solved in a batch manner, which prevents them from processing big data due to a possible memory bottleneck. In this paper, we therefore propose an online algorithm that scales to large problems. In particular, we consider the matrix decomposition problem as an example, although a simple variant of the algorithm and analysis can be adapted to other important problems such as matrix completion. The crucial technique in our implementation is to reformulate the max-norm as an equivalent matrix factorization, whose factors consist of a (possibly overcomplete) basis component and a coefficient component. In this way, we can maintain the basis component in memory and alternately optimize over it and the coefficients of each sample. Since the size of the basis component is independent of the sample size, our algorithm is appealing for processing a large collection of samples. We prove that the sequence of solutions (i.e., the basis component) produced by our algorithm converges asymptotically to a stationary point of the expected loss function. Numerical studies demonstrate encouraging results for the robustness of our algorithm compared to widely used nuclear norm solvers.
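The sketch below illustrates the kind of online alternating scheme the abstract describes: a squared reconstruction loss with an l1 term for sparse corruption, the max-norm handled through its factorization form by bounding the row norms of the basis, and one projected-gradient step on an accumulated surrogate per streamed sample. It is a minimal illustration under these assumptions, not the authors' implementation; the function names, step size, and parameters (lam_e, radius, lr) are illustrative.

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fit_sample(L, z, lam_e, n_iters=50):
    """Given the current basis L (p x d), fit the coefficient r and sparse error e
    for one sample z by alternating minimization of
        0.5 * ||z - L r - e||_2^2 + lam_e * ||e||_1."""
    p, d = L.shape
    r, e = np.zeros(d), np.zeros(p)
    LtL = L.T @ L + 1e-8 * np.eye(d)              # small ridge for stability
    for _ in range(n_iters):
        r = np.linalg.solve(LtL, L.T @ (z - e))   # exact least-squares step in r
        e = soft_threshold(z - L @ r, lam_e)      # exact prox step in e
    return r, e

def project_rows(L, radius):
    """Project each row of L onto the l2 ball of the given radius, keeping
    ||L||_{2,inf} bounded -- the constraint that encodes the max-norm factorization."""
    norms = np.linalg.norm(L, axis=1, keepdims=True)
    return L * np.minimum(1.0, radius / np.maximum(norms, 1e-12))

def online_maxnorm(samples, d, lam_e=0.1, radius=1.0, lr=0.1):
    """Stream samples (vectors of length p), keeping only the basis L and small
    sufficient statistics in memory -- their size does not grow with the sample count."""
    p = samples[0].shape[0]
    rng = np.random.default_rng(0)
    L = project_rows(rng.standard_normal((p, d)), radius)
    A = np.zeros((d, d))   # accumulates r r^T
    B = np.zeros((p, d))   # accumulates (z - e) r^T
    for t, z in enumerate(samples, start=1):
        r, e = fit_sample(L, z, lam_e)
        A += np.outer(r, r)
        B += np.outer(z - e, r)
        # one projected-gradient step on the surrogate (1/2) tr(L^T L A) - tr(L^T B)
        grad = (L @ A - B) / t
        L = project_rows(L - lr * grad, radius)
    return L
```

Only the p x d basis and the d x d and p x d accumulators are kept in memory, so the per-sample cost and storage are independent of the number of samples, which is the property the abstract highlights for the online setting.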
