A splitting augmented Lagrangian method embedding the BB method for solving the sparse logistic regression problem
  • English title: A splitting augmented Lagrangian method embedding in the BB method for solving the sparse logistic problem
  • Authors: LIANG Renli; BAI Yanqin (Department of Mathematics, College of Sciences, Shanghai University)
  • Keywords: sparse logistic regression; splitting augmented Lagrangian method; feature selection
  • Journal: Operations Research Transactions (运筹学学报; journal code YCXX)
  • Affiliation: Department of Mathematics, College of Sciences, Shanghai University
  • Publication date: 2019-05-28
  • Year: 2019
  • Volume: 23; Issue: 2
  • Pages: 90-98 (9 pages)
  • Article ID: YCXX201902008
  • Fund: National Natural Science Foundation of China (No. 11371242)
  • Language: Chinese
  • CN: 31-1732/O1
Abstract

Logistic regression is a classical classification method widely used in data mining, machine learning, and computer vision. This paper studies the logistic regression problem with an l_0-norm constraint. Such problems are widely used for feature selection in classification and are in general NP-hard. To solve this problem, we develop a splitting augmented Lagrangian method that embeds the BB (Barzilai and Borwein) method, denoted SALM-BB. At each iteration, SALM-BB alternately solves an unconstrained convex subproblem and a quadratic subproblem with an l_0-norm constraint. The unconstrained convex subproblem is solved by the BB method, while the exact solution of the l_0-constrained quadratic subproblem is obtained directly via a simple equivalent reformulation. We further prove a convergence theorem for SALM-BB under certain assumptions. Finally, we test the accuracy of SALM-BB on sparse logistic regression problems using both real data from the UCI Machine Learning Repository and simulated data. Numerical results show that SALM-BB outperforms the well-known first-order solver SLEP, achieving lower average logistic loss and a lower misclassification error rate.
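The two building blocks named in the abstract can be sketched generically: a BB (Barzilai-Borwein) step for a smooth convex subproblem built on the logistic loss, and the exact solution of the l_0-constrained quadratic subproblem, which reduces to keeping the k largest-magnitude entries (hard thresholding). This is a minimal sketch under assumed notation; the function names (`logistic_loss_grad`, `bb_step`, `project_l0`), the sparsity level `k`, and the label convention are illustrative assumptions, not the authors' implementation of SALM-BB.

```python
import numpy as np

def logistic_loss_grad(w, X, y):
    """Gradient of the average logistic loss, labels y in {-1, +1}."""
    z = y * (X @ w)
    s = -y / (1.0 + np.exp(z))   # derivative of log(1 + exp(-z)) times y
    return X.T @ s / len(y)

def bb_step(w, w_prev, g, g_prev):
    """BB1 step size alpha = (s^T s) / (s^T d), with s = w - w_prev,
    d = g - g_prev; falls back to 1.0 when the curvature estimate fails."""
    s, d = w - w_prev, g - g_prev
    denom = s @ d
    return (s @ s) / denom if denom > 0 else 1.0

def project_l0(v, k):
    """Exact solution of min ||w - v||^2 s.t. ||w||_0 <= k:
    keep the k largest-magnitude entries of v (hard thresholding)."""
    w = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    w[idx] = v[idx]
    return w
```

For example, `project_l0(np.array([3.0, -1.0, 2.0]), 2)` zeroes the smallest-magnitude entry and returns `[3.0, 0.0, 2.0]`; in a splitting scheme such a projection step would alternate with BB gradient steps on the smooth subproblem.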
References
[1] Maghbouleh A. A logistic regression model for detecting prominences[C]//Proceedings of the Fourth International Conference on Spoken Language. Philadelphia:IEEE, 1996:2443-2445.
    [2] Brzezinski J R, Knafl G J. Logistic regression modeling for context-based classification[C]//Proceedings of the Tenth International Workshop on Database and Expert Systems Applications. Florence:IEEE, 1999:755-759.
    [3] Friedman J, Hastie T, Tibshirani R. Additive logistic regression:a statistical view of boosting[J]. The Annals of Statistics, 2000, 28(2):337-407.
    [4] Asgary M P, Jahandideh S, Abdolmaleki P, et al. Analysis and identification of β-turn types using multinomial logistic regression and artificial neural network[J]. Bioinformatics, 2007, 23(23):3125-3130.
    [5] Liao J G, Chin K V. Logistic regression for disease classification using microarray data:model selection in a large p and small n case[J]. Bioinformatics, 2007, 23(15):1945-1951.
    [6] Sartor M A, Leikauf G D, Medvedovic M, et al. LRpath:a logistic regression approach for identifying enriched biological groups in gene expression data[J]. Bioinformatics, 2009, 25(2):211-217.
    [7] Zhu J, Hastie T. Classification of gene microarrays by penalized logistic regression[J]. Biostatistics, 2004, 5(3):427-443.
    [8] Liang Y, Liu C, Luan X Z, et al. Sparse logistic regression with a l_(1/2)penalty for gene selection in cancer classification[J]. BMC Bioinformatics, 2013, 14(1):1-12.
    [9] Lee S I, Lee H, Abbeel P, et al. Efficient l_1 regularized logistic regression[C]//Proceedings of the National Conference on Artificial Intelligence. Massachusetts:AAAI Press, 2006:401-408.
    [10] Shi J, Yin W, Osher S, et al. A fast hybrid algorithm for large-scale l_1-regularized logistic regression[J]. The Journal of Machine Learning Research, 2008, 1(1):713-741.
    [11] Efron B, Hastie T, Johnstone I, et al. Least angle regression[J]. The Annals of Statistics, 2004, 32(2):407-499.
    [12] Candes E J, Wakin M B, Boyd S P. Enhancing sparsity by reweighted l_1 minimization[J]. Journal of Fourier Analysis and Applications, 2007, 14(5):877-905.
    [13] Bai Y Q, Shen Y J, Shen K J. Consensus proximal support vector machine for classification problems with sparse solutions[J]. Journal of the Operations Research Society of China, 2014, 2(1):57-74.
    [14] Bagarinao E, Kurita T, Higashikubo M, et al. Adapting SVM image classifiers to changes in imaging conditions using incremental SVM:an application to car detection[C]//Proceedings of the Asian Conference on Computer Vision. Berlin:Springer-Verlag, 2009:363-372.
    [15] Kim S J, Koh K, Lustig M, et al. An interior-point method for large-scale l_1-regularized least squares[J]. IEEE Journal of Selected Topics in Signal Processing,2007, 1(4):606-617.
    [16] Liu J, Ji S, Ye J. SLEP:Sparse learning with efficient projections[EB/OL].[2016-04-20].http://www.yelab.net/software/SLEP.
    [17] Bai Y Q, Liang R L, Yang Z W. Splitting augmented Lagrangian method for optimization problems with a cardinality constraint and semicontinuous variables[J]. Optimization Methods and Software, 2016, 31(5):1089-1109.
    [18] Raydan M. The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem[J]. SIAM Journal on Optimization, 1997, 7(1):26-33.
    [19] Birgin E G, Martinez J M, Raydan M, et al. Nonmonotone spectral projected gradient methods on convex sets[J]. SIAM Journal on Optimization, 1999, 10(4):1196-1211.
    [20] Lu Z, Zhang Y. Sparse approximation via penalty decomposition methods[J]. SIAM Journal on Optimization, 2013, 23(4):2448-2478.
    [21] Blake C, Merz C J. UCI Repository of machine learning databases[EB/OL].[2016-04-20].http://www.ics.uci.edu/~mlearn/MLRepository.html
    [22] Ng A Y. Feature selection, l_1 vs. l_2 regularization, and rotational invariance[C]//Proceedings of the International Conference on Machine Learning, 2004, 19(5):379-387.
