Equational L_(1/2) Regularization Sparse Representation Feature Selection Method (均分式L_(1/2)正则化稀疏表示特征选择方法)
  • English title: Equational L_(1/2) regularization sparse representation feature selection
  • Authors: ZHANG Xiao-peng; JIANG Ai-lian
  • Affiliation: School of Computer Science and Technology, Taiyuan University of Technology
  • Keywords: sparse representation; L_(1/2) regularization; feature selection; equational L_(1/2) regularization; high dimension
  • Journal: Computer Engineering and Design (计算机工程与设计); CNKI journal code: SJSJ
  • Publication date: 2019-06-16
  • Year: 2019
  • Volume/Issue: v.40, No.390 (Issue 06)
  • Funding: Shanxi Scholarship Council research project for returned overseas scholars (2017-051)
  • Language: Chinese
  • Pages: 128-132 (5 pages)
  • CN: 11-1775/TP
  • Record ID: SJSJ201906022
Abstract
Aiming at the feature-redundancy problem that arises in high-dimensional data, an equational (evenly partitioned) L_(1/2) regularization sparse representation feature selection method is proposed. The high-dimensional data set is divided evenly into several subsets according to the number of features; L_(1/2) regularized feature selection is computed on each feature subset using the iterative half thresholding algorithm; the filtered subsets are then aggregated, and the L_(1/2) regularization feature selection algorithm is run once more on the aggregated set. The method selects more representative features while reducing time cost. Experimental results show that it is applicable to both high-dimensional and low-dimensional data.
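The procedure described in the abstract — split the feature set into equal parts, run L_(1/2) regularized selection on each part with the iterative half thresholding algorithm of Xu et al. (2012), then aggregate the surviving features and select again — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: the function names, the step size choice, and the regularization parameter `lam` are assumptions for the sketch.

```python
import numpy as np

def half_threshold(t, lam):
    """Componentwise half thresholding operator (Xu et al., 2012)
    for L_{1/2} regularization: zero below the threshold, a cosine
    shrinkage formula above it."""
    out = np.zeros_like(t)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    big = np.abs(t) > thresh
    tb = t[big]
    phi = np.arccos((lam / 8.0) * (np.abs(tb) / 3.0) ** (-1.5))
    out[big] = (2.0 / 3.0) * tb * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def iht_l_half(X, y, lam=1.0, n_iter=500):
    """Iterative half thresholding for min ||y - Xw||^2 + lam * ||w||_{1/2}^{1/2}:
    a gradient step on the least-squares term followed by half thresholding."""
    n, d = X.shape
    mu = 0.99 / np.linalg.norm(X, 2) ** 2     # step size below 1 / ||X||_2^2
    w = np.zeros(d)
    for _ in range(n_iter):
        w = half_threshold(w + mu * X.T @ (y - X @ w), lam * mu)
    return w

def equational_select(X, y, k=3, lam=1.0):
    """Equational (evenly split) L_{1/2} feature selection: partition the
    feature indices into k equal parts, select per part, aggregate the
    survivors, and run the selection once more on the aggregate."""
    d = X.shape[1]
    parts = np.array_split(np.arange(d), k)
    kept = []
    for idx in parts:                          # per-subset L_{1/2} selection
        w = iht_l_half(X[:, idx], y, lam)
        kept.extend(idx[np.abs(w) > 0])
    kept = np.array(sorted(kept), dtype=int)   # aggregated filtered features
    w_final = iht_l_half(X[:, kept], y, lam)   # final selection pass
    return kept[np.abs(w_final) > 0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 30))         # 200 samples, 30 features
    w_true = np.zeros(30)
    w_true[[2, 7, 15]] = [5.0, -4.0, 3.0]      # only 3 informative features
    y = X @ w_true                              # noiseless synthetic target
    print(equational_select(X, y, k=3))
```

On noiseless synthetic data like the demo above, the final pass typically recovers the informative features; in practice `lam` trades sparsity against fit and would be tuned per data set.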
