Abstract
To address feature redundancy in high-dimensional data, an equipartitioned L_(1/2)-regularized sparse representation feature selection method is proposed. The data set is first divided evenly into several subsets according to the number of features, and L_(1/2)-regularized feature selection is performed on each subset with the iterative half thresholding algorithm; the filtered subsets are then aggregated, and L_(1/2)-regularized feature selection is run once more on the combined result. This method selects more representative features while reducing time cost. Experimental results show that it is suitable for both high-dimensional and low-dimensional data.
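The procedure above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names (`half_threshold`, `l_half_select`, `k_part_select`), the step size, the regularization strength, and the iteration count are all assumptions chosen for the example. The thresholding function follows the half-thresholding operator of Xu et al. for L_(1/2) regularization.

```python
import numpy as np

def half_threshold(t, lam):
    # Half-thresholding operator for L_(1/2) regularization (Xu et al.):
    # entries below (54^(1/3)/4) * lam^(2/3) are set to zero, the rest shrunk.
    out = np.zeros_like(t)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    mask = np.abs(t) > thresh
    phi = np.arccos((lam / 8.0) * (np.abs(t[mask]) / 3.0) ** (-1.5))
    out[mask] = (2.0 / 3.0) * t[mask] * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def l_half_select(X, y, lam=0.1, n_iter=200):
    # Iterative half thresholding: gradient step on ||y - X beta||^2,
    # then half-thresholding. Returns indices of nonzero coefficients.
    p = X.shape[1]
    mu = 0.99 / (np.linalg.norm(X, 2) ** 2)  # step size below 1/||X||_2^2
    beta = np.zeros(p)
    for _ in range(n_iter):
        beta = half_threshold(beta + mu * X.T @ (y - X @ beta), lam * mu)
    return np.flatnonzero(beta)

def k_part_select(X, y, k=4, lam=0.1):
    # Split the features evenly into k parts, filter each part with
    # L_(1/2) selection, then reselect on the aggregated survivors.
    p = X.shape[1]
    kept = []
    for part in np.array_split(np.arange(p), k):
        sel = l_half_select(X[:, part], y, lam=lam)
        kept.extend(part[sel])                 # map back to original indices
    if len(kept) == 0:
        return np.array([], dtype=int)
    kept = np.array(sorted(kept), dtype=int)
    final = l_half_select(X[:, kept], y, lam=lam)  # second pass on the union
    return kept[final]
```

The per-part pass only needs to solve k small subproblems, which is where the time saving on high-dimensional data comes from; the second pass over the aggregated survivors then removes features that looked relevant only within their own partition.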