Research on Cost-Sensitive Dimensionality Reduction and Its Application to Face Recognition
Abstract
Conventional dimensionality reduction algorithms pursue the lowest recognition error rate, implicitly assuming that all misclassifications incur the same loss. In some real-world applications this assumption does not hold. For example, a face-recognition-based access control system distinguishes impostors from gallery (enrolled) persons: misclassifying an impostor as a gallery person typically costs more than misclassifying a gallery person as an impostor, which in turn costs more than misclassifying a gallery person as another gallery person. Motivated by this, this thesis studies cost-sensitive dimensionality reduction. The main contributions are as follows:
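The cost asymmetry can be made concrete with a toy cost matrix and a minimum-risk decision rule (the class layout and cost values below are illustrative, not taken from the thesis):

```python
import numpy as np

# Illustrative cost matrix for an access-control system with one impostor
# class (index 0) and two gallery classes (indices 1, 2).
# cost[i, j] = loss incurred by predicting class j when the truth is class i.
cost = np.array([
    [0.0, 10.0, 10.0],  # accepting an impostor: largest loss
    [2.0,  0.0,  1.0],  # rejecting a gallery person > confusing two gallery persons
    [2.0,  1.0,  0.0],
])

def bayes_decision(posterior, cost):
    """Pick the class with minimal expected loss given posteriors p(i | x)."""
    return int((posterior @ cost).argmin())

# The cost-sensitive decision can differ from the max-posterior decision:
p = np.array([0.2, 0.5, 0.3])
print(p.argmax(), bayes_decision(p, cost))  # max-posterior picks 1, min-risk picks 0
```

Under these costs, even a moderately probable impostor hypothesis forces a rejection, which is exactly the behavior a cost-blind minimum-error rule cannot express.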
     1. A method called Weighted Cost-Sensitive Locality Preserving Projection (WCSLPP) is proposed. Traditional Locality Preserving Projection (LPP) minimizes the recognition error rate, and its projection directions are distorted by class imbalance. This thesis therefore embeds misclassification costs into the LPP model and defines a WCSLPP model that satisfies the minimal-misclassification-loss criterion. In addition, to handle class imbalance, WCSLPP adopts a weighting strategy that balances the contribution of each class to the projection directions. Experimental results on face datasets demonstrate the effectiveness of WCSLPP.
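A minimal sketch of the idea behind WCSLPP — an LPP affinity graph whose edge weights are rescaled by misclassification costs and inverse class frequencies before solving the usual generalized eigenproblem — might look like the following. The scaling rule here is a hypothetical choice for illustration; the thesis's exact objective differs.

```python
import numpy as np
from scipy.linalg import eigh

def wcslpp_sketch(X, y, cost, n_components=2, sigma=1.0):
    """Cost- and frequency-weighted LPP (illustrative, not the thesis's exact model).
    X: (n, d) samples; y: (n,) integer labels; cost[i, j]: misclassification cost."""
    n, d = X.shape
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))                    # heat-kernel affinity
    freq = np.bincount(y).astype(float)
    # Scale each edge by the cost of confusing its endpoint classes, and
    # down-weight edges touching over-represented classes.
    scale = (1.0 + cost[y][:, y]) / np.sqrt(freq[y][:, None] * freq[y][None, :])
    W = W * (scale + scale.T) / 2                         # keep W symmetric
    D = np.diag(W.sum(1))
    L = D - W                                             # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(d)                    # regularized constraint matrix
    vals, vecs = eigh(A, B)                               # smallest eigenvalues first
    return vecs[:, :n_components]                         # (d, n_components) projection
```

As in plain LPP, the projection directions are the bottom eigenvectors of the generalized problem X L Xᵀ a = λ X D Xᵀ a; only the graph weights change.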
     2. An algorithm named Pairwise Costs in Linear Discriminant Analysis (PCLDA) is proposed. By embedding a weighting function into Linear Discriminant Analysis (LDA), PCLDA not only approximates the pairwise Bayes risk criterion but also effectively suppresses the influence of outlier classes on the projection directions. Furthermore, to account for differences in class distribution density across the dataset, PCLDA defines an importance function that balances the contribution of each class to the projection directions. Experimental results on face datasets demonstrate the effectiveness of PCLDA.
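The core construction — a between-class scatter in which every class pair is weighted by a function of its misclassification cost and the distance between its means — can be sketched as follows. The weighting function `omega` below is a hypothetical choice; the thesis defines its own weighting and importance functions.

```python
import numpy as np

def pairwise_weighted_scatter(X, y, cost, omega):
    """Between-class scatter with per-pair weights (illustrative PCLDA-style
    construction). omega(dist, c) maps class-mean distance and pairwise cost
    to a weight; costly and easily confused pairs should dominate."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(0) for c in classes])
    prior = np.array([(y == c).mean() for c in classes])
    d = X.shape[1]
    Sb = np.zeros((d, d))
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            diff = (mu[i] - mu[j])[:, None]
            dist = float(np.linalg.norm(diff))
            w = omega(dist, cost[classes[i], classes[j]])
            Sb += prior[i] * prior[j] * w * (diff @ diff.T)
    return Sb

# Hypothetical weight: emphasize costly, close (confusable) class pairs.
omega = lambda dist, c: c / (dist ** 2 + 1e-12)
```

The within-class scatter is unchanged, so projection directions still come from the usual generalized eigenproblem on the (weighted) between- and within-class scatter pair.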
     3. An approach called Pairwise Costs in Subclass Discriminant Analysis (PCSCDA) is proposed. By analyzing the face-recognition-based access control system, this thesis casts it as a cost-sensitive subclass learning problem, injects both misclassification costs and clustering (subclass) information into the discriminant analysis framework, and derives the PCSCDA algorithm, which approximates the pairwise Bayes risk criterion. Experimental results on face datasets demonstrate the effectiveness of PCSCDA.
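The subclass ingredient can be sketched as a per-class k-means whose centers then stand in for class means inside the discriminant criterion. A plain k-means is used here purely for illustration; the thesis's clustering step and cost weighting may differ.

```python
import numpy as np

def subclass_means(X, y, n_subclasses=2, iters=20, seed=0):
    """Split each class into subclasses with a plain k-means and return
    (class_label, subclass_mean) pairs -- the quantities a subclass
    discriminant criterion is built from (illustrative sketch only)."""
    rng = np.random.RandomState(seed)
    out = []
    for c in np.unique(y):
        Xc = X[y == c]
        centers = Xc[rng.choice(len(Xc), n_subclasses, replace=False)].astype(float)
        for _ in range(iters):
            # assign each sample to its nearest center, then recompute centers
            assign = ((Xc[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
            for k in range(n_subclasses):
                if (assign == k).any():
                    centers[k] = Xc[assign == k].mean(0)
        out.extend((c, centers[k].copy()) for k in range(n_subclasses))
    return out
```

Pairwise costs between subclass pairs of different classes can then be attached exactly as in the pairwise-weighted scatter of PCLDA.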
     4. A method named Pairwise Costs in Semi-Supervised Discriminant Analysis (PCSDA) is proposed. In real-world face recognition applications, unlabeled data are abundant while labeled data are hard to obtain. To exploit the information in unlabeled face images, PCSDA uses an l2-based approach to predict their labels; compared with existing label propagation strategies, it attains higher prediction accuracy at lower time complexity. On this basis, a weighting function is introduced into the LDA model to obtain an objective that satisfies the pairwise Bayes risk criterion, improving the discriminative power of the projection directions. Experimental results on face datasets demonstrate the effectiveness of PCSDA.
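The thesis's l2-based label predictor is defined in the text itself; as a point of comparison, the classic graph label propagation baseline (Zhou et al.'s local-and-global consistency) that such schemes are measured against looks like this:

```python
import numpy as np

def label_propagation(X, y_partial, sigma=1.0, alpha=0.9, iters=50):
    """Standard graph label propagation, shown as a generic baseline; the
    thesis's l2-based prediction is a different, cheaper scheme.
    y_partial: (n,) integer labels, with -1 marking unlabeled samples."""
    n = len(y_partial)
    classes = np.unique(y_partial[y_partial >= 0])
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0)
    Dinv = 1.0 / np.sqrt(W.sum(1))
    S = Dinv[:, None] * W * Dinv[None, :]       # symmetrically normalized affinity
    Y = np.zeros((n, len(classes)))
    for k, c in enumerate(classes):
        Y[y_partial == c, k] = 1
    F = Y.copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y     # propagate, softly clamp the labeled points
    return classes[F.argmax(1)]
```

Each iteration costs O(n²) per class through the dense affinity matrix, which is the kind of overhead the thesis's cheaper predictor is meant to avoid.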
     5. An algorithm called Sample-Dependent Cost-Sensitive Semi-Supervised Laplacian Support Vector Machine (SCS-LapSVM) is proposed. Real-world problems may be cost-sensitive, and their datasets may exhibit class imbalance, contain many unlabeled samples, and include noisy samples. To address this, building on a label propagation strategy for the unlabeled samples, SCS-LapSVM embeds imbalance-aware misclassification costs into both the hinge loss and the Laplacian regularization term of the Laplacian support vector machine. Further, considering the influence of noisy samples on the decision boundary, SCS-LapSVM defines a sample-dependent cost that assigns lower weights to noisy samples. Experimental results on UCI datasets and NASA software defect datasets demonstrate the effectiveness of SCS-LapSVM.
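The cost-weighting of the empirical loss can be illustrated on a plain linear SVM trained by subgradient descent. This shows only the per-sample cost idea; SCS-LapSVM additionally carries the costs into a Laplacian regularizer and uses label propagation for the unlabeled samples.

```python
import numpy as np

def weighted_hinge_svm(X, y, sample_cost, lam=0.1, lr=0.01, epochs=200):
    """Linear SVM with per-sample costs in the hinge loss (illustrative
    subgradient descent, not the thesis's solver).
    y in {-1, +1}; sample_cost[i] scales sample i's loss: high for the
    minority/expensive class, low for suspected noise."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1                      # samples violating the margin
        grad_w = lam * w - (sample_cost[active, None] * y[active, None] * X[active]).sum(0) / n
        grad_b = -(sample_cost[active] * y[active]).sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Raising `sample_cost` on minority-class points counters class imbalance, while lowering it on suspected noise keeps outliers from dragging the decision boundary, which is the role of the sample-dependent cost in SCS-LapSVM.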
