Research on Classification Methods Based on Clustering and Manifold Regularization
Abstract
Classifier design has long been one of the central research topics in pattern recognition. Over the past decade, deepening research on statistical learning and kernel function theory has produced many new methods, which largely resolve problems such as local optima, overfitting, and the curse of dimensionality in pattern classification. Building on kernel classification methods typified by the support vector machine, several new research directions have since emerged, usually targeting weaknesses of traditional pattern classification methods: classification of massive high-dimensional data, classification under class overlap and noise interference, multi-label classification, classification of class-imbalanced data, kernel function (matrix) optimization in nonlinear classification, fast nonlinear classification, and so on. Against this background, this thesis studies fast robust clustering algorithms, classification of imbalanced samples, kernel optimization, and fast semi-supervised classification based on manifold regularization, and proposes new methods for class imbalance, kernel optimization, and fast classification.
     The main research work of this thesis covers the following four aspects:
     (1) To address sample overlap and noise interference in real applications, a sample-weighted possibilistic fuzzy clustering algorithm and a robust possibilistic fuzzy kernel clustering algorithm are proposed. The first targets approximately linearly separable problems: by assigning small weights to outliers and noise points, it narrows the convergence range of the typicality values and thereby reduces their influence on clustering. On the basis of a convergence analysis, the algorithm is proved to converge faster than the traditional IPCM (Improved Possibilistic C-Means) algorithm, effectively lowering the time complexity while retaining good clustering accuracy. The second targets linearly non-separable problems; in addition, a kernel parameter optimization method is proposed to select kernel parameters in the unsupervised setting. The resulting algorithm handles both linearly non-separable and partially overlapping data sets, is more robust, and maintains good clustering accuracy under noise. One plausible form of the weighted objective is sketched below.
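     For concreteness, the following is a plausible form of the sample-weighted objective, assuming it extends the standard PFCM objective of Pal et al. with per-sample weights w_k; the exact formulation used in the thesis may differ:

```latex
\min_{U,T,V}\; J(U,T,V)=\sum_{i=1}^{c}\sum_{k=1}^{n} w_k\bigl(a\,u_{ik}^{m}+b\,t_{ik}^{\eta}\bigr)\lVert x_k-v_i\rVert^{2}
+\sum_{i=1}^{c}\gamma_{i}\sum_{k=1}^{n} w_k\bigl(1-t_{ik}\bigr)^{\eta}
```

Here u_ik are fuzzy memberships and t_ik are typicality values; a small w_k shrinks both terms for sample x_k, which is the mechanism by which the weighting limits the convergence range of the typical values.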
     (2) To address classification with imbalanced positive and negative class distributions, a Possibilistic Fuzzy Support Vector Machine (PFSVM) model is built on the two robust clustering algorithms, yielding an imbalanced-data classification method based on possibilistic fuzzy clustering. The resulting classifier handles class imbalance, outliers, and noise: the robust clustering algorithms assign fuzzy memberships and typicality values to the training samples, reducing the impact of outliers and noise on the classification accuracy and generalization ability of the SVM.
     (3) To address the low efficiency of multiple kernel learning and its need for a predefined set of kernel functions, an unsupervised non-parametric kernel learning model is established, which extends readily to supervised learning, and a non-parametric kernel learning classification method is proposed. By relaxing the multiple kernel learning optimization problem, the method converts it into a sequence of sparse eigen-decomposition subproblems, each iteration requiring only a closed-form computation, which improves the performance and efficiency of kernel learning. The model couples spectral kernel learning with the margin maximization criterion, exploiting the low-dimensional manifold structure of the data, enhancing the smoothness of the decision function, and making effective use of unlabeled data for maximum margin classification. Experiments confirm the effectiveness of non-parametric kernel learning: in both supervised and unsupervised settings the proposed method outperforms multiple kernel learning.
     (4) To enable fast semi-supervised learning, an Extended Manifold Regularization framework (E-MR) is established, together with a generalized representer theorem for the decision function and theorems relating the single-output and multi-output extreme learning machines to the manifold regularization framework. These theorems provide the theoretical basis for the proposed fast semi-supervised classification models and algorithms, showing that the Manifold Regularized Extreme Learning Machine (MRELM) model is a special case of the E-MR framework whose essence is a random discretization of the kernel function; the proposed algorithm is therefore an approximation of traditional kernel classification. MRELM inherits the ELM advantage of requiring no model parameter tuning and provides a unified analytical solution for different learning tasks. Experimental results verify the effectiveness of the MRELM algorithm.
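     A sketch of the objective underlying this framework, assuming E-MR extends the standard manifold regularization functional of Belkin, Niyogi, and Sindhwani; the extended framework in the thesis may add terms beyond this:

```latex
f^{*}=\arg\min_{f\in\mathcal{H}_{K}}\;\frac{1}{l}\sum_{i=1}^{l}V\bigl(x_i,y_i,f\bigr)
+\gamma_{A}\lVert f\rVert_{K}^{2}+\gamma_{I}\,\mathbf{f}^{\top}L\,\mathbf{f}
```

where V is a loss on the l labeled samples and L is the graph Laplacian over labeled and unlabeled data. MRELM restricts f to f(x) = h(x)β with a random feature map h, i.e., a random discretization of the kernel.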
     This thesis thus covers three strands: classification methods for imbalanced data, classification methods based on non-parametric kernel optimization, and fast semi-supervised classification. Building on related prior work, several classification and learning models are established, new learning algorithms are proposed, and the algorithms are tested on benchmark data sets and several face data sets. Comparisons with related algorithms further verify the effectiveness of the proposed methods. The results enrich the available approaches to classification problems and have theoretical significance as well as good application prospects.
In the past decade, classifier design has remained one of the important research topics in the field of pattern recognition; rapid progress has been made in both theory and application, and many new approaches have emerged. These new topics typically address the disadvantages of traditional pattern classification methods, such as classification of massive high-dimensional data, classification under noise interference and class overlap, multi-label data classification, classification of class-imbalanced data sets, optimization of kernel matrices in nonlinear classification, fast nonlinear classification, etc. Against this background, this thesis studies fast robust clustering, classification of class-imbalanced samples, kernel optimization, and fast semi-supervised classification based on manifold regularization. Several new models and methods are presented to solve the problems of class imbalance, kernel optimization, and fast classification.
     The studies cover the following four aspects:
     (1) To address class overlap and noise interference, a novel possibilistic fuzzy clustering algorithm based on sample weighting is proposed. In this method, outliers and noise points receive smaller weights, which limits the convergence range of the typicality values and thus reduces their contribution to the clustering process. We prove that its convergence is faster than that of the IPCM (Improved Possibilistic C-Means) algorithm; it not only reduces the time complexity effectively but also achieves good clustering accuracy. To handle the linearly non-separable case, a robust possibilistic fuzzy kernel clustering algorithm is also proposed, which can handle linearly non-separable and class-overlapping data sets while effectively resisting noise interference.
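     A minimal sketch of one sample-weighted possibilistic fuzzy iteration, assuming a PFCM-style update with per-sample weights w; the function and its update rules are illustrative, not the thesis's exact algorithm:

```python
import numpy as np

def weighted_pfcm(X, c, w, m=2.0, eta=2.0, a=1.0, b=1.0, n_iter=100, tol=1e-6):
    """Sample-weighted possibilistic fuzzy c-means (hedged sketch).

    X: (n, d) data; c: number of clusters; w: (n,) per-sample weights,
    small for suspected outliers/noise points.
    """
    n, _ = X.shape
    rng = np.random.default_rng(0)
    V = X[rng.choice(n, c, replace=False)]                 # initial centers
    for _ in range(n_iter):
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) ** 2 + 1e-12
        U = D ** (-1.0 / (m - 1.0))
        U /= U.sum(axis=0, keepdims=True)                  # fuzzy memberships (c, n)
        gamma = (U ** m * D).sum(axis=1) / (U ** m).sum(axis=1)
        T = 1.0 / (1.0 + (b * D / gamma[:, None]) ** (1.0 / (eta - 1.0)))  # typicalities
        coef = (a * U ** m + b * T ** eta) * w[None, :]    # weights damp outliers
        V_new = coef @ X / coef.sum(axis=1, keepdims=True) # weighted center update
        if np.linalg.norm(V_new - V) < tol:
            return V_new, U, T
        V = V_new
    return V, U, T
```

Samples with w_k near zero barely move the centers, which is the intended source of robustness.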
     (2) Existing class imbalance learning methods can decrease the sensitivity of SVM to imbalanced classes, but they still suffer from noise and outliers. We propose a new class imbalance method based on fuzzy memberships and typicality values. Since the improved clustering algorithm is robust to noise, the method is not only effective on class-imbalanced data sets but also robust to noise and outliers. Experimental results on artificial and real data sets show that the proposed method is effective for the class imbalance problem, especially for imbalanced data sets containing noise and outliers.
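     A hedged sketch of the idea behind PFSVM, expressed with scikit-learn's SVC: the fuzzy membership and typicality produced by the robust clustering are combined into a per-sample weight, so that noise points and outliers contribute less to the margin. The combination rule and parameter names here are assumptions, not the thesis's exact model:

```python
import numpy as np
from sklearn.svm import SVC

def pfsvm_fit(X, y, u, t, alpha=0.5, C=1.0):
    """u: fuzzy membership, t: typicality of each training sample."""
    s = alpha * u + (1.0 - alpha) * t          # small s for outliers/noise points
    clf = SVC(C=C, kernel="rbf", gamma="scale",
              class_weight="balanced")         # counteracts class imbalance
    clf.fit(X, y, sample_weight=s)             # down-weights unreliable samples
    return clf
```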
     (3) Most research on non-parametric kernel learning (NPKL) has focused on the semi-supervised scenario. We propose a novel unsupervised non-parametric kernel learning method that seamlessly combines the spectral embedding of unlabeled data with manifold Regularized Least Squares (RLS) to learn non-parametric kernels efficiently. The proposed algorithm enjoys a closed-form solution in each iteration, which can be computed efficiently by the Lanczos sparse eigen-decomposition technique, and it extends naturally to supervised kernel learning.
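     A minimal sketch of the spectral ingredient of this method: a kernel assembled from the smoothest eigenvectors of a k-NN graph Laplacian with a fast-decaying spectrum. In the proposed method the spectrum is learned by margin maximization through sparse eigen-decomposition subproblems; here it is fixed, so the sketch only illustrates the kernel parameterization:

```python
import numpy as np
from scipy.sparse import csgraph
from sklearn.neighbors import kneighbors_graph

def spectral_kernel(X, k=10, r=20, decay=2.0):
    W = kneighbors_graph(X, k, mode="connectivity", include_self=False)
    W = 0.5 * (W + W.T)                        # symmetric adjacency
    L = csgraph.laplacian(W, normed=True)
    vals, vecs = np.linalg.eigh(L.toarray())   # use Lanczos (scipy eigsh) at scale
    U, lam = vecs[:, :r], vals[:r]             # r smoothest graph eigenvectors
    mu = (1.0 + lam) ** (-decay)               # fixed fast-decaying spectrum
    return (U * mu) @ U.T                      # n x n data-dependent kernel
```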
     (4) Compared with traditional computational intelligence techniques such as the support vector machine (SVM), the extreme learning machine (ELM) provides better generalization performance at a much faster learning speed without tuning model parameters. To handle unlabeled data, we extend the manifold regularization framework and demonstrate the relationship between the extended MR framework and ELM. A manifold regularized extreme learning machine is derived from the proposed framework; it retains the properties of ELM, especially its advantages in large-scale data training.
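     A compact sketch of a manifold regularized ELM solver, assuming the common semi-supervised ELM objective (squared loss on labeled rows, Laplacian smoothness over all rows, a ridge penalty); the thesis's MRELM as derived from E-MR may differ in details:

```python
import numpy as np

def mrelm_fit(X, Y, labeled, L, n_hidden=200, lam=1e-2, gam=1e-3, seed=0):
    """X: (n, d) data; Y: (n, m) targets (zero rows for unlabeled);
    labeled: (n,) boolean mask; L: (n, n) dense graph Laplacian."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                       # random hidden layer, no tuning
    J = np.diag(labeled.astype(float))           # selects labeled rows
    A = H.T @ (J + lam * L) @ H + gam * np.eye(n_hidden)
    beta = np.linalg.solve(A, H.T @ J @ Y)       # unified analytical solution
    return W, b, beta

def mrelm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because beta solves only an n_hidden-by-n_hidden linear system, training avoids kernel evaluations entirely, which is where the speed advantage over kernel classifiers comes from.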
     This thesis mainly studies three aspects: classification methods for class-imbalanced samples, classification methods based on non-parametric kernel optimization, and fast semi-supervised classification. Building on the related preparatory work, several classification and learning models are established and new algorithms are designed. Experiments on benchmark data sets and face data sets validate the effectiveness and efficiency of the proposed algorithms. The results enrich the ways of solving classification problems and have theoretical significance and good application prospects.