Research on Several Problems and Applications of Support Vector Machines
Abstract
Statistical learning theory (SLT) provides a solid theoretical foundation for studying machine learning problems with small samples. Built on the structural risk minimization principle, it integrates techniques from statistics, machine learning, and neural networks, and it effectively improves the generalization ability of learning algorithms while keeping the empirical risk small. The support vector machine (SVM) is a new and very powerful machine learning method that arose within this theoretical framework. It copes well with many practical difficulties that have long troubled other learning methods, such as small samples, nonlinearity, overfitting, high dimensionality, and local minima, and it therefore has good application potential and development prospects. As the leading theory for small-sample learning, SLT and the SVM are attracting ever wider attention and have become a new research focus in artificial intelligence and machine learning. This thesis first reviews the state of the art of SVM research and then studies several open problems: adjusting SVMs for unbalanced data; de-sampling and de-noising of large sample sets; combining the two support vector algorithms, the Support Vector Machine (SVM) and Support Vector Domain Description (SVDD); the properties and applications of core vectors; and the application of the SVDD algorithm to uncertain group decision making. The main work of this thesis is as follows:
     1. The adjustment of support vector machines for unbalanced data. Learning from unbalanced data sets is recognized as one of the difficult open problems in machine learning, and the difficulty stems mainly from the character of unbalanced data itself: the minority class has too few samples, so their distribution does not reflect the true distribution of the whole class. As a result, a standard SVM applied to an unbalanced data set tends to misclassify minority-class samples; the overall classification accuracy may be high, yet the accuracy on the minority class is very low. This thesis proposes an algorithm that adjusts the separating hyperplane between two unbalanced classes in an SVM. The algorithm uses the information provided by the projection distribution of the samples and the sample sizes to set the ratio of the two classes' penalty factors, which yields a new separating hyperplane. Experimental results show the good performance of the method (see sketch 1 after this list).
     2. De-sampling and de-noising of sample sets. Two problems arise when SVMs are used for classification: first, when the training samples of the two classes contain outliers (noise points), classification accuracy is low; second, for large-scale sample sets, a great deal of memory is required and training takes a long time. To address these problems, we analyze, using basic probability arguments, where outliers (noise points) and redundant samples typically lie and in what proportion, and we propose a de-sampling method based on the Euclidean distance and on the kernel distance, respectively. Experiments show that, compared with the standard SVM, the method keeps or improves classification accuracy; for large sample sets it not only preserves accuracy but also greatly increases classification speed, which makes it quite practical (see sketch 2 after this list).
     3. A classifier that combines SVM with support vector domain description. In the learning stage of an SVM all samples take part in training, so a large amount of memory and a long training time are needed; a support vector domain classifier (SVDC), by contrast, trains on only one class of samples, so its training time is short but its accuracy is low. To reduce the training time of the SVM and to improve the accuracy of the SVDC, we construct a new separating hyperplane, namely a separating hyperplane based on SVDD. The algorithm takes the classification information into account as a whole and realizes the combination of SVDD and SVM. Experimental results show the effectiveness of the method (see sketch 3 below).
     4. The concept of core vectors, and the application of core vector sets to improving SVMs. To extract class information from the samples effectively, we use the SVDD algorithm, with suitable parameter choices, to remove the support vectors and find the core vectors. To study the properties of core vectors, we describe the sample data with a linear kernel and with a radial basis kernel, and we prove theoretically that, under the corresponding parameters, a core vector has the maximal density value in the sample set; from this follows the important conclusion that a core vector carries the maximal amount of information. A core vector can therefore serve as an estimate of the expectation point of the samples, and core vectors can further be refined into control vectors that improve the classification performance of the SVM (see sketch 4 below).
     5. The application of support vector domain description to uncertain group decision making. We study the inverse judgment problem for fuzzy judgments and for interval judgments. For fuzzy judgments, taking fuzzy reciprocal judgment matrices as the standard, the SVDD algorithm is used to find the common information of the group, and each expert's weight is determined by the amount of information he or she contributes. For interval judgments, the interval judgment matrices are decomposed into point vectors, and the SVDD algorithm with a radial basis kernel is used to extract the common information of the group; expert weights are again determined by the information contribution. This research makes full use of the descriptive power of SVDD and captures the main information, so it is well suited to uncertain group decision problems. It not only broadens the field of SVDD research but also provides an effective technique for the study of uncertain group decision making (see sketch 5 below).
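
Sketch 1 (item 1). The adjustment amounts to giving the two classes different penalty factors in the soft-margin primal problem

\min_{w,b,\xi}\ \tfrac{1}{2}\|w\|^{2} + C_{+}\sum_{i:\,y_i=+1}\xi_i + C_{-}\sum_{i:\,y_i=-1}\xi_i \quad \text{s.t.}\quad y_i(\langle w,x_i\rangle+b)\ \ge\ 1-\xi_i,\ \ \xi_i\ge 0.

A minimal Python sketch follows, assuming scikit-learn; the simple inverse-size ratio C_+/C_- = n_-/n_+ used here is a hypothetical stand-in for the thesis's rule based on the projection distribution.

    # Hedged sketch: class-dependent penalty factors for an unbalanced SVM.
    # The inverse-size ratio below is a stand-in; the thesis derives the
    # ratio from the sample projection distribution and the sample sizes.
    import numpy as np
    from sklearn.svm import SVC

    def weighted_svm(X, y, C=1.0):
        n_pos, n_neg = np.sum(y == 1), np.sum(y == -1)
        # class_weight multiplies C per class: the minority class (assumed
        # here to be +1) receives the larger penalty factor.
        clf = SVC(kernel="rbf", C=C, class_weight={1: n_neg / n_pos, -1: 1.0})
        return clf.fit(X, y)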
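
Sketch 2 (item 2). A hedged illustration of Euclidean-distance de-sampling: per class, drop the points farthest from the class mean (candidate outliers or noise) and a share of the points nearest the mean (redundant interior points). The two fractions are illustrative placeholders, not the proportions the thesis derives from probability theory, and the kernel-distance variant is omitted.

    # Hedged sketch: Euclidean de-sampling of one class. noise_frac and
    # surplus_frac are placeholders for the thesis's derived proportions.
    import numpy as np

    def de_sample(X, noise_frac=0.05, surplus_frac=0.20):
        d = np.linalg.norm(X - X.mean(axis=0), axis=1)
        order = np.argsort(d)                    # nearest ... farthest
        n_noise = int(noise_frac * len(X))       # farthest: likely noise
        n_surplus = int(surplus_frac * len(X))   # nearest: likely redundant
        keep = order[n_surplus : len(X) - n_noise]
        return X[keep]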
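
Sketch 3 (item 3). The SVDC half of the combination, under the assumption that scikit-learn's OneClassSVM may stand in for SVDD (with an RBF kernel the two formulations are known to be equivalent): only the target class is trained, and test points inside the description receive the target label. The thesis's separating hyperplane built on the SVDD is not reproduced here.

    # Hedged sketch: an SVDC-style classifier. OneClassSVM with an RBF
    # kernel is equivalent to SVDD and stands in for it here.
    import numpy as np
    from sklearn.svm import OneClassSVM

    def fit_svdc(X_target, nu=0.1, gamma=0.5):
        # Train on one class only, as the SVDC described above does.
        return OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_target)

    def predict_svdc(dd, X_test):
        # Non-negative scores lie inside the domain description.
        return np.where(dd.decision_function(X_test) >= 0, 1, -1)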
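
Sketch 4 (item 4). The density property of a core vector suggests a simple illustration: with a radial basis kernel, take the sample whose kernel density value is largest as the core vector, i.e. as an estimate of the expectation point. This is only an assumed analogue; the thesis obtains core vectors through SVDD parameter selection after removing the support vectors.

    # Hedged sketch: the sample of maximal RBF kernel density serves as an
    # illustrative core vector (expectation-point estimate).
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    def core_vector(X, gamma=0.5):
        density = rbf_kernel(X, X, gamma=gamma).sum(axis=1)  # unnormalized KDE
        return X[np.argmax(density)]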
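
Sketch 5 (item 5). A hedged sketch of the weighting step for interval judgments: each expert's judgment matrix is decomposed (here simply flattened) into a point vector, and the expert's weight grows with the closeness of that vector to the group's center in the RBF feature space. Closeness to the center is a hypothetical proxy for the thesis's information-contribution measure.

    # Hedged sketch: weight experts by feature-space closeness of their
    # flattened judgment matrices to the group center. The proxy measure
    # is hypothetical; the thesis uses an information-contribution rule.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    def expert_weights(matrices, gamma=0.5):
        V = np.array([m.ravel() for m in matrices])   # point vectors
        K = rbf_kernel(V, V, gamma=gamma)
        n = len(V)
        # squared distance to the feature-space mean: 1 - 2*rowmean + grandmean
        d_sq = 1.0 - 2.0 * K.sum(axis=1) / n + K.sum() / n**2
        w = 1.0 / (d_sq + 1e-12)                      # nearer => heavier
        return w / w.sum()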
