Research and Applications of Kernel Methods in Classification, Regression, and Clustering
Abstract
In recent years, kernel methods have developed rapidly in pattern recognition and machine learning. The essence of a kernel method is to use a kernel function to map data from a low-dimensional input space to a high-dimensional feature space. In classification, for example, a kernel method can make data that are linearly inseparable in the input space linearly separable in the feature space.
     This dissertation studies four aspects of kernel methods: robust support vector regression, semi-supervised multi-label support vector learning, sparse support vector learning, and kernel clustering. Specifically, the work of this dissertation is as follows:
     For robust support vector regression, we propose an adaptive error-penalty support vector regression machine, AEPSVR, which reduces the adverse influence of outliers on support vector regression. Furthermore, we study the properties of the cost functions used in robust support vector regression, introduce a family of robust cost functions, and implement a fuzzy robust support vector regression machine, FRSVR. FRSVR is not only robust but can also identify outliers.
     For semi-supervised multi-label support vector learning, we study a semi-supervised multi-label support vector algorithm, SSML_SVM. SSML_SVM transforms the semi-supervised multi-label learning problem into a semi-supervised single-label one, classifies unlabeled samples according to the MAP (Maximum a Posteriori) principle, and solves the semi-supervised single-label problem iteratively. SSML_SVM exploits the information in unlabeled samples to improve the performance of multi-label learning.
     For sparse support vector learning, we give a direct sparse kernel regression machine, DSKR. DSKR adds a non-convex constraint to the ε-SVR support vector regression machine to bound the number of support vectors, and solves the resulting optimization problem by gradient descent. DSKR significantly reduces the number of support vectors, obtaining good fitting results with fewer of them.
     For kernel clustering, we study two improved affinity propagation clustering algorithms, SSKAPC and AFAPC. SSKAPC maps samples into a high-dimensional space with a kernel function and uses prior information to assist clustering, improving clustering accuracy. AFAPC is an affinity propagation clustering algorithm based on universal gravitation; it exploits information between neighboring samples to speed up clustering and achieves performance comparable to affinity propagation in much less running time.
     During his doctoral studies the author also worked on fake-image detection, studying a fake-image detection algorithm, BERFS. Taking a semantic perspective, BERFS identifies fake images from relative frequency-domain features and semantic features; it not only detects fake images but also gives a good estimate of the blurred region.
In recent years, kernel methods have developed rapidly in the pattern recognition and machine learning communities. The essence of a kernel method is to map data from a low-dimensional input space to a high-dimensional feature space, which can improve the performance of a machine learning method. For example, a dataset that is not linearly separable in the input space may become linearly separable in the feature space under such a mapping. Several important problems in kernel methods remain open; among them, robust support vector regression, semi-supervised multi-label learning, sparse support vector learning, and kernel clustering are in particular need of solutions.
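The separability claim above can be demonstrated with a minimal sketch (not one of the dissertation's methods, just a standard illustration): a kernel perceptron learns XOR, which no linear classifier in the input space can, by operating on an RBF kernel matrix instead of the raw coordinates.

```python
import numpy as np

# XOR: not linearly separable in the input space
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])

def rbf(A, B, gamma=1.0):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(X, X)
alpha = np.zeros(len(X))            # dual coefficients of the kernel perceptron
for _ in range(10):                 # a few passes suffice on this tiny set
    for i in range(len(X)):
        if y[i] * np.sum(alpha * y * K[:, i]) <= 0:   # mistake-driven update
            alpha[i] += 1.0
pred = np.sign(K @ (alpha * y))
assert np.all(pred == y)            # separable once mapped by the RBF kernel
```

The same four points cannot be separated by any linear decision function on the raw inputs, which is the point of the feature-space mapping.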
     In this dissertation, these four problems are investigated. The contributions of this dissertation are as follows:
     Firstly, we propose an adaptive error-penalization support vector regression method named AEPSVR. AEPSVR reduces the effect of outliers and achieves improved generalization capability. Furthermore, we investigate the properties a cost function must have to yield robust support vector regression, and introduce a family of robust cost functions. Based on these cost functions, we implement a fuzzy robust support vector regression method called FRSVR, which is robust and can also be used to identify outliers.
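The dissertation's robust cost family is defined in the body of the thesis, not here; as a hedged illustration of what such a cost looks like, the following Huber-style loss (an illustrative stand-in, not FRSVR's actual cost) is quadratic for small ε-insensitive residuals but only linear for large ones, so a single outlier cannot dominate the fit.

```python
import numpy as np

def eps_insensitive(r, eps=0.1):
    """Standard ε-insensitive residual: zero inside the tube."""
    return np.maximum(np.abs(r) - eps, 0.0)

def robust_cost(r, eps=0.1, delta=1.0):
    """Huber-style robust cost: quadratic near zero, linear for
    large residuals, bounding the penalty rate on outliers."""
    a = eps_insensitive(r, eps)
    return np.where(a <= delta, 0.5 * a ** 2, delta * (a - 0.5 * delta))

# An outlier with residual 10 is penalized far less than quadratically:
big = np.array([10.0])
assert robust_cost(big)[0] < 0.5 * eps_insensitive(big)[0] ** 2
```

Because the linear branch has bounded slope, the gradient contribution of an outlier is capped, which is the mechanism behind outlier-resistant regression.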
     Secondly, for the semi-supervised multi-label support vector learning problem, we present a semi-supervised multi-label learning method named SSML_SVM, aimed at effective multi-label processing of gene expression data. SSML_SVM transforms semi-supervised multi-label learning into semi-supervised single-label learning by the PT4 method, labels unlabeled examples using the MAP (Maximum a Posteriori) principle together with the K-nearest-neighbor method, and finally solves the single-label learning problem using an SVM. The distinctive character of the proposed method is its efficient integration of SVM-based single-label learning with MAP and K-nearest-neighbor labeling.
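The PT4 transformation and the SVM stage are not reproduced here; the following is only a minimal sketch of the MAP-plus-kNN labeling step. Under a uniform class prior, the MAP class of an unlabeled sample reduces to the majority class among its k nearest labeled neighbors. The data and function names are illustrative, not SSML_SVM's actual implementation.

```python
import numpy as np

def knn_map_label(X_l, y_l, X_u, k=3):
    """Assign each unlabeled point the MAP class under a kNN vote
    (majority class of the k nearest labeled neighbors)."""
    labels = []
    for x in X_u:
        d = np.sum((X_l - x) ** 2, axis=1)          # squared distances
        nn = y_l[np.argsort(d)[:k]]                 # k nearest labels
        vals, counts = np.unique(nn, return_counts=True)
        labels.append(vals[np.argmax(counts)])      # argmax of the posterior
    return np.array(labels)

X_l = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
y_l = np.array([0, 0, 1, 1])
X_u = np.array([[1., 0.], [9., 10.]])
pseudo = knn_map_label(X_l, y_l, X_u)   # → array([0, 1])
```

In the full method, the pseudo-labeled samples would then be fed back into single-label SVM training, iterating until the labeling stabilizes.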
     Thirdly, we extend the direct sparse kernel learning framework to support vector regression and propose a direct sparse kernel regression method called DSKR. By adding a non-convex constraint to ε-SVR, DSKR obtains a sparse kernel regression with an arbitrary, user-defined number of support vectors. It achieves promising regression performance with fewer support vectors than ε-SVR.
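DSKR's exact formulation is in the dissertation body; the sketch below only illustrates the underlying idea, a cardinality ("at most k support vectors") constraint handled by gradient descent with a hard top-k projection. The squared loss and all names here are illustrative simplifications of the ε-insensitive objective.

```python
import numpy as np

X = np.linspace(0.0, 2.0 * np.pi, 40)
y = np.sin(X)
K = np.exp(-(X[:, None] - X[None, :]) ** 2)   # RBF Gram matrix, gamma = 1

k = 5                                 # hypothetical cap on support vectors
alpha = np.zeros(len(X))
lr = 0.005
for _ in range(1000):
    grad = K @ (K @ alpha - y)        # gradient of 0.5 * ||K @ alpha - y||^2
    alpha -= lr * grad
    small = np.argsort(np.abs(alpha))[:-k]
    alpha[small] = 0.0                # hard projection onto "at most k SVs"

loss = 0.5 * np.sum((K @ alpha - y) ** 2)
assert np.count_nonzero(alpha) <= k   # sparsity is enforced by construction
```

The projection step makes the problem non-convex, which matches the abstract's description: sparsity is imposed directly as a constraint rather than obtained as a by-product of the ε-tube.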
     Finally, we propose two improved kernel affinity propagation clustering methods called SSKAPC and AFAPC. The kernel trick is adopted to handle non-linear problems. SSKAPC extends affinity propagation clustering to the semi-supervised setting, in which background knowledge is provided as pairwise constraints to improve clustering performance. AFAPC obtains clusters and their centers by passing affinity messages through the data network, where the messages are derived from gravitational forces between data points. Experimental results demonstrate that the clustering accuracy of AFAPC is comparable with that of affinity propagation clustering, while its running time is much shorter.
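For reference, the message-passing core that both SSKAPC and AFAPC build on can be sketched in a few lines. This is plain affinity propagation (Frey and Dueck's responsibility/availability updates); the semi-supervised constraints and gravity-based messages of the two proposed methods are not reproduced here, and the toy data are illustrative.

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=200):
    """Plain affinity propagation on a similarity matrix S."""
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # Responsibilities: r(i,k) = s(i,k) - max_{k'!=k} [a(i,k') + s(i,k')]
        AS = A + S
        idx = np.argmax(AS, axis=1)
        first = AS[np.arange(n), idx].copy()
        AS[np.arange(n), idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = damping * R + (1 - damping) * Rnew
        # Availabilities: a(i,k) = min(0, r(k,k) + sum of positive r(i',k))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, np.diag(R))
        col = Rp.sum(axis=0)
        Anew = np.minimum(0, col[None, :] - Rp)
        np.fill_diagonal(Anew, col - np.diag(Rp))
        A = damping * A + (1 - damping) * Anew
    return np.argmax(A + R, axis=1)   # exemplar index for each point

x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
S = -(x[:, None] - x[None, :]) ** 2   # similarity: negative squared distance
np.fill_diagonal(S, np.median(S))     # preferences control exemplar count
labels = affinity_propagation(S)      # two groups emerge on this toy data
```

SSKAPC would replace the Euclidean similarity with a kernel-induced one and bias S by pairwise constraints; AFAPC would derive the messages from gravitational forces between neighboring points.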
     The author has also carried out research on image forensics and proposes a fake-image detection method named BERFS. BERFS identifies fake images with high accuracy using relative frequency features and semantic features, and can estimate the blurred region precisely.
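BERFS's actual relative frequency-domain feature is defined in the dissertation; the sketch below only illustrates the general principle it relies on, namely that blurred regions lose high-frequency energy. The block contents, the 3x3 box blur, and the quarter-spectrum cutoff are all illustrative choices, not the thesis's feature.

```python
import numpy as np

def high_freq_ratio(block):
    """Fraction of spectral energy outside the central low-frequency window."""
    F = np.fft.fftshift(np.fft.fft2(block))
    E = np.abs(F) ** 2
    h, w = E.shape
    ch, cw = h // 4, w // 4
    low = E[h // 2 - ch:h // 2 + ch, w // 2 - cw:w // 2 + cw].sum()
    return 1.0 - low / E.sum()

rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
# 3x3 box blur (with wrap-around) as a stand-in for a defocused region
blurred = sum(np.roll(np.roll(sharp, i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
assert high_freq_ratio(sharp) > high_freq_ratio(blurred)
```

Comparing this ratio across image blocks is one way a detector can localize the blurred region introduced when a spliced area is smoothed to hide its seams.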
