Cluster Analysis and Its Applications in Image Processing
Abstract
As an unsupervised learning method, cluster analysis is one of the most important research directions in machine learning. In recent years, data clustering has developed vigorously, and cluster analysis has been successfully applied in numerous fields, including image processing, text mining, market research, and bioinformatics.
     In this dissertation, we focus on two key problems of cluster analysis: similarity measurement, and the design and application of clustering algorithms. The goal of clustering is to discover sets of similar objects, so how to define and compute similarity is crucial. Based on the existing Gaussian kernel similarity model, we propose a new similarity model; we also discuss how the features used affect the similarity measure, and introduce the intrinsic dimension as a new feature to improve it. On the algorithmic side, designing fast and effective clustering algorithms for different clustering problems is essential. We discuss the advantages and disadvantages of existing clustering methods and propose a fast similarity-matrix-based clustering algorithm that is applicable to image segmentation. Considering that most real images are contaminated by noise, we also propose a sparse-representation-based denoising algorithm for mixed noise removal, in order to reduce the effect of noise on image segmentation and subsequent image analysis.
     The main contributions of this dissertation are as follows:
     (1) A weighted self-adaptive Gaussian kernel similarity measure is proposed. The traditional Gaussian kernel similarity measure suits data sets whose clusters have similar densities, and it is not robust against outliers. Considering that real data sets usually contain outliers and clusters with different densities, we propose a new robust Gaussian kernel similarity measure. Building on the existing self-adaptive Gaussian kernel similarity, the new measure assigns each data point a weight according to its neighborhood information, giving outliers small weights so as to reduce their similarities to other points. Experimental results show that the proposed measure better describes both intra-cluster and inter-cluster similarities, leading to better clustering results.
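The idea above can be sketched as follows. This is a minimal illustration, not the dissertation's exact formula: the local scale follows the self-tuning heuristic (distance to the k-th nearest neighbor), and the outlier weight is one simple choice (inverse mean distance to the k nearest neighbors, normalized); the actual weighting scheme may differ.

```python
import numpy as np

def weighted_gaussian_similarity(X, k=3):
    """Weighted self-adaptive Gaussian similarity (illustrative sketch).

    sigma_i = distance from point i to its k-th nearest neighbor (local scale);
    w_i     = a weight in (0, 1] that shrinks for points far from their
              neighbors, down-weighting likely outliers.
    """
    # pairwise Euclidean distance matrix
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    D_sorted = np.sort(D, axis=1)          # column 0 is the self-distance 0
    sigma = D_sorted[:, k]                 # local scale per point
    # outlier weight: inverse of the mean k-NN distance, normalized so that
    # the densest point gets weight 1 and isolated points get small weights
    knn_mean = D_sorted[:, 1:k + 1].mean(axis=1)
    w = knn_mean.min() / knn_mean
    # self-tuning Gaussian kernel, scaled by both endpoints' weights
    S = w[:, None] * w[None, :] * np.exp(-D**2 / (sigma[:, None] * sigma[None, :]))
    np.fill_diagonal(S, 0.0)
    return S
```

On two tight clusters plus an isolated point, the isolated point's similarities to everything are driven toward zero by its small weight, while within-cluster similarities stay large.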
     (2) A novel similarity measure based on intrinsic dimension is presented. A similarity measure depends not only on the similarity model but also on the data features. Each cluster can be regarded as a sub-manifold, and the data points can be partitioned by defining a new feature that reflects the topological structure of the manifolds. Intrinsic dimension can be used to distinguish manifolds: data points on the same manifold are expected to share the same intrinsic dimension, while points with different intrinsic dimensions can usually be assumed to lie on different manifolds. The intrinsic dimension of each data point is estimated from its neighborhood information and used, together with the original features, to compute a new similarity. Experimental results show that clustering with the new similarity outperforms clustering with a similarity built on either the intrinsic dimension or the original features alone.
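A per-point intrinsic-dimension feature of this kind can be estimated from nearest-neighbor distances; the sketch below uses the Levina-Bickel maximum-likelihood estimator, one of several estimators surveyed in the literature and chosen here only for illustration.

```python
import numpy as np

def local_intrinsic_dimension(X, k=10):
    """Per-point intrinsic dimension via the Levina-Bickel MLE:
    the inverse mean log-ratio of the k-th nearest-neighbor distance
    to the j-th nearest-neighbor distances (j = 1 .. k-1)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    D_sorted = np.sort(D, axis=1)[:, 1:k + 1]   # drop the self-distance
    T_k = D_sorted[:, -1:]                      # distance to the k-th neighbor
    return 1.0 / np.mean(np.log(T_k / D_sorted[:, :-1]), axis=1)
```

The estimate can then be appended to the original feature vector before the similarity is computed, so that points on a curve and points filling a surface, say, separate even when they overlap in the original space.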
     (3) For data sets with complex structure, it is difficult to obtain satisfactory clustering results by adjusting the similarity measure with unsupervised methods alone. Semi-supervised clustering employs a limited amount of labeled data to guide the clustering process and can thus achieve better results. This dissertation proposes a semi-supervised clustering method based on the affinity propagation algorithm. Affinity propagation is a similarity-matrix-based clustering algorithm, and its performance can be improved by adjusting the similarity matrix according to known labels or pairwise constraints. Experimental results show that, by adding a small number of pairwise constraints, the semi-supervised affinity propagation method improves considerably on the clustering accuracy of the unsupervised algorithm.
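The constraint-driven adjustment can be sketched as follows: must-link pairs are raised to the largest similarity in the matrix and cannot-link pairs lowered to the smallest, before the matrix is handed to affinity propagation. This is one common scheme, assumed here for illustration; the dissertation's exact adjustment rule may differ.

```python
import numpy as np

def adjust_similarity(S, must_link=(), cannot_link=()):
    """Inject pairwise constraints into a similarity matrix before
    clustering (e.g. before running affinity propagation)."""
    S = S.copy()
    hi, lo = S.max(), S.min()
    for i, j in must_link:        # force these pairs together
        S[i, j] = S[j, i] = hi
    for i, j in cannot_link:      # force these pairs apart
        S[i, j] = S[j, i] = lo
    return S
```

Because affinity propagation works directly on the similarity matrix, no change to the message-passing procedure itself is needed; the prior knowledge enters entirely through the adjusted input.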
     (4) A novel similarity-matrix-based clustering algorithm is presented, inspired by Wittgenstein's notion of family resemblance. Existing similarity-matrix-based clustering algorithms either have high time complexity or require several parameters to be tuned. The new algorithm constructs an adjacency matrix from the similarity matrix and partitions the data by finding the connected components of the adjacency matrix. Compared with the commonly used spectral clustering methods, it needs no eigenvector computation, which greatly reduces the running time; moreover, once the similarity matrix is given, it has no parameters to set. Experimental results show that the proposed algorithm can be successfully applied to image segmentation, with encouraging results.
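The core of this scheme can be sketched in a few lines: threshold the similarity matrix into an adjacency matrix, then label each connected component as a cluster. One simplification in the sketch: the threshold is taken as an explicit argument, whereas the dissertation derives it automatically from the similarity matrix itself, which is what makes the method parameter-free.

```python
import numpy as np

def component_clustering(S, threshold):
    """Cluster by connected components of the thresholded similarity matrix."""
    n = S.shape[0]
    A = S >= threshold                  # boolean adjacency matrix
    labels = -np.ones(n, dtype=int)
    current = 0
    for start in range(n):
        if labels[start] >= 0:
            continue
        # flood-fill the component containing `start`
        stack = [start]
        labels[start] = current
        while stack:
            u = stack.pop()
            for v in np.flatnonzero(A[u]):
                if labels[v] < 0:
                    labels[v] = current
                    stack.append(v)
        current += 1
    return labels
```

Each point is visited once and each edge examined a constant number of times, so the traversal runs in time linear in the number of adjacency-matrix entries, with no eigendecomposition as in spectral clustering.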
     (5) To reduce the effect of noise on image segmentation and subsequent image analysis, we propose a sparse-representation-based denoising algorithm for mixed noise removal. The new algorithm effectively combines a median-type filter with dictionary learning and optimizes the proposed l1-l0 model via a three-phase method. It uses a double-sparsity representation to reconstruct the image twice, leading to enhanced restoration. Experimental results show a notable improvement over existing methods on both impulse noise and Gaussian-impulse mixed noise removal.
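The median-type detection phase can be illustrated as below: a pixel is flagged as a likely impulse when it differs from its local median by more than a tolerance. This is only the first stage of the pipeline, with the window size and tolerance chosen for illustration; the dictionary-learning reconstruction that follows in the dissertation is omitted here.

```python
import numpy as np

def detect_impulses(img, win=3, tol=20):
    """Flag likely impulse-noise pixels: those whose value differs from
    the median of their win x win neighborhood by more than `tol`."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    h, w = img.shape
    med = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            med[i, j] = np.median(padded[i:i + win, j:j + win])
    return np.abs(img - med) > tol
```

In the full algorithm, the flagged pixels are the ones penalized under the l0 (impulse) term of the l1-l0 model, while the remaining pixels carry the Gaussian-noise term.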
引文
[1]A. Jain. Data clustering:50 years beyond K-means. Pattern Recognition Letters,2010, 31(8):651-666.
    [2]J. Han, M. Kamber. Data mining concepts and technique. San Fransisco:Morgan Kaufmann, 2001.
    [3]B. Everitt, S. Landau, M. Leese. Cluster analysis. London:Arnold,2001.
    [4]A. Jain, M. Murty, P. Flynn. Data clustering:a review. ACM Computing Surveys,1999, 31(3):264-323.
    [5]P. Hansen, B. Jaumard. Cluster analysis and mathematical programming. Mathmatical Pro-gramming,1997,79(1-3):191-215.
    [6]A. Baraldi, P. Blonda. A survey of fuzzy clustering algorithms for pattern recognition-part Ⅰ and Ⅱ. IEEE Transactions on Systems, Man, and Cybernetics, Part B,1999,29(6):778-801.
    [7]R. Xu, D. Wunsch. Survey of clustering algorithms. IEEE Transactions on Neural Networks, 2005,16(3):645-678.
    [8]J. MacQueen. Some methods for classification and analysis of Multivariate observations. Pro-ceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability,1967. 281-297.
    [9]L. Kaufman, P. Rousseeuw. Finding groups in data:an introduction to cluster analysis. Hobo-ken:John Wilsy & Sons,1990.
    [10]R. Ng, J. Han. Efficient and effective clustering methods for spatial data mining. Proceedings of the 20th International Conference on Very Large Databases,1994.144-155.
    [11]B. Frey, D. Dueck. Clustering by passing messages between data points. Science,2007, 315(5814):972-976.
    [12]A. Jain, R. Dubes. Algorithms for clustering Data. Englewood Cliffs:NJ:Prentice-Hall,1988.
    [13]K. Wu, M. Yang. Alternative c-means clustering algorithms. Pattern Recognition,2002, 35(10):2267-2278.
    [14]Z. Huang. Extensions to the k-means algorithm for clustering large data sets with categorical values. International Journal of Data Mining and Knowledge Discovery,1998,2(3):283-304.
    [15]J. Huang, M. Ng, H. Rong, et al. Automated variable weighting in k-means type clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence,2005,27(5):657-668.
    [16]J. Huang, J. Xu, M. Ng, et al. Weighting method for feature selection in k-means. Computa-tional Methods of Feature Selection,Chapman&Hall/CRC.193-209.
    [17]R. Amorim, B. Mirkin. Minkowski metric, feature weighting and anomalous cluster initializing in k-means clustering. Pattern Recognition,2012,45(3):1061-1075.
    [18]Y. Chan, W. Ching, M. Ng, et al. An optimization algorithm for clustering using weighted dissimilarity measures. Pattern Recognition,2004,37(5):943-952.
    [19]L. Bai, J. Liang, C. Dang, et al. A novel attribute weighting algorithm for clustering high-dimensional categorical data. Pattern Recognition,2011,44(12):2843-2861.
    [20]J. Dunn. A fuzzy relative of the ISODATA process and tts use in detecting compact well-separated clusters. Journal of Cybernetics,1974,3(3):32-57.
    [21]J. Bezdek. Pattern recognition with fuzzy objective function algorithms. New York:Plenum Press,1981.
    [22]J. Bezdek. A convergence theorem for the fuzzy isodata clustering algorithm. IEEE Transac-tions on Pattern Analysis and Machine Intelligence,1980,1 (2):1-8.
    [23]Z. Huang, M. Ng. A fuzzy k-modes algorithm for clustering categorical data. IEEE Transac-tions on Fuzzy Systems,1999,7(4):446-452.
    [24]R. Hathaway, Y. Hu. Density-weighted fuzzy c-means clustering. IEEE Transactions on Fuzzy Systems,2009,17(1):243-252.
    [25]S. Eschrich, J. K. J, L. Hall, et al. Fast accurate fuzzy clustering through data reduction. IEEE Transactions on Fuzzy Systems,2003,11 (2):262-270.
    [26]M. Li, M. Ng, Y. Cheung, et al. Agglomerative fuzzy k-Means clustering algorithm with selection of number of clusters. IEEE Transactions on Knowledge and Data Engineering, 2008,20(11):1519-1534.
    [27]M. Yang. On a class of fuzzy classification maximum likelihood procedures. Fuzzy Set and Systems,1993,57(3):365-375.
    [28]J. Lin. Fuzzy clustering using a compensated fuzzy hopfield network. Neural Processing Letters,1999,10(1):35-48.
    [29]W. Pedrycz. Conditional fuzzy c-means. Pattern Recognition Letters,1996,17(6):625-632.
    [30]M. Menard, V. Courboulay, P. Dardignac. Possibilistic and probabilistic fuzzy clustering:Uni-fication within the framework of the non-extensive thermostatislics. Pattern Recognition,2003, 36(6):1325-1342.
    [31]R. Krishnapuram, J. Keller. A possibilistic approach to clustering. IEEE Transactions on Fuzzy Systems,1993,1(2):98-110.
    [32]R. Krishnapuram, J. Keller. The possibilistic c-means algorithm:Insights and recommenda-tions. IEEE Transactions on Fuzzy Systems,1996,4(3):385-393.
    [33]R. Hathaway, J. Bezdek, Y. Hu. Generalized fuzzy c-means clustering strategies using norm distance. IEEE Transactions on Fuzzy Systems,2000,8(5):576-582.
    [34]S. Krinidis, V. Chatzis. A robust fuzzy local information c-means clustering algorithm. IEEE Transactions on Image Processing,2010,19(5):1328-1337.
    [35]J. Yu. General c-means clustering model. IEEE Transactions on Pattern Analysis and Machine Intelligence,2005,27(8):1197-1211.
    [36]J. Yu, M. Yang. Optimality test for generalized FCM and its application to parameter selection. IEEE transactions on fuzzy systems,2005,13(1):164-176.
    [37]J. Yu, M. Yang. A generalized fuzzy clustering regularization model with optimality tests and model complexity analysis. IEEE transactions on fuzzy systems,2007,15(5):904-915.
    [38]R. Sibson. SLINK:an optimally efficient algorithm for the single-link cluster method. The Computer Journal (British Computer Society),1973,16(1):30-34.
    [39]D. Defays. An efficient algorithm for a complete link method. The Computer Journal (British Computer Society),1977,20(4):364-366.
    [40]J. Ward. Hierarchical grouping to optimize an objective function. Journal of the American Statistical Association,1963,58(301):236-244.
    [41]M. Girvan, M. Newman. Community structure in social and biological networks. Proceedings of the National Academy of Sciences,2002,99(12):782-786.
    [42]M. Ester, H. Kriegel, J. Sander, et al. A density-based algorithm for discovering clusters in large spatial databases with noise. Proceedings of the 2nd International Conference on Knowledge Discovery and Data Mining,1996.226-231.
    [43]M. Ankerst, M. Breunig, H. Kriegel, et al. OPTICS:ordering points to identify the clustering structure. Proceedings of the ACM SIGMOD International Conference on Management of Data,1999.49-60.
    [44]A. Hinneburg, D. Keim. An efficient approach to clustering in large multimedia databases with noise. Proceedings of the 4th International Conference on Knowledge Discovery and Data Mining,1998.58-65.
    [45]Z. Wu, R.Leahy. An optimal graph theoretic approach to data clustering:theory and tts ap-plication to image segmentation. Proceedings of Advances in Neural Information Processing Systems,1993.1103-1113.
    [46]J. Shi, J. Malik. Normalized cuts and image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence,2000,22(8):888-905.
    [47]S. Sarkar, P. Soundararajan. Supervised learning of large perceptual organization:Graph spec-tral partitioning and learning automata. IEEE Transactions on Pattern Analysis and Machine Intelligence,2000,22(5):504-525.
    [48]L. Hagen, A. Kahng. New spectral methods for ratio cut partitioning and clustering. IEEE Transactions on Computed-Aided Desgin,1992,11 (9):1074-1085.
    [49]C. Ding, X. He, H. Zha. A min-max cut algorithm for graph partitioning and data clustering. Proceedings of the IEEE International Conference on Data Mining,2001.107-114.
    [50]X. Li, Z. Tian. Optimum cut-based clustering. Signal Processing,2007,87(11):2491-2502.
    [51]A. Ng, M. Jordan, Y. Weiss. On spectral clustering:analysis and algorithm. Proceedings of Advances in Neural Information Processing Systems,2001.849-856.
    [52]S. Kamvar, D. Klein, C. Manning. Spectral learning. Proceedings of the 18th International Joint Conference on Artificial Intelligence,2003.561-566.
    [53]Z. Tian, X. Li, Y. Ju. Spectral clustering based on matrix perturbation theory. Science in China Series F,2007,50(1):63-81.
    [54]L. Zelnik-Manor, P. Perona. Self-tuning spectral clustering. In:L. Saul, Y. Weiss, L. Bottou, (eds.). Proceedings of Advances in Neural Information Processing Systems,2004.1601-1608.
    [55]H. Chang, D. Yeung. Robust path-based spectral clustering with application to image seg-mentation. Proceedings of the 10th IEEE International Conference on Computer Vision,2005. 278-285.
    [56]B. Fischer, T. Zoller, J. Buhmann. Path based pairwise data clustering with application to texture segmentation. Energy Minimization Methods in Computer Vision and Pattern Recog-nition,2134:235-250.
    [57]M. Bilenko, S. Basu, R. Mooney. Integrating constraints and metric learning in semi-supervised clustering. Proceedings of the 21st International Conference on Machine Learning,2004.81-88.
    [58]K. Wagstaff, C. Cardie. Clustering with instance-level constraints. Proceedings of the 17th International Conference on Machine Learning,2000.1103-1110.
    [59]Q. Xu, M. DesJardins, K. Wagstaf. Constrained spectral clustering under a local proximity structure assumption. Proceedings of the 18th International Florida Artificial Intelligence Re-search Society Conference,2005.866-867.
    [60]王玲,薄列峰,焦李成.密度敏感的半监督谱聚类.软件学报,2007,18(10):2412-2422.
    [61]E. Xing, A. Ng, M. Jordan, et al. Distance metric learning with application to clustering with side-information. Proceedings of Advances in Neural Information Processing Systems,2003. 505-512.
    [62]G. Medioni and M.S. Lee and C.K. Tang. A computational framework for segmentation and grouping. Elsevier Science B.V.,2000.
    [63]章毓晋.图像分制.北京:科学出版社,2001.
    [64]Y. Zhang. New advancements in image segmentation for CBIR. Encyclopedia of Information Science and Technology,4:2105-2109.
    [65]Y. Zhang. Advances in image and video segmentation. USA:IRM Press,2006.
    [66]G. Coleman, H. Andrews. Image segmentation by clustering. Proceedings of the IEEE,1979, 67(5):773-785.
    [67]T. Pappas. An adaptive clustering algorithm for image segmentation. IEEE Transactions on Image Processing,1992,40(4):901-914.
    [68]Y. Tolias, S. Panas. Image segmentation by a fuzzy clustering algorithm using adaptive spatially constrained membership functions. IEEE Transactions on Systems, Man, and Cybernetics, 1998,28(3):359-369.
    [69]J. Noordam, W. Broek, L. Buydens. Geometrically guided fuzzy C-means clustering for mul-tivariate image segmentation. Proceedings of the 15th International Conference on Pattern Recognition,2000.462-465.
    [70]M. Yang, Y. Hu, K. Lin, et al. Segmentation techniques for tissue differentiation in MRI of ophthalmology using fuzzy clustering algorithms. Magnetic Resonance Imaging,2002, 20(2):173-179.
    [71]D. Comaniciu, P. Meer. Mean shift:a robust approach toward feature space analysis. IEEE Transactions on Pattern analysis and Machine Intelligence,2002,24(5):603-619.
    [72]I. Cox, S. Rao, Y. Zhong. "Ratio regions":a technique for image segmentation. Proceedings of the 13th International Conference on Pattern Recognition,1996.557-564.
    [73]S. Wang, J. Siskind. Image segmentation with minimum mean cut. Proceedings of the 8th IEEE International Conference on Computer Vision,2001.517-524.
    [74]S. Wang, J. Siskind. Image segmentation with ratio cut. IEEE Transactions on Pattern Analysis and Machine Intelligence,2003,25(6):675-690.
    [75]T. Cour, F. Benezit, J. Shi. Spectral segmentation with multiscale graph decomposition. Pro-ceedings of the IEEE Conference on Computer Vision and Pattern Recognition,2005.1124-1131.
    [76]I. Dhillon, Y. Guan, B. Kulis. Weighted graph cuts without eigenvectors:A multilevel ap-proach. IEEE Transactions on Pattern Analysis and Machine Intelligence,2007,29(11):1944-1957.
    [77]B. Sumengen, B. Manjunath. Graph partitioning active contours (GPAC) for image segmenta-tion. IEEE Transactions on Pattern Analysis and Machine Intelligence,2006,28(4):509-521.
    [78]L. Bertelli, B. Sumengen, B. Manjunath, et al. A variational framework for multiregion pairwise-similarity-based image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence,2008,30(8):1400-1413.
    [79]M. Ahmed, S. Yamany, N. Mohamed, et al. A modified fuzzy c-means algorithm for bias field estimation and segmentation of MRI data. IEEE Transactions on Medical Imaging,2002, 21(3):193-199.
    [80]S. Chen, D. Zhang. Robust image segmentation using FCM with spatial constraints based on new kernel-induced distance measure. IEEE Transactions on Systems, Man, and Cybernetics, Part B:Cybernetics,2004,34(4):1907-1916.
    [81]L. Szilagyi, Z. Benyo, S. Szilagyii, et al. MR brain image segmentation using an enhanced fuzzy C-means algorithm. Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society,2003.724-726.
    [82]W. Cai, S. Chen, D. Zhang. Fast and robust fuzzy c-means clustering algorithms incorporating local information for image segmentation. Pattern Recognition,2007,40(3):825-838.
    [83]L. Rudin, S. Osher, E. Fatemi. Nonlinear total variation based noise removal algorithms. Phys-ica D,1992,60:259-268.
    [84]T. Chan, K. Chen. An optimization-based multilevel algorithm for total variation image de-noising. Multiscale Modeling & Simulation,2006,5(2):615-645.
    [85]T. Chan, S. Esedoglu, F. Park, et al. Recent developments in total variation image restoration. In Mathematical Models of Computer Vision,2005..
    [86]M. Figueiredo, J. Bioucas-Dias, R. Nowak. Majorization-minimization algorithms for wavelet-based image restoration. IEEE Transactions on Image Processing,2007,16(12):2980-2991.
    [87]F.Luisier, T. Blu, M. Unser. A new SURE approach to image denoising:interscale orthonormal wavelet thresholding. IEEE Transactions on Image Processing,2007,16(3):593-606.
    [88]C. Bouman, K. Sauer. On discontinuity-adaptive smoothness priors in computer vision. IEEE Transactions on Pattern Analysis and Machine Intelligence,1995,17(6):579-586.
    [89]T. Chan, H. Zhou, R. Chan. Continuation method for total variation denoising problems. Ad-vanced Signal Processing Algorithms,1995,2563(1):314-325.
    [90]P. Charbonnier, L. Blanc-Feraud, G. Aubert, et al. Deterministic edge-preserving regularization in computed imaging. IEEE Transactions on Image Processing,1997,6(2):298-311.
    [91]C. Vogel, M. Oman. Fast, robust total variation-based reconstruction of noisy, blurred images. IEEE Transactions on Image Processing,1998,7(6):813-824.
    [92]M. Nikolova. A variational approach to remove outliers and impulse noise. Journal of Mathe-matical Imaging and Vision,2004,20(1-2):99-120.
    [93]M. Aharon, M. Elad, A. Bruckstein. K-SVD:an algorithm for designing of overcom-plete dictionaries for sparse representation. IEEE Transactions on Image Processing,2006, 54(11):4311-4322.
    [94]M. Elad, M. Aharon. Image denoising via sparse and redundant representations over learned dictionaries. IEEE Transactions on Image Processing,2006,15(12):3736-3745.
    [95]A. Gersho, R. Gray. Vector quantization and signal compression. Norwell:Kluwer Academic, 1991.
    [96]K. Engan, S. Aase, J. H. Hus. Multi-frame compression:theory and design. EURASIP Signal Processing,2000,80(10):2121-2140.
    [97]K. Kreutz-Delgado, J. Murray, B. Rao, et al. Dictionary learning algorithms for sparse repre-sentation. Neural Computation,2003,15(2):349-396.
    [98]K. Dabov, A. Foi, V. Katkovnik, et al. Image denoising by sparse 3-D transform-domain collaborative filtering. IEEE Transactions on Image Processing,2007,16(8):2080-2095.
    [99]M. Protter, M. Elad. Image sequence denoising via sparse and redundant representations. IEEE Transactions on Image Processing,2009,18(1):27-36.
    [100]J. Turek, I. Yavneh, M. Elad. On MMSE and MAP denoising under sparse representation mod-eling over a unitary dictionary. IEEE Transactions on Signal Processing,2011,59(8):3526-3535.
    [101]E. Backer, A. Jain. A clustering performance measure based on fuzzy set decomposition. IEEE Transactions on Pattern Analysis and Machine Intelligence,1981,3(1):66-75.
    [102]刘忠伟.十种基于颜色特征的图像检索算法的比较和分析.信号处理,2000,3(16):79-84.
    [103]S. Dambreville, Y. Rathi, A. Tannenbaum. A framework for image segmentation using shape models and kernel space shape priors. IEEE Transactions on Pattern Analysis and Machine Intelligence,2008,30(8):1385-1399.
    [104]Y. Lin, J. Wu. A depth information based fast mode decision algorithm for color plus depth-map 3D videos. IEEE Transactions on Broadcasting,2011,57(2):542-550.
    [105]M. Li, X. Chen, X. Li, et al. The similarity metric. IEEE Transactions on Information Theory, 2004,50(12):3250-3264.
    [106]R. Shepard. The analysis of proximities:Multidimensional scaling with an unknown distance function Ⅰ. Psychometrika,1962,27(2):125-140.
    [107]R. Shepard. The analysis of proximities:Multidimensional scaling with an unknown distance function Ⅱ. Psychometrika,1962,27(3):219-246.
    [108]M. Deza, E. Deza. Encyclopedia of distance. New York:Springer,2009.
    [109]A. Tversky. Features of similarity. Psychological Review,1977,84(2):327-352.
    [110]A. Kolmogorov. Foundations of the theory of probability. New Yori:Chelsea Publishing Company,1956.
    [111]R. Hamming. Error detecting and error correcting codes. The Bell System Technical Journal, 1950,29(2):147-160.
    [112]V. Levenshtein. Binary codes capable of correcting deletions, insertions, and reversals. Soviet Physics Doklady,1966,10(8):707-710.
    [113]C. Kemp, A. Bernstein, J. Tenenbaum. A generative theory of similarity. Proceedings of the 27th Annual Conference of the Cognitive Science Society,2005.
    [114]J. Gower. A general coefficient of similarity and some of tts properties. Biometrics,1971, 27(4):857-874.
    [115]P. Tan, M. Steinbach, V. Kumar. Introduction to data mining. Boston:Addison-Wesley,2006.
    [116]N. Pal, J. Bezdek. On cluster validity for the fuzzy c-means model. IEEE transactions on fuzzy systems,1995,3(3):370-379.
    [117]J. Zhang, Y. Leung. Improved possibilistic c-means clustering algorithm. IEEE Transactions on Fuzzy Systems,2004,12(2):209-217.
    [118]M. Yang, C. Lai. A robust automatic merging possibilistic clustering method. IEEE Transac-tions on Fuzzy Systems,2011,19(1):26-41.
    [119]M. Newman. Fast algorithm for detecting community structure in networks. Physical Review E-Statistical, Nonlinear, and Soft Matter Physics,2004,69(066133):1-5.
    [120]X. Zhou, X. Wang. An overview of algorithms for analyzing community structure in complex networks. Complex Systoms and Complexity Science,2005,2(3):1-12.
    [121]W. Donath, A. Hoffman. Lower bounds for the partitioning of graphs. IBM Journal of Research and Development,1973,17(5):420-425.
    [122]M. Fiedler. Algebraic connectivity of graphs. Czechoslovak Mathematical Journal,1973, 23(98):298-305.
    [123]F. Chung. Spectral graph theory(Vol.92 of the CBMS Regional Conference Series in Mathe-matics). Washington:Conference Board of the Mathematical Sciences,1997.
    [124]D. Zhou, O. Bousquet, T. Lal, et al. Learning with local and global consistency. Proceedings of Advances in Neural Information Processing Systems,2004.321-328.
    [125]B. Mohar. The laplacian spectrum of graphs. Proceedings of Graph Theory, Combinatorics, and Applications, volume 2. Wiley,1991.871-898.
    [126]B. Mohar. Some applications of Laplace eigenvalues of graphs. Proceedings of Graph Symme-try:Algebraic Methods and Applications Volume 197 of NATO ASI Series C. Kluwer,1997. 225-275.
    [127]C. Alpert, S. Yao. Spectral partitioning:the more eigenvectors, the better. Proceedings of the 32nd Conference on Design Automation,1995.195-200.
    [128]C. Alpert, A. Kahng, S. Yao. Spectral partitioning with multiple eigenvectors. Discrete Applied Mathematics,1999,90(1-3):3-26.
    [129]J. Malik, S. Belongie, T. Leung, et al. Contour and texture analysis for image segmentation. International Journal of Computer Vision,2001,43(1):7-27.
    [130]Q. Zhang, L. Couloigner. A new and efficient k-medoid algorithm for spatial clustering. Com-putational Science and Its Applications,3482:207-224.
    [131]R. Shepard. Toward a universal law of generalization for psychological science. Science,1987, 237(4820):1317-1323.
    [132]S. Santini, R. Jain. Similarity measures. IEEE Transactions on Pattern Analysis and Machine Intelligence,1999,21 (9):871-883.
    [133]D. Navarro, A. Perfors. Similarity, feature discovery, and the size principle. Acta Psychologica, 2010,133(3):256-268.
    [134]H. Cheng, Z. Liu, J. Yang. Sparsity induced similarity measure for label propagation. Proceed-ings of the 12th IEEE International Conference on Computer Vision,2009.317-324.
    [135]H. Jiang, C. Ngo, H. Tan. Gestalt-based feature similarity measure in trademark database. Pattern Recognition,2006,39(5):988-1001.
    [136]C. Liu. The bayes decision rule induced similarity measures. IEEE Transactions on Pattern Analysis and Machine Intelligence,2007,29(6):1086-1090.
    [137]X. Zhang, J. Li, H. Yu. Local density adaptive similarity measurement for spectral clustering. Pattern Recognition Letters,2011,32(2):352-358.
    [138]J. Santos, J. Sa, L. Alexandre. LEGClust-a clustering algorithm based on layered entropic subgraphs. IEEE Transactions on Pattern Analysis and Machine Intelligence,2008,1(30):62-75.
    [139]X. Bai, X. Yang, L. Latecki, et al. Learning context-sensitive shape similarity by graph trans-duction. IEEE Transactions on Pattern Analysis and Machine Intelligence,2010,32(5):861-874.
    [140]J. Wang, Y. Li, X. Bai, et al. Learning context sentitive similarity by shortest path propagation. Pattern Recognition,44(10):2367-2374.
    [141]S. Watanabe. Knowing and guessing:a quantitative study of inference and information. New York:Wiley.376-377.
    [142]谷瑞军,叶宾,须文波.一种改进的谱聚类算法.计算机研究与发展,2007,44:145-149.
    [143]F. Camastra. Data dimensionality estimation methods:A survey. Pattern Recognition,2003, 36(12):2945-2954.
    [144]B. Kegl. Intrinsic dimension estimation using packing numbers. Proceedings of the Advances in Neural Information Processing Systems. MIT Press,2002.681-688.
    [145]J. Costa, A. Hero. Geodesic entropic graphs for dimension and entropy estimation in manifold learning. IEEE Transactions on Signal Processing,2004,52(8):231-252.
    [146]M. Fan, H. Qiao, B. Zhang. Intrinsic dimension estimation of manifolds by incising balls. Patter Recoginition,2009,42(5):780-787.
    [147]B. Eriksson, M. Crovella. Estimation of inrinsic dimension via clustering. Technical report, Department of Computer Science. Boston University, June 6,2011.
    [148]K. Fukunaga, D. Olsen. An algorithm for finding intrinsic dimensionality of data. IEEE Transactions on Computers,1971, C-20(2):176-183.
    [149]K. Pettis, T. Bailey, A. Jain, et al. An intrinsic dimensionality estimator from near-neighbor information. IEEE Transactions on Pattern Analysis and Machine Intelligence,1979,1 (1):25-36.
    [150]J. Costa, A. Girotra, A. Hero. Estimating local intrinsic dimension with k-nearest neighbor graphs. Proceedings of IEEE Workshop on Statistical Signal Processing,2005.417-422.
    [151]E. Levina, P. Bickel. Maximum likelihood estimation of intrinsic dimension. Proceedings of Advances in Neural Information Processing Systems,2004.777-784.
    [152]K. Carter, R. Raich, A. Hero. On local intrinsic dimension estimation and its applications. IEEE Transactions on Signal Processing,2010,58(2):650-663.
    [153]Http://archive.ics.uci.edu/ml/datasets.html.
    [154]P. Petland. Fractal-based description of natural scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence,1984,6(6):661-674.
    [155]B. Chaudhuri, N. Sarkar. Texture segmentation using fractal dimension. IEEE Transactions on Pattern Analysis and Machine Intelligence,1995,17(1):72-77.
    [156]Http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/segbench/BSDS300/html /dataset/images.html.
    [157]A. Demiriz, K. Benneit, M. Rmbrechts. Semi-supervised clustering using genetic algorithm. Proceedings of the Intelligent Engineering Systems Through Artificial Neural Networks,1999. 809-814.
    [158]S. Basu, M. Bilenko, R. Mooney. A Probabilistic Framework for Semi-Supervised Clustering. Proceedings of the 10th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining,2004.59-68.
    [159]K. Wagstaff, C. Cardie, S. Rogers, et al. Constrained k-means clustering with background knowledge. Proceedings of the 18th International Conference on Machine Learning,2001. 577-584.
    [160]S. Basu, A. Banerjee, R. Mooney. Semi-supervised clustering by seeding. Proceedings of the 19th International Conference on Machine Learning,2002.27-34.
    [161]D. Klein, S. Kamvar, C. Manning. From instance-level constraints to space-level constraints: making the most of prior knowledge in data clustering. Proceedings of the 19th International Conference on Machine Learning,2002.307-314.
    [162]M. Schultz, T. Joachims. Learning a distance metric from relative comparisons. Proceedings of Advances in Neural Information Processing Systems,2003.41-48.
    [163]A. Bar-Hillel, T. Hertz, N. Shental, et al. Learning distance functions using equivalence relations. Proceedings of the 20th International Conference on Machine Learning,2003.11-18.
    [164]W. Tang, H. Xiong, S. Zhong, et al. Enhancing semi-supervised clustering:a feature projection perspective. Proceedings of the 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining,2007.707-716.
    [165]C. Eick, N. Zeidat, Z. Zhao. Supervised clustering-algorithms and benefits. Proceedings of the 16th IEEE International Conference on Tools with Artificial Intelligence,2004.774-776.
    [166]M. Dettling, P. Buhlmann. Supervised clustering of genes. Genome Biology,2002,3(12):1-15.
    [167]U. von Luxburg. A tutorial on spectral clustering. Technical Report No. 149, Max Planck Institute for Biological Cybernetics,2006.
    [168]L. Wittgenstein. Philosophical Investigations. Oxford:Basil Blackwell,1958.
    [169]R. Shepard. Toward a universal law of generalization for psychological science. Science,1987,237(4820):1317-1323.
    [170]M. Sezgin, B. Sankur. Survey over image thresholding techniques and quantitative performance evaluation. Journal of Electronic Imaging,2004,13(1):146-165.
    [171]N. Otsu. A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man and Cybernetics,1979,9(1):62-66.
    [172]R. Tarjan. Depth first search and linear graph algorithms. SIAM Journal on Computing,1972, 1(2):146-160.
    [173]G. Lakoff. Women, fire, and dangerous things. Chicago:The University of Chicago Press, 1990.
    [174]http://www.cs.ust.hk/hongch/image-segmentation.html.
    [175]http://www.seas.upenn.edu/~timothee/software/ncut_multiscale/ncut_multiscale.html.
    [176]X. Ren, J. Malik. Learning a classification model for segmentation. Proceedings of the 9th IEEE International Conference on Computer Vision,2003.10-17.
    [177]G. Mori, X. Ren, A. Efros, et al. Recovering human body configurations:combining segmentation and recognition. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition,2004.326-333.
    [178]G. Mori. Guiding model search using segmentation. Proceedings of the 10th IEEE International Conference on Computer Vision,2005.1417-1423.
    [179]W. Tao, H. Jin, Y. Zhang. Color image segmentation based on mean shift and normalized cuts. IEEE Transactions on Systems, Man and Cybernetics-Part B:Cybernetics,2007,37(2):1382-1389.
    [180]A. Levinshtein, A. Stere, K. N. Kutulakos, et al. TurboPixels:fast superpixels using geometric flows. IEEE Transactions on Pattern Analysis and Machine Intelligence,2009,31(12):2290-2297.
    [181]O. Veksler, Y. Boykov. Superpixels and supervoxels in an energy optimization framework. Proceedings of the 11th European Conference on Computer Vision,2010.211-224.
    [182]A. Moore, S. Prince, J. Warrell, et al. Superpixel lattices. Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition,2008.1-8.
    [183]A. Moore, S. Prince, J. Warrell. "Lattice cut"-constructing superpixels using layer constraints. Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition,2010.2117-2124.
    [184]M. Liu, O. Tuzel, S. Ramalingam, et al. Entropy rate superpixel segmentation. Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition,2011.2097-2104.
    [185]J. Ning, L. Zhang, D. Zhang, et al. Interactive image segmentation by maximal similarity based region merging. Pattern Recognition,2010,43(2):445-456.
    [186]R. Gonzalez, R. Woods, S. Eddins. Digital Image Processing. Beijing:Publishing House of Electronics Industry,2008.
    [187]S. Morillas, V. Gregori, A. Hervas. Fuzzy peer groups for reducing mixed Gaussian-impulse noise from color images. IEEE Transactions on Image Processing,2009,18(1):1452-1466.
    [188]J. Yang, H. Wu. Mixed Gaussian and uniform impulse noise analysis using robust estimation for digital images. Proceedings of the 16th International Conference on Digital Signal Processing,2009.468-472.
    [189]E. Lopez-Rubio. Restoration of images corrupted by Gaussian and uniform impulsive noise. Pattern Recognition,2010,43(5):1835-1846.
    [190]L. Ji, Z. Yi. A mixed noise image filtering method using weighted-linking PCNNs. Neurocomputing,2008,71(13-15):2986-3000.
    [191]Y. Huang, M. Ng, Y. Wen. Fast image restoration methods for impulse and Gaussian noise removal. IEEE Signal Processing Letters,2009,16(6):457-460.
    [192]O. Ghita, P. Whelan. A new GVF-based image enhancement formulation for use in the presence of mixed noise. Pattern Recognition,2010,43(8):2646-2658.
    [193]C. Lin, J. Tsai, C. Chiu. Switching bilateral filter with a texture/noise detector for universal noise removal. IEEE Transactions on Image Processing,2010,19(9):2307-2320.
    [194]H. Kong, L. Guan. A neural network adaptive filter for the removal of impulse noise in digital images. Neural Networks,1996,9(3):993-1003.
    [195]A. Buades, B. Coll, J. Morel. A review of image denoising algorithms with a new one. Multi-scale Modeling & Simulation,2005,4(2):490-530.
    [196]J. Astola, P. Kuosmanen. Fundamentals of nonlinear digital filtering. Boca Raton:CRC Press,1997.
    [197]H. Hwang, R. Haddad. Adaptive median filters:new algorithms and results. IEEE Transactions on Image Processing,1995,4(4):499-502.
    [198]T. Chen, H. Wu. Space variant median filters for the restoration of the impulse noise corrupted images. IEEE Transactions on Circuits and Systems,2001,48(8):784-789.
    [199]G. Pok, J. Liu, A. Nair. Selective removal of impulse noise based on homogeneity level information. IEEE Transactions on Image Processing,2003,12(1):85-92.
    [200]H. Eng, K. Ma. Noise adaptive soft-switching median filter. IEEE Transactions on Image Processing,2001,10(2):242-251.
    [201]S. Wang, C. Wu. A new impulse detection and filtering method for removal of wide range impulse noises. Pattern Recognition,2009,42(9):2194-2202.
    [202]R. Chan, C. Ho, M. Nikolova. Salt-and-pepper noise removal by median-type noise detector and detail-preserving regularization. IEEE Transactions on Image Processing,2005,14(10):1479-1485.
    [203]R. Chan, C. Hu, M. Nikolova. An iterative procedure for removing random-valued impulse noise. IEEE Signal Processing Letters,2004,11(12):921-924.
    [204]J. Cai, R. Chan, M. Nikolova. Fast two-phase image deblurring under impulse noise. Journal of Mathematical Imaging and Vision,2010,36(1):46-53.
    [205]S. Chen, X. Yang, G. Cao. Impulse noise suppression with an augmentation of ordered difference noise detector and an adaptive variational method. Pattern Recognition Letters,2009,30(4):460-467.
    [206]J. Cai, R. Chan, M. Nikolova. Two-phase approach for deblurring images corrupted by impulse plus Gaussian noise. Inverse Problems and Imaging,2008,2(2):187-204.
    [207]Y. Pati, R. Rezaiifar, P. Krishnaprasad. Orthogonal matching pursuit:recursive function approximation with applications to wavelet decomposition. Proceedings of the 27th Asilomar Conference on Signals, Systems and Computers,1993.40-44.
    [208]S. Ko, Y. Lee. Center weighted median filters and their applications to image enhancement. IEEE Transactions on Circuits and Systems,1991,38(9):984-993.
    [209]E. Hale, W. Yin, Y. Zhang. Fixed-point continuation for l1-minimization:methodology and convergence. SIAM Journal on Optimization,2008,19(3):1107-1130.
    [210]T. Zeng, X. Li, M. Ng. Alternating minimization method for total variation based wavelet shrinkage model. Communications in Computational Physics,2010,8(5):976-994.
