Research on Graph-Based Semi-Supervised Learning and Dimensionality Reduction Methods and Their Applications (基于图的半监督学习和维数约简方法及其应用研究)
Abstract
Semi-supervised learning and dimensionality reduction have become hot research topics in machine learning. Semi-supervised learning studies how to classify data when only part of a dataset is labeled; this thesis focuses on graph-based semi-supervised learning methods and their applications. Dimensionality reduction transforms a dataset into a new one whose dimensionality equals the intrinsic dimensionality of the original data, while preserving the structure of the dataset as much as possible. This thesis presents a systematic study of graph-based semi-supervised learning and dimensionality reduction methods and their applications. Concretely, the main work is summarized as follows:
     (1) A new multi-step dimensionality reduction method for gene expression profiles is proposed. The Wilcoxon rank-sum test is first used to select differentially expressed genes; a fixed number of top-ranked genes are then transformed with the discrete cosine transform, and principal component analysis extracts principal components from the transformed coefficients. We also introduce graph-based semi-supervised methods into tumor classification and use graph-based semi-supervised learning algorithms to evaluate the classification performance of the extracted features.
     (2) A new adaptive edge-weight learning method is proposed for graph-based semi-supervised learning. Most traditional graph-based semi-supervised algorithms compute edge weights with a Gaussian function. We propose a novel edge-weight design that incorporates label information and uses the geodesic distance instead of the Euclidean distance between two samples. In addition, class prior information is incorporated, and the edge weights are tailored to the learning method based on local and global consistency. Experimental results show that the proposed method outperforms the original algorithm.
     (3) A supervised feature extraction method based on locality preserving projections, called locality preserving discriminant projections (LPDP), is proposed. Locality preserving projections (LPP) considers only local information and contains no discriminant information. We add the within-class and between-class scatter matrices to the LPP objective function, yielding LPDP, which maximizes the between-class distance and minimizes the within-class distance while retaining the locality preserving property of LPP. LPDP can be viewed as a new method combining the manifold criterion with the Fisher criterion. Compared with LPP, it therefore finds a subspace with better discriminant power, is better suited to classification, and effectively improves recognition rates.
     (4) Regularization parameter estimation methods are proposed for spectral regression discriminant analysis (SRDA) and spectral regression kernel discriminant analysis (SRKDA). The estimation of the regularization parameter of SRDA had not been well solved in previous work; we propose a new method based on the perturbation linear discriminant analysis (PLDA) criterion to estimate it. Likewise, regularization parameter estimation for SRKDA had not been addressed before; we propose two methods to estimate it. Experimental results on different datasets show that our methods are effective and feasible.
Recently, semi-supervised learning and dimensionality reduction have become hot topics in machine learning. The goal of semi-supervised learning is to learn from partially labeled data. This thesis focuses on graph-based semi-supervised learning. Dimensionality reduction transforms a dataset X of dimensionality D into a new dataset Y of dimensionality d, the intrinsic dimensionality of the data, while retaining the geometry of the data as much as possible. This thesis presents a thorough study of graph-based semi-supervised learning and dimensionality reduction methods. More concretely, the main work is summarized as follows:
     (1) Both supervised and unsupervised methods have been widely used for tumor classification based on gene expression profiles. This thesis introduces a graph-based semi-supervised method for tumor classification. Feature extraction plays a key role in this task and can greatly improve the performance of a classifier, so we propose a novel method for extracting tumor-related features: the Wilcoxon rank-sum test is first used for gene selection; gene ranking and the discrete cosine transform are then combined with principal component analysis for feature extraction; finally, the classification performance is evaluated with graph-based semi-supervised learning algorithms (a sketch of this pipeline is given below).
     (2) A modified version of the semi-supervised learning algorithm with local and global consistency is proposed (the baseline algorithm is sketched below). The new method incorporates label information and adopts the geodesic distance rather than the Euclidean distance to measure the dissimilarity between two data points. In addition, class prior knowledge is added to the cost function, and its effect is found to differ between high and low label rates. Experimental results show that the modified algorithm achieves better classification performance than the original one.
     (3) A new subspace learning algorithm called locality preserving discriminant projections (LPDP) is proposed by adding the maximum margin criterion (MMC) to the objective function of locality preserving projections (LPP); the combined criterion is written out below. LPDP retains the locality preserving property of LPP and exploits the label information in MMC, maximizing the between-class distance while minimizing the within-class distance. LPDP is thus a new method that combines the manifold criterion with the Fisher criterion; it has more discriminant power than LPP, which considers only local information, and is better suited to recognition tasks. Moreover, two tensorized (multilinear) forms of LPDP are derived, one iterative and one non-iterative. Finally, LPDP is applied to face and palmprint biometrics and evaluated on the Yale and ORL face databases and the PolyU palmprint database. Experimental results show the effectiveness of LPDP and demonstrate that it is a good choice for real-world biometric applications.
     (4) Spectral regression discriminant analysis (SRDA) and its kernel version SRKDA are recently proposed subspace learning methods, both of which have a free regularization parameter. How to set this parameter automatically had not been well solved: in SRDA it was simply fixed to a constant, which is clearly suboptimal. We develop a new algorithm that automatically estimates the regularization parameter of SRDA based on perturbation linear discriminant analysis (PLDA). We also propose two methods for estimating the regularization parameter of SRKDA: one is derived from the method of optimal regularization parameter estimation for SRDA (OR-SRDA), and the other uses a kernel version of PLDA (the role of this parameter is made explicit below). Experiments on different data sets demonstrate the effectiveness and feasibility of the proposed methods.
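For contribution (1), the sketch below illustrates the described multi-step feature extraction under stated assumptions: the expression matrix X has shape (samples, genes), y holds binary tumor labels, and the function name extract_features as well as the cut-offs n_genes_keep and n_pcs are illustrative choices, not values taken from the thesis.

```python
import numpy as np
from scipy.stats import ranksums
from scipy.fft import dct
from sklearn.decomposition import PCA

def extract_features(X, y, n_genes_keep=200, n_pcs=10):
    """Hypothetical sketch of the rank-sum test + DCT + PCA pipeline."""
    # 1) Wilcoxon rank-sum test: rank genes by evidence of differential expression.
    pvals = np.array([ranksums(X[y == 0, j], X[y == 1, j]).pvalue
                      for j in range(X.shape[1])])
    top = np.argsort(pvals)[:n_genes_keep]            # keep the top-ranked genes
    # 2) Discrete cosine transform of the selected genes, per sample.
    coeffs = dct(X[:, top], type=2, norm='ortho', axis=1)
    # 3) PCA on the DCT coefficients gives the final low-dimensional features,
    #    which are then handed to a graph-based semi-supervised classifier.
    return PCA(n_components=n_pcs).fit_transform(coeffs)
```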
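For contribution (2), the baseline being modified is learning with local and global consistency (LGC). The sketch below implements only the standard closed-form LGC solution; the thesis's modifications (geodesic distances, label-aware edge weights, class priors) are pointed out in comments but not reproduced, and sigma and alpha are illustrative values.

```python
import numpy as np
from scipy.spatial.distance import cdist

def lgc_propagate(X, y, n_classes, alpha=0.99, sigma=1.0):
    """Standard LGC label propagation; y[i] is a class index or -1 if unlabeled."""
    n = X.shape[0]
    d = cdist(X, X)                       # Euclidean distances; the thesis instead
                                          # uses geodesic (graph) distances and
                                          # label/prior-aware edge weights here.
    W = np.exp(-d**2 / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    Dinv = np.diag(1.0 / np.sqrt(W.sum(axis=1)))
    S = Dinv @ W @ Dinv                   # symmetrically normalized affinity
    Y = np.zeros((n, n_classes))
    Y[y >= 0, y[y >= 0]] = 1.0            # one-hot encoding of the labeled points
    # Closed-form solution F* = (I - alpha*S)^{-1} Y; the constant (1 - alpha)
    # factor is dropped because it does not change the argmax.
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)
    return F.argmax(axis=1)
```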
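For contribution (3), the ingredients of LPDP can be written compactly: LPP minimizes a locality-preserving term and MMC maximizes the gap between the between-class and within-class scatter. One plausible way to combine them into a single criterion, consistent with the description above though the exact weighting used in the thesis may differ, is

\[
\begin{aligned}
\text{LPP:}\quad & \min_{w}\ \sum_{i,j}\bigl(w^{\top}x_i - w^{\top}x_j\bigr)^{2} S_{ij} \;=\; \min_{w}\ w^{\top} X L X^{\top} w, \qquad L = D - S,\\
\text{MMC:}\quad & \max_{w}\ w^{\top}\bigl(S_b - S_w\bigr) w,\\
\text{LPDP:}\quad & \max_{w}\ w^{\top}\bigl(S_b - S_w - \beta\, X L X^{\top}\bigr) w \quad \text{s.t.}\ w^{\top}w = 1,
\end{aligned}
\]

where S is the LPP affinity matrix, D its degree matrix, S_b and S_w the between- and within-class scatter matrices, and \beta \ge 0 a trade-off parameter; the optimal projections are the leading eigenvectors of S_b - S_w - \beta X L X^{\top}.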
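For contribution (4), the regularization parameter enters SRDA and SRKDA through the ridge-regularized regression step of spectral regression. In the standard formulation, with graph-embedding responses \bar{y},

\[
\begin{aligned}
\text{SRDA:}\quad & a^{*} = \arg\min_{a}\ \sum_{i=1}^{n}\bigl(a^{\top}x_i - \bar{y}_i\bigr)^{2} + \lambda\,\lVert a\rVert^{2}
\;\;\Longrightarrow\;\; a^{*} = \bigl(X X^{\top} + \lambda I\bigr)^{-1} X\,\bar{y},\\
\text{SRKDA:}\quad & \bigl(K + \delta I\bigr)\,c = \bar{y}, \qquad f(x) = \sum_{i=1}^{n} c_i\, k(x_i, x),
\end{aligned}
\]

where X = [x_1, \dots, x_n], K is the kernel (Gram) matrix, and \lambda and \delta are the regularization parameters whose automatic, data-driven estimation via the PLDA-based criteria is the subject of this contribution.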
