Research and Applications of Machine Learning Methods Based on Tensor Data
Abstract
In traditional machine learning, most classical algorithms are designed for data represented in vector spaces. In practice, however, many real-world data are better represented as tensors. Simply flattening tensor data into vectors discards a large amount of structural information and therefore yields unsatisfactory learning results, so tensor-based machine learning methods have attracted great attention from researchers in recent years. Working with tensor-form data not only preserves its distinctive spatial structure, but also lets tensor learning methods effectively control the number of variables in the optimization problem, thereby overcoming the overfitting that frequently arises in vector-based learning. New machine learning methods for tensor data are now widely studied and applied, and have become a new research hotspot in data mining.
     This thesis studies learning from tensor data from the viewpoint of optimization, in particular the construction of new models for tensor learning problems, the corresponding optimization methods, and their application to practical problems. The support vector machine is an effective optimization-based method for mining vector-form data; taking it as the starting point, this thesis establishes new support tensor machine models, and algorithms for solving them, for various data mining problems on tensor data. The main research results are as follows:
     1. A completely new tensor learning framework, the low-rank support tensor machine, is established
     Building on statistical learning theory, this thesis discusses the limitations of the classical support tensor machine and support vector machine models. By breaking the rank-one restriction on the weight tensor of the classical support tensor machine, a new low-rank projection is introduced, which leads to a completely new tensor learning framework: the low-rank support tensor machine.
     2. Optimization algorithms for the low-rank support tensor machine are designed: a tensor gradient descent algorithm and a tensor two-step algorithm
     For solving the low-rank support tensor machine, two tensor optimization algorithms based on different ideas are studied. The tensor gradient descent algorithm computes the gradient with respect to all optimization variables at once, avoiding the extensive alternating iterations of traditional tensor algorithms and thus greatly improving the solution speed. The tensor two-step algorithm instead aims at a good approximate solution: by sequentially solving two sub-problems whose objectives and feasible regions are both simpler, it obtains an approximate solution of the original low-rank support tensor machine model.
     3. A low-rank support tensor machine for imbalanced data classification
     Based on the low-rank tensor learning idea proposed here, the classical twin support vector machine model is generalized to obtain LS-TNPPC, a new model for imbalanced learning on tensor data. The new model not only enriches the data mining toolbox for imbalanced classification, but also shows that using the low-rank tensor idea to lift traditional vector methods to tensors is effective.
     4. Kernel methods for tensor learning and a multi-label kernel support tensor machine
     This thesis discusses in detail the principles that kernel methods should follow when applied to tensor data and, exploiting the characteristics of tensor data, gives a kernel construction applicable to tensors. With this kernel method, an optimization model is built for the multi-label classification problem in image scene classification and achieves some success on practical problems.
In traditional machine learning research, most classical learning algorithms are based on the vector space model, yet many objects, especially in computer vision, are naturally represented by tensors. In previous research, tensors were usually flattened into vectors, destroying the data structure and discarding a great deal of useful structural information, such as spatial and temporal relationships. Recently, the advantages of tensorial algorithms have attracted significant interest from the research community. Compared with vector representation, tensor representation helps overcome the overfitting problem in vector-based learning, and tensor learning algorithms are especially suited to small-sample-size problems. Tensor representation and tensor learning have therefore become a new research hotspot.
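The parameter-count argument behind the small-sample advantage can be made concrete. A minimal sketch (image size and rank are hypothetical, chosen only for illustration): for an m-by-n image, a vector classifier learns one weight per pixel, while rank-one or rank-r tensor weights need only m+n or r(m+n) parameters.

```python
# Free weight parameters for a 32x32 image (hypothetical size) under
# three representations.
m, n, r = 32, 32, 3

vector_params = m * n          # flatten to R^{mn}: one weight per pixel
rank_one_params = m + n        # rank-one tensor weight w = u (x) v
low_rank_params = r * (m + n)  # rank-r weight: sum of r outer products

assert (vector_params, rank_one_params, low_rank_params) == (1024, 64, 192)
# The far smaller parameter counts of the tensor models are what make
# them resistant to overfitting on small samples.
```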
     In this thesis, tensor learning methods are studied from the viewpoint of optimization, with emphasis on establishing new models and algorithms and on their applications. The support vector machine (SVM) is a powerful tool for data mining and pattern recognition; here, SVM algorithms are extended to deal with tensors, and new tensor models and algorithms are presented. The main achievements are as follows:
     1. A new tensor learning framework, the low-rank support tensor machine, is presented:
     Based on statistical learning theory, the limitations of the classical support vector machine (SVM) and support tensor machine (STM) are discussed. The rank-one restriction on the weight tensor is removed and a novel low-rank tensor projection is introduced, yielding a new tensor learning framework, the low-rank support tensor machine (LR-STM).
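The low-rank relaxation can be illustrated with a minimal sketch (hypothetical sizes, not the thesis's exact formulation): the rank-one STM weight W = u vᵀ becomes a rank-r sum of outer products, and classification still uses the Frobenius inner product ⟨W, X⟩ + b.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 8, 8, 2                 # hypothetical matrix-data sizes

# A rank-one STM weight would be W = u v^T; LR-STM relaxes it to rank r:
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
W = U @ V.T                       # W = sum_k u_k v_k^T, so rank(W) <= r
b = 0.1

def decision(X):
    # Frobenius inner product <W, X> plus bias: the decision value
    return np.sum(W * X) + b

X = rng.standard_normal((m, n))
assert np.linalg.matrix_rank(W) <= r
```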
     2. Two novel tensorial algorithms are designed to solve the proposed LR-STM:
     The tensor gradient descent algorithm computes a descent direction for LR-STM through smoothing operations. It avoids the alternating process of traditional tensor algorithms and reaches the optimal solution directly and quickly.
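The "one gradient over the whole weight, no alternating" idea can be sketched as follows. This toy version substitutes a regularized least-squares loss and an unconstrained matrix weight (the thesis's own smoothed objective and low-rank structure are omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, N = 6, 6, 40
Xs = rng.standard_normal((N, m, n))    # toy matrix-valued samples
ys = np.sign(rng.standard_normal(N))   # toy +/-1 labels

W = np.zeros((m, n))                   # full matrix weight, no factors
lr, lam = 0.01, 0.1
for _ in range(200):
    # one gradient step over the whole weight W per iteration,
    # instead of alternating over factor matrices
    scores = np.einsum('nij,ij->n', Xs, W)            # <W, X_i> for all i
    grad = np.einsum('n,nij->ij', scores - ys, Xs) / N + lam * W
    W -= lr * grad

loss = np.mean((np.einsum('nij,ij->n', Xs, W) - ys) ** 2) / 2 \
       + lam / 2 * np.sum(W * W)
assert loss < 0.5   # the objective at W = 0 is 0.5; descent improved it
```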
     The tensor two-step algorithm splits the primal problem of LR-STM into two sub-problems. By skillfully combining their solutions, it finds an approximate solution of the LR-STM model.
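The two sub-problems themselves are the thesis's; purely as a generic illustration of "solve two easier problems in sequence", the sketch below first solves an unconstrained ridge problem and then projects the result onto rank-r matrices by truncated SVD (a fit-then-truncate stand-in, not the actual algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, N, r = 6, 6, 80, 2
Xs = rng.standard_normal((N, m, n))
ys = np.sign(rng.standard_normal(N))

# Step 1: solve an easier, unconstrained sub-problem -- ridge regression
# on the vectorized data -- to get a full-rank weight matrix.
A = Xs.reshape(N, -1)
w = np.linalg.solve(A.T @ A + 0.1 * np.eye(m * n), A.T @ ys)
W_full = w.reshape(m, n)

# Step 2: enforce the low-rank structure approximately by projecting
# W_full onto the rank-r matrices via truncated SVD.
U, s, Vt = np.linalg.svd(W_full)
W_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
assert np.linalg.matrix_rank(W_r) <= r   # an approximate, not exact, solution
```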
     3. The LS-TNPPC algorithm is presented for imbalanced tensor data classification:
     Based on the idea of low-rank projection, the classical twin support vector machine (Twin-SVM) is extended to imbalanced classification of tensor data, giving the novel LS-TNPPC algorithm. The new method achieves better prediction accuracy on standard test data, which shows that the low-rank projection idea is an effective way to extend traditional vector algorithms to tensor data.
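For background, the nonparallel-plane idea that Twin-SVM-style methods build on can be sketched on toy 2-D vector data (hypothetical data and a GEPSVM-style least-squares formulation, not the LS-TNPPC model itself): each class gets its own proximal plane, and a point is assigned to the class whose plane is relatively nearer, which handles imbalance better than one shared separating plane.

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy imbalanced 2-D data: 50 majority points vs 5 minority points.
A = rng.standard_normal((50, 2)) + np.array([2.0, 0.0])
B = rng.standard_normal((5, 2)) + np.array([-2.0, 0.0])

def proximal_plane(C, D, reg=0.1):
    # Plane w.x + b = 0 that points of C lie near and points of D lie far
    # from: minimize ||[C 1]u||^2 / ||[D 1]u||^2 over u = (w, b), a
    # generalized Rayleigh quotient -> smallest eigenpair of H^{-1} G.
    Ca = np.hstack([C, np.ones((len(C), 1))])
    Da = np.hstack([D, np.ones((len(D), 1))])
    G = Ca.T @ Ca + reg * np.eye(3)
    H = Da.T @ Da + reg * np.eye(3)
    vals, vecs = np.linalg.eig(np.linalg.solve(H, G))
    return np.real(vecs[:, np.argmin(np.real(vals))])

u_maj = proximal_plane(A, B)    # plane hugging the majority class
u_min = proximal_plane(B, A)    # plane hugging the minority class

def predict(x):
    xa = np.append(x, 1.0)
    d_maj = abs(xa @ u_maj) / np.linalg.norm(u_maj[:2])
    d_min = abs(xa @ u_min) / np.linalg.norm(u_min[:2])
    return 1 if d_maj < d_min else -1   # nearer plane wins
```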
     4. A tensor kernel method and a multi-label kernel support tensor machine:
     Based on kernel theory, the application of kernel learning to tensor learning is discussed and a tensor kernel method is presented. With this kernel, a novel multi-label kernel support tensor machine is constructed. Experiments on real applications demonstrate the efficiency and effectiveness of the method.
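The thesis's kernel construction is not reproduced here; as one example of a kernel that consumes matrix structure directly (an assumption for illustration, not necessarily the thesis's construction), the sketch below applies a Gaussian kernel to the projection distance between leading left singular subspaces, which is positive semidefinite because it is an RBF kernel on a Euclidean embedding of projection matrices:

```python
import numpy as np

def grassmann_rbf(X, Y, r=2, sigma=1.0):
    # Kernel on matrix data built from the subspaces spanned by the top-r
    # left singular vectors, so the 2-D structure is used directly rather
    # than flattened away.
    Ux = np.linalg.svd(X)[0][:, :r]
    Uy = np.linalg.svd(Y)[0][:, :r]
    d2 = np.sum((Ux @ Ux.T - Uy @ Uy.T) ** 2)   # projection distance
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(4)
mats = [rng.standard_normal((6, 5)) for _ in range(4)]
K = np.array([[grassmann_rbf(P, Q) for Q in mats] for P in mats])
assert np.allclose(K, K.T)                    # symmetric
assert np.all(np.linalg.eigvalsh(K) > -1e-9)  # positive semidefinite
```

A kernel matrix of this kind can be handed to any kernel classifier (e.g. a standard SVM with a precomputed kernel) without ever vectorizing the matrix inputs.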
