Stacked graph embedded extreme learning machine algorithm
  • English title: Stacked graph embedded extreme learning machine algorithm
  • Authors: 孙玮婷; 葛宏伟; 姚瑶; 孙亮
  • English authors: SUN Wei-ting; GE Hong-wei; YAO Yao; SUN Liang (College of Computer Science and Technology, Dalian University of Technology; Key Laboratory of Symbol Computation and Knowledge Engineering, Ministry of Education, Jilin University)
  • Keywords: artificial intelligence; extreme learning machine; graph embedding; stacked autoencoder; deep neural network
  • English keywords: artificial intelligence; extreme learning machine; graph embedding; stacked autoencoder; deep neural network
  • Chinese journal code: JLGY
  • English journal title: Journal of Jilin University (Engineering and Technology Edition)
  • Affiliations: College of Computer Science and Technology, Dalian University of Technology; Key Laboratory of Symbol Computation and Knowledge Engineering, Ministry of Education, Jilin University
  • Online publication date: 2018-03-26
  • Publisher: 吉林大学学报(工学版) (Journal of Jilin University, Engineering and Technology Edition)
  • Year: 2019
  • Issue: Vol. 49, No. 201
  • Funding: National Natural Science Foundation of China (61572104, 61402076); project of the Key Laboratory of Symbol Computation and Knowledge Engineering, Ministry of Education (93K172017K03)
  • Language: Chinese
  • Record No.: JLGY201901028
  • Page count: 12
  • Issue No.: 01
  • CN: 22-1341/T
  • Pages: 235-246
Abstract
The extreme learning machine (ELM) has been widely used to train single-hidden-layer feed-forward neural networks because of its few training parameters, fast learning speed, and strong generalization ability. This paper first combines the graph embedding framework to propose a new extreme learning machine autoencoder (GEELM-AE), which mines both the local near-neighbor structure and the global structure of the data in the ELM space. In GEELM-AE, the intrinsic graph and the penalty graph of the graph embedding framework are constructed by local Fisher discriminant analysis. Furthermore, a stacked graph embedded extreme learning machine (SGE-ELM) algorithm under a deep framework is proposed by stacking several GEELM-AEs. Experimental results on several benchmark data sets show that, compared with existing algorithms, the proposed algorithm achieves higher accuracy with a fast training speed. This verifies that the proposed graph embedded ELM autoencoder provides an effective feature representation of the raw data, and that the stacked multi-layer graph embedded ELM obtains effective high-level abstract representations of the data.
        Extreme Learning Machine (ELM) is characterized by few training parameters, fast training speed, and strong generalization ability, and has been extensively applied to train single-hidden-layer feed-forward neural networks. To exploit both the local near-neighbor structure and the global structure information in the ELM space, a Graph Embedded Extreme Learning Machine Autoencoder (GEELM-AE) is proposed. In GEELM-AE, an intrinsic graph and a penalty graph for graph embedding are constructed by local Fisher discriminant analysis. Further, a Stacked Graph Embedded ELM (SGE-ELM) is built by stacking several GEELM-AEs. Experimental results on several benchmarks indicate that SGE-ELM obtains higher accuracy and faster training speed than the compared algorithms. This validates that GEELM-AE yields an effective feature representation of the original data, and that SGE-ELM obtains efficient high-level abstract representations.
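The abstract describes the GEELM-AE/SGE-ELM construction only at a high level. The sketch below is one plausible reading of it in Python: a random ELM hidden mapping, a graph-regularized ridge solution for the autoencoder output weights, and a stack of such layers. The graph construction in lfda_graphs, the combined Laplacian L = L_intrinsic - mu * L_penalty, and all hyperparameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch, assuming a graph-regularized ELM autoencoder of the kind
# described in the abstract; names and weighting schemes are hypothetical.

def lfda_graphs(X, y, k=7, sigma=1.0):
    """Build intrinsic (local, same-class) and penalty (different-class)
    affinity matrices in the spirit of local Fisher discriminant analysis."""
    n = X.shape[0]
    d2 = np.square(X[:, None, :] - X[None, :, :]).sum(-1)   # pairwise squared distances
    aff = np.exp(-d2 / (2.0 * sigma ** 2))
    same = (y[:, None] == y[None, :]).astype(float)
    knn = np.zeros((n, n))
    nearest = np.argsort(d2, axis=1)[:, 1:k + 1]             # skip self at index 0
    for i in range(n):
        knn[i, nearest[i]] = 1.0
    knn = np.maximum(knn, knn.T)                              # symmetrize the kNN graph
    W_intr = aff * same * knn                                 # intrinsic graph
    W_pen = aff * (1.0 - same)                                # penalty graph
    return W_intr, W_pen

def laplacian(W):
    return np.diag(W.sum(axis=1)) - W

def geelm_ae_layer(X, y, n_hidden=200, lam=1e-3, gamma=1e-2, mu=1e-2, seed=0):
    """One GEELM-AE layer (sketch): random ELM hidden mapping, then ridge
    reconstruction of X with a graph-embedding regularizer."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.uniform(-1.0, 1.0, (d, n_hidden))
    b = rng.uniform(-1.0, 1.0, n_hidden)
    H = np.tanh(X @ W + b)                          # random ELM feature space
    W_intr, W_pen = lfda_graphs(X, y)
    L = laplacian(W_intr) - mu * laplacian(W_pen)   # favor intrinsic, penalize penalty graph
    A = H.T @ H + gamma * H.T @ L @ H + lam * np.eye(n_hidden)
    beta = np.linalg.solve(A, H.T @ X)              # output weights, shape (n_hidden, d)
    X_next = np.tanh(X @ beta.T)                    # features passed to the next layer
    return beta, X_next

def sge_elm(X, y, hidden_sizes=(200, 200)):
    """Stack several GEELM-AE layers to obtain a deep representation (SGE-ELM sketch)."""
    feats, betas = X, []
    for i, h in enumerate(hidden_sizes):
        beta, feats = geelm_ae_layer(feats, y, n_hidden=h, seed=i)
        betas.append(beta)
    return betas, feats
```

In this reading, each layer's output weights beta are reused (transposed) to map the input to the next layer's features, as in multilayer ELM autoencoders; the final stacked features could then be fed to an ordinary ELM or ridge classifier for the supervised task.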
