Denoising autoencoder based extreme learning machine
  • Original title (Chinese): 基于去噪自编码器的极限学习机
  • Authors: LAI Jie (来杰); WANG Xiaodan (王晓丹); LI Rui (李睿); ZHAO Zhenchong (赵振冲)
  • Affiliation: College of Air and Missile Defense, Air Force Engineering University (空军工程大学防空反导学院)
  • Keywords: Extreme Learning Machine (ELM); deep learning; Denoising AutoEncoder (DAE); feature extraction; feature reduction; robustness
  • Journal: Journal of Computer Applications (计算机应用); journal code: JSJY
  • Online publication date: 2019-01-09
  • Year: 2019
  • Volume/Issue: Vol. 39, Issue 06 (No. 346 overall)
  • Pages: 69-75 (7 pages)
  • CN: 51-1307/TP
  • Record ID: JSJY201906012
  • Funding: National Natural Science Foundation of China (61876189, 61806219)
  • Language: Chinese
Abstract
To address the problems that random parameter assignment reduces the robustness of the Extreme Learning Machine (ELM) algorithm and that its performance is significantly affected by noise, the Denoising AutoEncoder (DAE) was combined with ELM and a DAE-based ELM (DAE-ELM) algorithm was proposed. Firstly, a denoising autoencoder was used to generate the input data, input weights and hidden-layer parameters of ELM. Then, the hidden-layer output weights were solved by ELM to complete the training of the classifier. On the one hand, the algorithm inherits the advantages of DAE: the automatically extracted features are more representative and robust, and noise is strongly suppressed. On the other hand, the randomness of ELM parameter assignment is overcome, which enhances the robustness of the algorithm. Experimental results show that, in the noise-free case, compared with ELM, Principal Component Analysis ELM (PCA-ELM) and SAA-2, the classification error rate of DAE-ELM decreases by at least 5.6% on MNIST, by at least 3.0% on Fashion-MNIST, by at least 2.0% on Rectangles and by at least 12.7% on Convex.
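The abstract describes a two-step training procedure: a denoising autoencoder first learns the input weights and hidden-layer parameters that standard ELM would otherwise assign at random, and ELM then computes the hidden-layer output weights in closed form. Below is a minimal single-hidden-layer sketch of this idea; it assumes sigmoid activations, masking noise, a tied-weight DAE trained by plain gradient descent, and a ridge-regularised least-squares solution. All names and hyperparameters (corruption_rate, n_hidden, C, train_dae_elm) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the DAE-ELM training idea (assumptions noted above). NumPy only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_dae(X, n_hidden, corruption_rate=0.3, lr=0.1, epochs=50, seed=0):
    """Train a tied-weight denoising autoencoder; return the encoder weights
    and bias (W, b) that replace ELM's randomly assigned hidden-layer parameters."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_features, n_hidden))
    b = np.zeros(n_hidden)          # encoder bias
    c = np.zeros(n_features)        # decoder bias
    for _ in range(epochs):
        mask = rng.random(X.shape) > corruption_rate    # masking noise
        X_tilde = X * mask                              # corrupted input
        H = sigmoid(X_tilde @ W + b)                    # encode
        X_hat = sigmoid(H @ W.T + c)                    # decode (tied weights)
        # Backpropagate the squared reconstruction error 0.5 * ||X_hat - X||^2.
        d_out = (X_hat - X) * X_hat * (1.0 - X_hat)     # (N, n_features)
        d_hid = (d_out @ W) * H * (1.0 - H)             # (N, n_hidden)
        grad_W = (d_out.T @ H + X_tilde.T @ d_hid) / len(X)
        W -= lr * grad_W
        b -= lr * d_hid.mean(axis=0)
        c -= lr * d_out.mean(axis=0)
    return W, b

def train_dae_elm(X, T, n_hidden=200, C=1e3):
    """DAE-ELM sketch: the DAE supplies the hidden-layer parameters; ELM then
    solves the output weights beta in closed form (ridge-regularised least squares)."""
    W, b = train_dae(X, n_hidden)
    H = sigmoid(X @ W + b)                              # hidden-layer output matrix
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def predict(X, W, b, beta):
    """Class scores; with one-hot targets T, take argmax over the last axis."""
    return sigmoid(X @ W + b) @ beta
```

With inputs scaled to [0, 1] and one-hot targets T, predicted labels would be np.argmax(predict(X_test, W, b, beta), axis=1). Only the DAE pre-training is iterative; the ELM step keeps its fast closed-form solution.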
References
[1]HUANG G B,ZHU Q Y,SIEW C K.Extreme learning machine:theory and applications[J].Neurocomputing,2006,70(1/2/3):489-501.
    [2]HUANG G B,CHEN L,SIEW C K.Universal approximation using incremental constructive feedforward networks with random hidden nodes[J].IEEE Transactions on Neural Networks,2006,17(4):879-892.
    [3]HUANG G B,ZHOU H M,DING X J,et al.Extreme learning machine for regression and multiclass classification[J].IEEE Transactions on Systems,Man,and Cybernetics,Part B:Cybernetics,2012,42(2):513-529.
    [4]ZONG W W,HUANG G B,CHEN Y Q.Weighted extreme learning machine for imbalance learning[J].Neurocomputing,2013,101:229-242.
    [5]LIANG N Y,HUANG G B,SARATCHANDRAN P,et al.A fast and accurate online sequential learning algorithm for feedforward networks[J].IEEE Transactions on Neural Networks,2006,17(6):1411-1423.
    [6]LAN Y,HU Z J,SOH Y C,et al.An extreme learning machine approach for speaker recognition[J].Neural Computing and Applications,2013,22(3/4):417-425.
    [7]WANG G H,LI S M,ZHU D,et al.Application of extreme learning machine in objective stereoscopic image quality assessment[J].Journal of Optoelectronics·Laser,2014,25(9):1837-1842.
    [8]XU Y,DAI Y Y,DONG Z Y,et al.Extreme learning machine-based predictor for real-time frequency stability assessment of electric power systems[J].Neural Computing and Applications,2013,22(3/4):501-508.
    [9]HORATA P,CHIEWCHANWATTANA S,SUNAT K.Robust extreme learning machine[J].Neurocomputing,2013,102:31-44.
    [10]RONG H J,ONG Y S,TAN A H,et al.A fast pruned-extreme learning machine for classification problem[J].Neurocomputing,2008,72(1/2/3):359-366.
    [11]KASUN L L C,ZHOU H,HUANG G B.Representational learning with ELMs for big data[J].IEEE Intelligent Systems,2013,28(6):31-34.
    [12]TANG J X,DENG C W,HUANG G B.Extreme learning machine for multilayer perceptron[J].IEEE Transactions on Neural Networks and Learning Systems,2016,27(4):809-821.
    [13]ZHU W T,MIAO J,QING L Y,et al.Hierarchical extreme learning machine for unsupervised representation learning[C]//Proceedings of the 2015 International Joint Conference on Neural Networks.Piscataway,NJ:IEEE,2015:1-8.
    [14]YANG Y M,WU Q M J.Multilayer extreme learning machine with subnetwork nodes for representation learning[J].IEEE Transactions on Cybernetics,2016,46(11):2570-2583.
    [15]VINCENT P,LAROCHELLE H,BENGIO Y,et al.Extracting and composing robust features with denoising autoencoders[C]//ICML 2008:Proceedings of the 25th International Conference on Machine Learning.New York:ACM,2008:1096-1103.
    [16]HUANG G,HUANG G B,SONG S J,et al.Trends in extreme learning machines:a review[J].Neural Networks,2015,61:32-48.
    [17]GUO X D,LI X M,JING R X,et al.Intrusion detection based on improved sparse denoising autoencoder[J].Journal of Computer Applications,2019,39(3):769-773.
    [18]LECUN Y,BOTTOU L,BENGIO Y,et al.Gradient-based learning applied to document recognition[J].Proceedings of the IEEE,1998,86(11):2278-2324.
    [19]XIAO H,RASUL K,VOLLGRAF R.Fashion-MNIST:a novel image dataset for benchmarking machine learning algorithms[EB/OL].[2018-09-15].https://arxiv.org/pdf/1708.07747.pdf.
    [20]ERHAN D.Rectangles Data[DB/OL].[2018-09-15].http://www.iro.umontreal.ca/~lisa/twiki/bin/view.cgi/Public/RectanglesData.
    [21]ERHAN D.Recognition of convex sets[DB/OL].[2018-09-15].http://www.iro.umontreal.ca/~lisa/twiki/bin/view.cgi/Public/ConvexNonConvex.
    [22]XIAO D,WANG J C,PAN X L,et al.Modeling and control of guide-disk speed of rotary piercer[J].Control Theory & Applications,2010,27(1):19-24.
    [23]MA M M.Research on extreme learning machine algorithm based on deep learning[D].Qingdao:Ocean University of China,2015:28-30.
