Research on Time Series Classification Using CNN and Bidirectional GRU
  • Authors: ZHANG Guohao; LIU Bo
  • Affiliation: College of Information Science and Technology, Jinan University
  • Keywords: time series classification; deep learning; convolutional neural network; recurrent neural network; bidirectional gated recurrent unit
  • Journal: Journal of Frontiers of Computer Science and Technology (计算机科学与探索; CNKI code KXTS)
  • Publication date: 2019-04-19
  • Year: 2019
  • Volume: v.13; No.129
  • Funding: National Natural Science Foundation of China (No. U1431227); Guangzhou Science and Technology Plan (No. 201604010037)
  • Language: Chinese
  • Article ID: KXTS201906003
  • Issue: 06
  • Pages: 21-32 (12 pages)
  • CN: 11-5602/TP
Abstract
Time series data are non-discrete, temporally correlated, and high-dimensional in feature space. Most current classification methods require complex data processing or feature engineering, and they do not take into account features at different time scales or the temporal dependencies between sequence values. By combining a convolutional neural network with the bidirectional gated recurrent unit of a recurrent neural network, this paper proposes a new end-to-end deep learning model, BiGRU-FCN, for classifying univariate time series. The model needs no complex preprocessing of the data and obtains several kinds of feature information through different network operations: the convolutional network extracts spatial features from the temporal information, while the bidirectional recurrent network captures forward and backward temporal dependencies along the sequence. The model is evaluated on a large number of benchmark datasets, and the experimental results show that it achieves higher accuracy than a variety of existing methods and has good classification performance.
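To make the architecture described in the abstract concrete, the following PyTorch code is a minimal sketch of a BiGRU-FCN-style classifier, assuming a parallel two-branch layout: a fully convolutional branch and a bidirectional GRU branch process the same univariate series, and their feature vectors are concatenated before a final linear classifier. All layer sizes, kernel widths, and the fusion scheme here are illustrative assumptions, not the authors' reported configuration.

```python
import torch
import torch.nn as nn

class BiGRUFCN(nn.Module):
    """Sketch of a BiGRU-FCN-style classifier for univariate time series.

    Two parallel branches process the input sequence:
      - an FCN branch (stacked 1D convolutions + batch norm + ReLU,
        followed by global average pooling) extracts local "spatial" features;
      - a bidirectional GRU branch summarizes forward and backward
        temporal dependencies.
    The two feature vectors are concatenated and fed to a linear classifier.
    Layer sizes below are illustrative assumptions, not the paper's settings.
    """

    def __init__(self, num_classes: int, gru_hidden: int = 64):
        super().__init__()
        # FCN branch: Conv1d expects input of shape (batch, channels, length).
        self.fcn = nn.Sequential(
            nn.Conv1d(1, 128, kernel_size=8, padding=4), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 256, kernel_size=5, padding=2), nn.BatchNorm1d(256), nn.ReLU(),
            nn.Conv1d(256, 128, kernel_size=3, padding=1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over the time axis
        )
        # BiGRU branch: GRU expects (batch, length, features) with batch_first=True.
        self.bigru = nn.GRU(input_size=1, hidden_size=gru_hidden,
                            batch_first=True, bidirectional=True)
        # Concatenated features: 128 (FCN) + 2 * gru_hidden (forward + backward states).
        self.classifier = nn.Linear(128 + 2 * gru_hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length) univariate series.
        conv_feat = self.fcn(x.unsqueeze(1)).squeeze(-1)       # (batch, 128)
        _, h_n = self.bigru(x.unsqueeze(-1))                   # h_n: (2, batch, gru_hidden)
        gru_feat = torch.cat([h_n[0], h_n[1]], dim=1)          # (batch, 2 * gru_hidden)
        return self.classifier(torch.cat([conv_feat, gru_feat], dim=1))  # class logits


# Shape check: a batch of 32 series of length 100, classified into 5 classes.
model = BiGRUFCN(num_classes=5)
logits = model(torch.randn(32, 100))
print(logits.shape)  # torch.Size([32, 5])
```

Global average pooling in the convolutional branch keeps the feature vector independent of the series length, which is one way such a single model can be applied across benchmark datasets with different sequence lengths; training would use a standard cross-entropy loss over the logits.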
