Design of Convolutional Neural Network Based on Dual-Network Cascade
  • English Title: Design of Convolutional Neural Network Based on Dual-Network Cascade
  • Authors: PAN Bing (潘兵); ZENG Shang-you (曾上游); YANG Yuan-fei (杨远飞); ZHOU Yue (周悦); FENG Yan-yan (冯燕燕)
  • Keywords: image recognition; Convolutional Neural Network (CNN); network cascade; feature map
  • Journal Code: DGKQ
  • English Journal Title: Electronics Optics & Control
  • Affiliation: College of Electronic Engineering, Guangxi Normal University
  • Publication Date: 2018-11-19 15:46
  • Published in: 电光与控制 (Electronics Optics & Control)
  • Year: 2019
  • Issue: v.26; No.248
  • Fund: National Natural Science Foundation of China (11465004)
  • Language: Chinese
  • Record ID: DGKQ201902013
  • Number of Pages: 5
  • Issue No.: 02
  • CN: 41-1227/TN
  • Pages: 61-65
Abstract
A traditional Convolutional Neural Network (CNN) usually adopts a single network structure for feature extraction. However, the features extracted by a single structure are not sufficiently rich, which limits image-classification accuracy. To address this problem, two networks are used to extract features simultaneously and are then cascaded, yielding fused features that are more discriminative. The dual-network cascade extracts features through two branches: one branch is a traditional CNN, and the other is a traditional CNN with an added residual operation. Before the next dimensionality reduction of the feature maps, the two branches are combined by a cascade (concatenation) operation. The networks are tested on the 101_food and Caltech-256 data sets, the cascaded network is compared with the two individual branch networks, and the experiments show favorable results.
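
The following is a minimal, illustrative sketch of one stage of the dual-branch cascade described in the abstract. It is not taken from the paper: it assumes PyTorch, and the module, parameter, and channel names are hypothetical. A plain convolutional branch and a residual branch process the same input, their feature maps are concatenated channel-wise (the "cascade"), and the fused feature map is then downsampled.

    import torch
    import torch.nn as nn

    class DualBranchBlock(nn.Module):
        """One stage of a dual-network cascade: a plain CNN branch and a
        residual branch run in parallel, and their feature maps are
        concatenated before the next downsampling step."""

        def __init__(self, in_channels, branch_channels):
            super().__init__()
            # Branch 1: traditional CNN path (conv -> BN -> ReLU).
            self.plain = nn.Sequential(
                nn.Conv2d(in_channels, branch_channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(branch_channels),
                nn.ReLU(inplace=True),
            )
            # Branch 2: the same convolutional path plus a residual shortcut.
            self.residual = nn.Sequential(
                nn.Conv2d(in_channels, branch_channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(branch_channels),
            )
            # 1x1 projection so the shortcut matches the branch width.
            self.shortcut = nn.Conv2d(in_channels, branch_channels, kernel_size=1)
            self.relu = nn.ReLU(inplace=True)
            # Downsampling applied only after the two branches are fused.
            self.pool = nn.MaxPool2d(kernel_size=2, stride=2)

        def forward(self, x):
            plain_out = self.plain(x)
            res_out = self.relu(self.residual(x) + self.shortcut(x))
            # Cascade (channel-wise concatenation) of the two branches,
            # then reduce the spatial size of the fused feature map.
            fused = torch.cat([plain_out, res_out], dim=1)
            return self.pool(fused)

    if __name__ == "__main__":
        block = DualBranchBlock(in_channels=3, branch_channels=32)
        out = block(torch.randn(1, 3, 224, 224))
        print(out.shape)  # torch.Size([1, 64, 112, 112])

Stacking several such stages and placing a classifier head on the final fused feature map would approximate the overall structure the abstract describes; the layer widths, depths, and training settings actually used in the paper are not specified here.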
