Research on ACF Pedestrian Detection Based on Improved Hash Code and Auxiliary CNN
  • Original title (Chinese): 基于加权Hash特征与卷积辅助网络的ACF行人检测研究
  • Authors: WANG Wei-wei (王薇薇); WANG Jiang-tao (王江涛); CHEN Yan (陈燕)
  • Affiliation: School of Physical and Electronic Information, Huaibei Normal University (淮北师范大学物理与电子信息学院)
  • Keywords: pedestrian detection; ACF; AdaBoost; Hash code; CNN
  • Journal: 长春师范大学学报 (Journal of Changchun Normal University)
  • Publication date: 2019-04-20
  • Year: 2019
  • Volume/Issue: v.38; No.353 (Issue 04)
  • Funding: National Natural Science Foundation of China project "Research on Cooperative Target Tracking in Public Environments Based on Auxiliary Regions" (61203272); Anhui Provincial Major Natural Science Research Project for Universities "Research on Pedestrian Detection via Cooperative Deep and Shallow Learning for Autonomous Driving" (KJ2018ZD038); Anhui Provincial Quality Engineering Project for Higher Education "High-Quality Open Course on Advanced Programming Language" (2017kfk043)
  • Language: Chinese
  • Record ID: CCSS201904007
  • Pages: 38-44 (7 pages)
  • CN: 22-1409/G4
Abstract
When the traditional ACF+AdaBoost pedestrian detection framework reaches a satisfactory detection rate, its false detection rate also rises rapidly, which makes it difficult to meet practical requirements. To address this problem, we propose an adaptively weighted Hash code feature that enriches the diversity of pedestrian features. On this basis, an auxiliary network is cascaded onto the detector to reduce the false detection rate of the system; the auxiliary network adopts a shallow CNN structure and re-classifies the outputs of the AdaBoost classifier while preserving the real-time performance of the system. Experiments on the INRIA dataset show that the improved Hash code is simple to compute and characterizes pedestrians well: without affecting real-time performance, it lowers the system's MR-FPPI (miss rate against false positives per image) from 17.05% to 16.31%. Cascading the auxiliary CNN alone reduces the MR-FPPI to 16.93%, and combining the Hash code channel with the cascaded auxiliary CNN reduces it to 15.96%, a clear improvement in detection performance over the traditional ACF+AdaBoost system.
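The abstract describes two additions to the standard ACF+AdaBoost pipeline: an adaptively weighted Hash code feature channel and a cascaded shallow CNN that re-scores AdaBoost detections. The paper's exact constructions are not given here, so the following Python sketch is only a rough illustration under stated assumptions (an average-hash style binary code, hypothetical learned per-bit weights, and a hypothetical two-convolution-layer network named AuxiliaryCNN); it is not the authors' implementation.

```python
# Illustrative sketch, not the paper's code: a Hash-code feature channel and a
# shallow CNN that prunes false positives from AdaBoost detections.
import cv2
import numpy as np
import torch
import torch.nn as nn


def hash_code_channel(gray, cell=8):
    """Average-hash style binary map: within each cell, a pixel is 1 if it is
    above the cell mean. The result can be appended to the ACF channels
    (assumption: the paper may build its Hash channel differently)."""
    h, w = gray.shape
    out = np.zeros_like(gray, dtype=np.float32)
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            block = gray[y:y + cell, x:x + cell]
            out[y:y + cell, x:x + cell] = (block > block.mean()).astype(np.float32)
    return out


def weighted_hash_feature(gray_window, weights):
    """64-bit average hash of a grayscale candidate window, scaled by adaptive
    per-bit weights (assumed to be learned from bit reliability on training
    data; the abstract only says the weighting is adaptive)."""
    small = cv2.resize(gray_window, (8, 8), interpolation=cv2.INTER_AREA)
    bits = (small > small.mean()).astype(np.float32).ravel()  # 64 bits
    return bits * weights                                     # weighted code


class AuxiliaryCNN(nn.Module):
    """Shallow CNN that re-classifies windows accepted by the AdaBoost stage,
    kept small so the cascade stays close to real time."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 8, 2)  # pedestrian / background

    def forward(self, x):            # x: (N, 3, 64, 32) resized detections
        x = self.features(x)
        return self.classifier(x.flatten(1))


def prune_detections(detections, frame, net, thr=0.5):
    """Keep only AdaBoost detections (x, y, w, h, score) that the auxiliary
    CNN also accepts as pedestrians."""
    kept = []
    for (x, y, w, h, score) in detections:
        crop = cv2.resize(frame[y:y + h, x:x + w], (32, 64))
        t = torch.from_numpy(crop).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        prob = torch.softmax(net(t), dim=1)[0, 1].item()
        if prob >= thr:
            kept.append((x, y, w, h, score))
    return kept
```

In this sketch the CNN only sees windows that already passed the AdaBoost stage, which is how a cheap second classifier can cut false positives without slowing down the full-image sliding-window search.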
