Visual odometry based on modified optical flow and texture weights
  • English title: Visual odometry based on modified optical flow and texture weights
  • Authors: 吴荻; 战凯; 肖小凤
  • English authors: WU Di; ZHAN Kai; XIAO Xiao-feng
  • Keywords: direct visual odometry; LK optical flow; Census transform; feature extraction; different texture weights
  • English keywords: direct visual odometry; LK optical flow; Census transform; feature detector; different texture weights
  • Affiliations: School of Mechanical Engineering, University of Science and Technology Beijing (北京科技大学机械学院); Beijing General Research Institute of Mining and Metallurgy (北京矿冶研究总院)
  • Journal: 计算机工程与设计 (Computer Engineering and Design); CNKI journal code: SJSJ
  • Publication date: 2019-01-16
  • Year: 2019
  • Volume/Issue: Vol. 40, No. 385 (issue 01)
  • Fund: National High-Tech Research and Development Program of China (863 Program) project (2011AA060408)
  • Language: Chinese
  • Pages: 238-243 (6 pages)
  • CN: 11-1775/TP
  • CNKI record ID: SJSJ201901037
Abstract
A direct visual odometry based on an improved Census transform and different texture regions is proposed. To reduce the influence of pixel intensity errors on direct visual odometry, the improved Census transform is used when computing LK optical flow: comparisons between the pixels surrounding the neighborhood and the mean intensity of the neighborhood are added to the traditional Census transform, and a new arrangement is proposed for the order of elements when converting the binary string to its decimal representation. For feature points lying in regions of different texture, different weights are assigned during camera pose estimation according to the texture complexity of each region. Using the TUM and KITTI datasets as test platforms, the optical flow tracking results of three detectors (ORB, FAST and BRISK) are compared, and the correctness and effectiveness of the improved visual odometry are verified.
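The abstract only summarizes the two ideas. Below is a minimal Python/NumPy sketch of how they could look, under assumptions not stated in the record: a 3x3 Census window, local intensity variance standing in for the paper's texture-complexity measure, and illustrative thresholds; the function names modified_census and texture_weight are hypothetical, not the authors' code.

    # Minimal sketch (not the authors' implementation) of the two ideas in the abstract:
    # an extended Census signature and a texture-dependent weight per feature point.
    import numpy as np

    def modified_census(patch):
        """Binary signature of a 3x3 patch (assumed window size).

        Classic Census compares each neighbor with the center pixel; the modified
        variant described in the abstract additionally compares each neighbor with
        the mean intensity of the neighborhood.
        """
        center = patch[1, 1]
        neighbors = np.delete(patch.ravel(), 4)               # 8 surrounding pixels
        mean = neighbors.mean()
        bits_center = (neighbors > center).astype(np.uint8)   # classic Census bits
        bits_mean = (neighbors > mean).astype(np.uint8)       # added mean comparisons
        return np.concatenate([bits_center, bits_mean])       # 16-bit signature

    def texture_weight(patch, low=0.5, high=1.0, var_threshold=10.0):
        """Heuristic per-point weight: richer texture -> larger weight.

        The paper assigns different weights to points in different texture regions;
        here local intensity standard deviation stands in for "texture complexity"
        (an assumption, threshold and weight values are illustrative only).
        """
        return high if patch.std() > var_threshold else low

    # Usage on a synthetic image: the Hamming distance between such signatures could
    # replace the raw photometric difference in the LK tracking cost, and the weights
    # could scale each point's residual in the pose optimization.
    img = np.random.randint(0, 256, (120, 160)).astype(np.float32)
    y, x = 60, 80
    patch = img[y - 1:y + 2, x - 1:x + 2]
    print(modified_census(patch), texture_weight(patch))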