Visual Odometry Based on Optical Flow Tracking and RANSAC
  • Title: Visual Odometry Based on Optical Flow Tracking and RANSAC
  • Authors: ZHAO Wei-dong (赵卫东); CAO Meng (曹蒙); JIANG Chao (蒋超)
  • Affiliation: School of Electrical and Information Engineering, Anhui University of Technology
  • Keywords: stereo vision; optical flow method; RANSAC
  • Journal: Journal of Lanzhou Institute of Technology (兰州工业学院学报)
  • Database code: LZGD
  • Record ID: LZGD201903012
  • Publication date: 2019-06-15
  • Year: 2019
  • Issue: v.26, No.111 (Issue 03)
  • Pages: 66-71 (6 pages)
  • Language: Chinese
  • CN: 62-1209/Z
Abstract
To meet the indoor positioning accuracy requirements of service robots, a stereo visual odometry combining the optical flow method with RANSAC is designed. The RANSAC algorithm is used to reject mismatched point pairs, which greatly reduces the mismatches produced by optical flow tracking and thus preserves matching accuracy. The binocular depth-ranging principle and a camera motion model between consecutive frames are derived, and the motion model is solved with the PnP algorithm, yielding the pose change of the camera and hence of the robot. Experiments show that the designed algorithm reduces the translation error to 0.95% and the rotation error to 0.0045°/m, meeting the positioning accuracy requirements of indoor robots.
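The record contains no source code; the sketch below is a minimal Python/OpenCV illustration of the pipeline the abstract describes: LK optical-flow tracking, RANSAC rejection of mismatched pairs, binocular depth ranging (Z = f·b/d), and a PnP solve for the frame-to-frame pose. The calibration values (KITTI-like f, cx, cy and baseline b), all parameter settings, and the specific choice of a fundamental-matrix RANSAC fit are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import cv2

# Assumed calibration (placeholders, not from the paper): focal length f and
# principal point (cx, cy) in pixels, stereo baseline b in meters.
f, cx, cy, b = 718.856, 607.193, 185.216, 0.537  # KITTI-like values
K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]], dtype=np.float64)

def stereo_depth(disparity):
    """Binocular depth ranging: Z = f * b / d for disparity d > 0."""
    disparity = disparity.astype(np.float64)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = f * b / disparity[valid]
    return depth

def track_and_filter(prev_gray, cur_gray):
    """LK optical-flow tracking followed by RANSAC mismatch rejection."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                 qualityLevel=0.01, minDistance=10)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, p0, None)
    ok = status.ravel() == 1
    p0, p1 = p0[ok], p1[ok]
    # A RANSAC fundamental-matrix fit discards mismatched point pairs.
    _F, mask = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, 1.0, 0.99)
    inliers = mask.ravel().astype(bool)
    return p0[inliers].reshape(-1, 2), p1[inliers].reshape(-1, 2)

def pose_from_pnp(pts_prev, pts_cur, depth_prev):
    """Back-project previous-frame points to 3-D using the stereo depth,
    then solve PnP against their tracked 2-D positions in the new frame."""
    pts3d, pts2d = [], []
    for (u0, v0), p in zip(pts_prev, pts_cur):
        Z = depth_prev[int(v0), int(u0)]
        if np.isfinite(Z):
            # Pinhole back-projection: X = (u - cx) Z / f, Y = (v - cy) Z / f.
            pts3d.append([(u0 - cx) * Z / f, (v0 - cy) * Z / f, Z])
            pts2d.append(p)
    pts3d = np.asarray(pts3d, dtype=np.float64)
    pts2d = np.asarray(pts2d, dtype=np.float64)
    _ok, rvec, tvec, _inl = cv2.solvePnPRansac(pts3d, pts2d, K, None)
    R, _ = cv2.Rodrigues(rvec)   # frame-to-frame rotation matrix
    return R, tvec               # pose change of the camera (robot)
```

Rejecting flow tracks with RANSAC before the PnP solve removes gross mismatches early, and solvePnPRansac adds a second robust layer, consistent with the abstract's point that outlier rejection is what keeps the motion estimate accurate.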