Research on a vision-based localization model based on spatial plane constraints (基于空间平面约束的视觉定位模型研究)
  • English title: Vision-based localization model based on spatial plane constraints
  • Authors: 高飞; 葛一粟; 汪韬; 卢书芳; 张元鸣 (Gao Fei; Ge Yisu; Wang Tao; Lu Shufang; Zhang Yuanming)
  • Keywords: stereo vision; vision positioning; plane constraints; three-dimensional information acquisition
  • Affiliation: Zhejiang University of Technology (浙江工业大学)
  • Journal: 仪器仪表学报 (Chinese Journal of Scientific Instrument), journal code YQXB
  • Publication date: 2018-07-15
  • Year: 2018
  • Volume: v.39
  • Issue: 07
  • Funding: National Natural Science Foundation of China (61402410); Zhejiang Provincial Key Research and Development Program (2018C01064)
  • Language: Chinese
  • Pages: 186-193 (8 pages)
  • CN: 11-2179/TH
  • Article ID: YQXB201807022
Abstract
Existing vision-based localization methods suffer from inefficient stereo matching, among other limitations. To compute the pose of a planar moving target in real time, a vision-based localization model based on spatial plane constraints is proposed. First, the intrinsic and extrinsic camera parameters are obtained through binocular stereo calibration. Second, the equation of the spatial constraint plane is computed with a plane calibration method. Third, based on this plane equation, a method is given for computing the spatial coordinates of any point on a plane parallel to the target plane, which is then used to estimate the pose of the moving target. Finally, a simulated experiment of visually guiding a 1:49 scale container truck is carried out. Experimental results show that the standard deviation of the physical positions computed by the proposed model is slightly better than that of the binocular stereo model, while the model offers better real-time performance.
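The key geometric step above — recovering the 3D position of a point that is known to lie on a calibrated plane from a single pixel observation, and deriving the target's heading from two such points — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the intrinsic matrix `K`, the plane coefficients, and the example pixels are made-up values standing in for the calibration results.

```python
import numpy as np

def backproject_onto_plane(pixel, K, plane):
    """Intersect the camera ray through `pixel` with a known plane.

    pixel : (u, v) image coordinates
    K     : 3x3 camera intrinsic matrix
    plane : (a, b, c, d) with a*X + b*Y + c*Z + d = 0 in the camera frame
    Returns the 3D point (camera frame) where the viewing ray meets the plane.
    """
    u, v = pixel
    # Direction of the viewing ray in the camera frame (depth scaled so Z = 1).
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    n, d = np.asarray(plane[:3]), plane[3]
    # Ray: P = t * ray.  Plane: n . P + d = 0  =>  t = -d / (n . ray)
    t = -d / (n @ ray)
    return t * ray

# Hypothetical calibration values, for illustration only.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
plane = (0.0, 0.0, 1.0, -2.0)   # the plane Z = 2 m in front of the camera

# Two tracked points on the target, e.g. front and rear markers.
front = backproject_onto_plane((400, 240), K, plane)
rear = backproject_onto_plane((320, 240), K, plane)
# Heading (yaw) of the target within the constraint plane.
yaw = np.degrees(np.arctan2(front[1] - rear[1], front[0] - rear[0]))
```

Because the plane equation removes the depth ambiguity, a single ray-plane intersection replaces the per-point stereo correspondence search, which is where the model gains its speed over plain binocular triangulation.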
References
[1] BARLOW H B. Vision: A computational investigation into the human representation and processing of visual information [J]. Journal of Mathematical Psychology, 1983, 27(1): 107-110.
[2] DESOUZA G N, KAK A C. Vision for mobile robot navigation: A survey [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(2): 237-267.
[3] HOWARD A. Real-time stereo visual odometry for autonomous ground vehicles [C]. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2008: 3946-3952.
[4] WOLF J, BURGARD W, BURKHARDT H. Robust vision-based localization by combining an image-retrieval system with Monte Carlo localization [J]. IEEE Transactions on Robotics, 2005, 21(2): 208-216.
[5] DERETEY E, AHMED M T, MARSHALL J A, et al. Visual indoor positioning with a single camera using PnP [C]. International Conference on Indoor Positioning and Indoor Navigation, 2015: 1-9.
[6] XU C, ZHANG L L, CHENG L, et al. Pose estimation from line correspondences: A complete analysis and a series of solutions [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 39(6): 1209-1222.
[7] SUN F M, WANG W N. Object location based on a single image of a single parallelogram [J]. Acta Automatica Sinica, 2006, 32(5): 746-752.
[8] WANG X J, PAN SH L, QIU L W. An algorithm for pose estimation based on double parallel lines [J]. Chinese Journal of Scientific Instrument, 2008, 29(3): 600-604.
[9] CHEN S Y, LI Y J, CHEN H Y. A monocular vision localization algorithm based on maximum likelihood estimation [C]. Proceedings of the 2017 International Conference on Real-Time Computing and Robotics, 2017: 561-566.
[10] NASEER T, BURGARD W, STACHNISS C. Robust visual localization across seasons [J]. IEEE Transactions on Robotics, 2018, 34(2): 289-302.
[11] GUO A X, XIONG J T, XIAO D Q, et al. Calculation and stereo matching of litchi picking points based on Harris and SIFT algorithms [J]. Transactions of the Chinese Society for Agricultural Machinery, 2015, 46(12): 11-17.
[12] LI C L, GUO B Y, JIA X Y. Automatic detection method of stitch quality based on stereo vision [J]. Journal of Computer-Aided Design & Computer Graphics, 2015, 27(6): 1067-1073.
[13] HE L L, LU E W, LI ZH X, et al. Fast stereo matching algorithm based on the ORB operator [J]. Computer Engineering and Applications, 2017, 53(1): 190-194.
[14] SUN C M. Fast stereo matching using rectangular subregioning and 3D maximum-surface techniques [J]. International Journal of Computer Vision, 2002, 47(1/2/3): 99-117.
[15] DI STEFANO L, MARCHIONNI M, MATTOCCIA S. A fast area-based stereo matching algorithm [J]. Image and Vision Computing, 2004, 22(12): 983-1005.
[16] JI Y H, MA L ZH. Stereo matching disparity optimization algorithm based on a stable tree [J]. Journal of Computer-Aided Design & Computer Graphics, 2016, 28(12): 2159-2167.
[17] XIANG R, YING Y B, JIANG X Y. Location of tomatoes based on binocular stereo vision [J]. Transactions of the Chinese Society of Agricultural Engineering, 2012, 28(5): 161-167.
[18] YAGI Y, YACHIDA M. Real-time generation of environmental map and obstacle avoidance using omnidirectional image sensor with conic mirror [C]. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1991: 160-165.
[19] TANG Y P, WANG Q, CHEN M ZH, et al. Design of a stereo omni-directional vision sensor [J]. Chinese Journal of Scientific Instrument, 2010, 31(7): 1520-1527.
[20] TAN L N. Omnidirectional-vision-based distributed optimal tracking control for mobile multirobot systems with kinematic and dynamic disturbance rejection [J]. IEEE Transactions on Industrial Electronics, 2018, 65(7): 5693-5703.
[21] CHENG S P, DA F P. Research on a hierarchical stereo matching algorithm based on dynamic programming [J]. Chinese Journal of Scientific Instrument, 2016, 37(7): 1665-1672.
