Accurate measurement method of pipeline pose based on binocular vision
  • Authors: WANG Zhi; LIU Jianhua; LIU Shaoli; REN Jiexuan; WU Tianyi
  • Affiliation: Laboratory of Digital Manufacturing, School of Mechanical Engineering, Beijing Institute of Technology
  • Keywords: binocular vision; pipeline; edge pixels; pose measurement; robustness
  • Journal: Computer Integrated Manufacturing Systems (JSJJ)
  • Online publication date: 2019-01-23
  • Year: 2019
  • Volume/Issue: Vol. 25, No. 253 (Issue 05)
  • Pages: 15-22 (8 pages)
  • Article ID: JSJJ201905002
  • CN: 11-5946/TP
  • Funding: National Natural Science Foundation of China (51875044); National Defense Basic Scientific Research Program of China (JCKY2017204B502)
  • Language: Chinese
Abstract

To address the difficulties posed by the texture-less, feature-less surface of pipes during pose measurement, an accurate pipeline pose measurement method based on binocular vision is proposed. First, the initial pose of the pipe is computed from 2D images using the principle of binocular vision. Then, a pose optimization algorithm based on edge pixels establishes the 3D-2D projection mapping, and the projected model edges are fitted to the image edges by minimizing the distances from edge pixels to the projected model edges, thereby refining the initial pose. To improve robustness and computational efficiency, the algorithm selects only pixels with high gray-value gradients within a limited buffer around the projected model edges and discards pixels whose gradients are not perpendicular to those edges. Measurement experiments show that the method achieves high accuracy, is simple to operate, and requires only 2-3 s per measurement, effectively improving the accuracy and efficiency of pipeline pose measurement and meeting the requirements of industrial applications.
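The refinement step described above can be read as a 6-DoF pose optimization driven by image edges. The Python sketch below is a minimal illustration of that idea, not the authors' implementation: it projects sampled 3D model edge points with a candidate pose, collects high-gradient pixels in a small buffer around the projection, and minimizes point-to-edge distances with a robust least-squares solver. All names (project, candidate_edge_pixels, refine_pose, K, model_edge_pts, pose0) are illustrative assumptions, and the gradient-direction (perpendicularity) filter mentioned in the abstract is omitted for brevity.

```python
# Minimal sketch of edge-pixel-based pose refinement (illustrative only).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(points_3d, rvec, tvec, K):
    """Pinhole projection of Nx3 model points with pose (rvec, tvec)."""
    R = Rotation.from_rotvec(rvec).as_matrix()
    cam = points_3d @ R.T + tvec              # camera-frame coordinates
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]             # Nx2 pixel coordinates


def candidate_edge_pixels(gray, proj_pts, buffer_px=5, grad_thresh=30.0):
    """Collect high-gradient pixels inside a buffer around the projected edges."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    h, w = gray.shape
    cands = []
    for u, v in proj_pts:
        u0, u1 = int(u) - buffer_px, int(u) + buffer_px + 1
        v0, v1 = int(v) - buffer_px, int(v) + buffer_px + 1
        if u0 < 0 or v0 < 0 or u1 > w or v1 > h:
            continue                           # skip points near the image border
        ys, xs = np.nonzero(mag[v0:v1, u0:u1] > grad_thresh)
        cands.extend(zip(xs + u0, ys + v0))
    return np.asarray(cands, dtype=float)


def residuals(pose, model_edge_pts, edge_pixels, K):
    """Distance from each detected edge pixel to its nearest projected model point."""
    proj = project(model_edge_pts, pose[:3], pose[3:], K)
    d = np.linalg.norm(edge_pixels[:, None, :] - proj[None, :, :], axis=2)
    return d.min(axis=1)


def refine_pose(pose0, model_edge_pts, gray, K):
    """Refine an initial 6-DoF pose (rvec + tvec as a 6-vector) against one image."""
    proj0 = project(model_edge_pts, pose0[:3], pose0[3:], K)
    pixels = candidate_edge_pixels(gray, proj0)
    result = least_squares(residuals, pose0,
                           args=(model_edge_pts, pixels, K), loss="huber")
    return result.x
```

In a setting like the one described, the initial pose pose0 would come from the binocular triangulation step, and the candidate-pixel search could be repeated after each solver iteration; the sketch keeps a single, fixed set of candidates for simplicity.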
