A Four-Degree-of-Freedom Monocular Vision Localization System for a Polishing Robot Based on Feature-Point Distances and a Planar Constraint
Abstract
In connection with the National Natural Science Foundation of China (NSFC) project "Research on Finishing of Large Curved Surfaces by a Self-Localizing Micro Polishing Robot", this thesis takes the autonomous localization of a micro polishing robot on free-form surfaces as its research goal and proposes a four-degree-of-freedom monocular vision localization method based on the distances between optical feature points and a planar constraint. The method supplies the robot's spatial position and heading direction to the navigation and control system in real time.
     By analyzing the image-point center coordinates of the optical feature points on the imaging plane and using the known positional relationships of the six optical feature points mounted on the robot body, the three-dimensional position and heading direction of the robot moving in the workspace are obtained and transmitted to the navigation and control system in real time. The thesis analyzes the working principle of the localization system, establishes the mathematical model of the vision localization system, and optimizes the algorithm for extracting the imaging-center coordinates of the optical feature points. It also proposes, as an original contribution, using the coplanar layout of the optical feature points as a constraint for correcting the localization results, which effectively improves the depth-direction accuracy and the stability of the monocular vision localization system. On the basis of an analysis of classical camera calibration methods and the practical needs of micro polishing robot localization, Zhang Zhengyou's planar-grid calibration method is adopted, and the intrinsic and extrinsic parameters of the CCD camera are calibrated in the MATLAB environment. The localization software is developed on the Visual C++ platform. Finally, overall tests of the localization system verify that its accuracy meets the requirements for localization, navigation, and control of the micro polishing robot in the workspace.
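     The optimized center-extraction algorithm itself is not given in this abstract. As an illustrative sketch only, a common approach is a gray-value-weighted centroid computed over a thresholded LED spot; the Python/NumPy function below (the name and threshold are assumptions, and the thesis's own implementation was written in Visual C++) shows the idea:

```python
# Illustrative sketch, not the thesis's implementation: gray-value-weighted
# centroid of a single bright LED spot in an 8-bit grayscale patch `roi`.
import numpy as np

def spot_center(roi, threshold=50):
    """Return the sub-pixel (u, v) center of the bright spot in `roi`."""
    w = roi.astype(np.float64)
    w[w < threshold] = 0.0                  # suppress background pixels (assumed threshold)
    total = w.sum()
    if total == 0.0:
        raise ValueError("no spot above threshold")
    vs, us = np.indices(roi.shape)          # row (v) and column (u) index grids
    u = (us * w).sum() / total              # intensity-weighted column mean
    v = (vs * w).sum() / total              # intensity-weighted row mean
    return u, v
```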
     This work was supported by the National High Technology Research and Development Program of China project "Autonomous Polishing Robot Technology for Large Curved Surfaces" (Grant No. 2006AA04Z214) and the NSFC project "Research on Finishing of Large Curved Surfaces by a Self-Localizing Micro Polishing Robot" (Grant No. 50575092).
Automatic polishing of large-scale free-form surfaces by a micro-robot is a new machining approach. For the polishing robot to move along a planned route, it must be able to determine its own position and motion. Among the technologies of autonomous mobile robots, navigation is not only a key technology but also the basis on which the robot accomplishes its mission. The goal of navigation research is to enable a mobile robot to reach a destination, finish an assigned task, and carry out specific operations. Localization is the most basic part of a navigation system and the fundamental problem that must be solved before navigation tasks can be accomplished; real-time, precise localization is the key to improving the robot's performance and the guarantee of autonomous navigation. In this thesis, a four-degree-of-freedom localization system for the polishing robot based on monocular vision is established according to the practical needs of the free-form surface polishing micro-robot and a comparison of the advantages and disadvantages of different positioning methods. The localization system provides the position and heading parameters of the robot to the control system of the polishing micro-robot.
     The localization system of the polishing robot consists of seven parts: a high-resolution CCD camera, a fixed-focal-length lens, an optical filter, an image acquisition card, a computer workstation, optical feature points (infrared light-emitting diodes) arranged on the robot body, and the positioning software package. In this system, six infrared LEDs serve as carriers of the measured three-dimensional information, and their coordinates in the robot coordinate system have been accurately calibrated with a coordinate measuring machine, so the feature points' coordinates in the robot frame are known. The CCD camera captures images of the feature points on the robot, while a 760 nm optical filter blocks the visible light, so only the six infrared LEDs are imaged on the photosensitive area through the fixed-focal-length lens. The light signal is converted into an electrical signal and transmitted to the signal processing circuit, where the signal generated by the photosensitive component is amplified, A/D converted, and output as digital video. The image data are transmitted to the image acquisition card through a Camera Link 1.1 data interface and converted into standard 8-bit grayscale images, which, after image processing, are transferred to the computer through a high-speed PCI-X interface for further processing. In the positioning software package, the centers of the feature points are computed to obtain their coordinates in the pixel coordinate system. From the projected feature points' pixel coordinates and the feature points' coordinates in the camera coordinate system, combined with the positional constraints among the six feature points in the robot coordinate system, the transformation matrix between the robot coordinate system and the camera coordinate system is obtained, and the coordinates of the target point in the camera coordinate system are then calculated. The transformation matrix between the camera coordinate system and the world coordinate system, obtained from camera calibration, is then used to compute the target point's three-dimensional coordinates in the world coordinate system. In this way, the current position parameters, attitude parameters, and coordinate transformation matrices of the polishing robot are obtained and sent to the navigation, control, and motion programs to command the robot's next movement.
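     The thesis obtains the robot-to-camera transformation from the calibrated distances among the six feature points and the planar constraint; that derivation is not reproduced here. As a hedged sketch of the same pipeline only, a standard perspective-n-point solver can stand in for that step; the function `locate_target`, the extrinsics `R_cw`/`t_cw`, and the use of OpenCV's `cv2.solvePnP` are assumptions for illustration, not the thesis's own algorithm:

```python
# Sketch under the assumptions stated above: estimate the robot pose from the six
# LED feature points and map a target point on the robot into the world frame.
import numpy as np
import cv2

def locate_target(pts_robot, pts_pixel, K, dist, R_cw, t_cw, target_robot):
    """pts_robot: (6,3) LED coordinates in the robot frame (CMM-calibrated).
    pts_pixel: (6,2) sub-pixel image centers of the LEDs.
    K, dist:   camera intrinsics and distortion coefficients from calibration.
    R_cw,t_cw: camera-to-world rotation/translation from extrinsic calibration.
    target_robot: (3,) point of interest expressed in the robot frame."""
    ok, rvec, tvec = cv2.solvePnP(np.asarray(pts_robot, np.float64),
                                  np.asarray(pts_pixel, np.float64), K, dist)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R_rc, _ = cv2.Rodrigues(rvec)                              # robot -> camera rotation
    p_cam = R_rc @ np.asarray(target_robot, np.float64) + tvec.ravel()
    p_world = R_cw @ p_cam + t_cw                              # camera -> world frame
    return p_cam, p_world
```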
     According to the above principle, the localization system of the polishing robot is built on the central projection model; the model building process and the algorithms are introduced and explained in detail in the thesis. To address the disadvantage of monocular vision that the positioning accuracy and stability in the depth direction are lower than in the other directions, an error correction technique is proposed that corrects the feature points' coordinates in the camera coordinate system by means of a fitted plane. First, a planar constraint is imposed on the layout of the feature points on the robot so that the six feature points lie in the same plane. Second, a plane is fitted to the feature points' camera-frame coordinates computed by the program; to improve the accuracy of the fitted plane, a weighted fitting method is used in which each feature point's weight is the reciprocal of its distance to the equal-weight fitted plane, so that the weights reflect the accuracy of each point's computed coordinates. Finally, the vector from the camera's optical center through each projected feature point is taken as the direction of a straight line, and each feature point's coordinates are replaced by the coordinates of the intersection of that line with the fitted plane. Experiments show that this correction method improves the accuracy and stability of the feature points' coordinates in the camera coordinate system.
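     A minimal numeric sketch of this correction, following the description above (equal-weight fit, reciprocal-distance weights, ray-plane intersection), is given below in Python/NumPy; the function names and the exact weighting details are assumptions and may differ from the thesis's formulation:

```python
import numpy as np

def fit_plane(P, w=None):
    """Weighted least-squares plane through points P (N,3): returns a unit normal n
    and offset d such that n·x + d = 0; w are optional per-point weights."""
    w = np.ones(len(P)) if w is None else np.asarray(w, np.float64)
    c = (w[:, None] * P).sum(0) / w.sum()            # weighted centroid lies on the plane
    n = np.linalg.svd((P - c) * np.sqrt(w)[:, None])[2][-1]   # smallest singular vector
    return n, -n @ c

def correct_points(P_cam, rays):
    """P_cam: (N,3) feature points in the camera frame; rays: (N,3) directions from the
    optical center (camera origin) through the projected image points."""
    n, d = fit_plane(P_cam)                          # equal-weight fit
    dist = np.abs(P_cam @ n + d)                     # point-to-plane distances
    n, d = fit_plane(P_cam, w=1.0 / np.maximum(dist, 1e-9))   # reciprocal-distance weights
    t = -d / (rays @ n)                              # ray-plane intersection parameters
    return rays * t[:, None]                         # corrected camera-frame coordinates
```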
     Camera calibration is an important step in computer vision and a necessary condition for acquiring accurate information. It is the process of establishing a camera imaging model and calculating its parameters from image information and the relationship between points in three-dimensional space and their projections in the two-dimensional image. In the localization system of the polishing robot, Zhang Zhengyou's planar-template two-step method is used for camera calibration. The technique only requires the camera to observe a planar pattern shown at a few different orientations; either the camera or the planar pattern can be moved freely. Compared with classical techniques that use expensive equipment such as two or three orthogonal planes, Zhang's planar-template method is flexible, easy to use, and low cost.
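     The thesis carries out this calibration in the MATLAB environment; as a comparable sketch only, Zhang's planar-target calibration can also be run through OpenCV as below. The board geometry, square size, and image path are illustrative assumptions:

```python
# Sketch: Zhang-style planar calibration with OpenCV (assumed 9x6 inner-corner board).
import glob
import numpy as np
import cv2

pattern = (9, 6)                       # inner corners per row/column (assumption)
square = 25.0                          # square size in mm (assumption)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.bmp"):  # calibration images (assumed location)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1),
                                   (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix:\n", K)
```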
     Finally, the experimental results are presented. The experiments include correcting the feature points' coordinates with the fitted plane, single-point localization accuracy, single-point repeatability, and robot tracking. The results show that the vision localization system of the polishing robot is able to meet the requirement of real-time localization of the robot.
