Research on Key Vision Measurement Technologies for Spatial Localization of a Micro-Polishing Robot
Abstract
Within the National High-Tech R&D Program (863 Program) project "Micro Robot Technology for Autonomous Polishing of Large Free-Form Surfaces", this paper takes the autonomous localization of a micro robot on large free-form surfaces as its research goal. Computer-vision coordinate measurement technology is applied, in a novel way, to the spatial localization of a mobile robot, and a new monocular-vision spatial localization method for the micro polishing robot is proposed, based on the planar distance constraints of six feature points. Experiments show that the method meets the localization requirements of the micro robot on large free-form surfaces, that the localization algorithm is robust, and that it also offers a new approach to the synchronous localization of multi-robot systems.
     Based on PnP (Perspective-n-Point) theory and the central-projection (pinhole) camera model, the paper establishes a mathematical model of the vision localization system, and proposes a parameter-modeling method based on the distance constraints of six coplanar feature points together with an algorithm for solving the spatial position and orientation of the robot. According to the practical requirements of spatial localization of the micro polishing robot, intrinsic and extrinsic camera parameter models are established, and a new planar-grid camera calibration method based on a nonlinear model is proposed. The accuracy of the calibration method is studied through extensive calibration experiments, and the intrinsic and extrinsic parameters are calibrated over the entire workspace.
     For the gray-level images acquired by the localization system, an adaptive-threshold image segmentation method based on connected-component labeling is proposed; it is strongly resistant to interference and, while segmenting the image, quickly and accurately separates the six feature-point image regions. Sub-pixel localization of circular and elliptical targets is studied: a gray-square-weighted centroid algorithm with bilinear interpolation is used to solve the center coordinates of the image points at the sub-pixel level. Both simulation and experiment show that the method effectively reduces the scatter of the results and achieves 1/10-pixel subdivision of the feature points, satisfying the localization system's requirement for accurate localization of the feature-point image centers. Comprehensive localization experiments were carried out on a large free-form surface, and the sources of localization system error were analyzed systematically. In particular, when the optical axis of an auxiliary localization feature point is inclined with respect to the camera optical axis, the center coordinates computed by the sub-pixel algorithm deviate noticeably from the true imaging center of the light source; an angular-offset error-compensation theory and a predictor-corrector compensation method are therefore proposed, and an error-compensation model is established experimentally. Experiments confirm that the method effectively improves the accuracy and stability of the localization system.
     This work was supported by the National High-Tech R&D Program project "Micro Robot Technology for Autonomous Polishing of Large Free-Form Surfaces" (Grant No. 2006AA04Z214) and the National Natural Science Foundation of China project "Finishing of Large Free-Form Surfaces by a Self-Localizing Micro Polishing Robot" (Grant No. 50575092).
With the development of science and technology, the application fields of robots have expanded continuously. For large free-form surfaces, the polishing process is limited by the working range of conventional processing equipment, so it is important to develop robotic polishing technology that can automate the polishing of large free-form surfaces. To allow a polishing robot to move freely over a large free-form surface and to sense and position itself autonomously, a spatial localization method is a prerequisite for autonomous navigation and automatic control within a given working space; it is therefore of both theoretical and practical significance. Localization is a central topic in robotics and the premise and guarantee of navigation and path planning: real-time, high-precision localization of a mobile robot is what makes it possible for a mobile platform carrying an industrial manipulator to work accurately over a large range, and the combination of mobile and industrial robots has broad prospects in manufacturing, unmanned factories and space operations. Navigation and localization, however, remain difficult problems for both individual mobile robots and multi-robot systems.
     This paper applies computer-vision coordinate measurement technology to the spatial localization of an autonomous mobile robot in order to improve the autonomy of robot localization and the robustness of the localization algorithm, to reduce the effect of non-systematic errors on localization accuracy, and to offer an effective synchronous localization method for multi-robot systems. The spatial localization of the micro-polishing robot, based on feature-point distance constraints, is studied with visual coordinate measurement techniques so as to improve the accuracy and real-time performance of the localization system. The localization problem of the micro-polishing robot is investigated in depth. The work of the paper mainly involves the following aspects:
     Firstly, the present status and development trends of visual localization methods for mobile robots, at home and abroad, are reviewed. Combined with the research subject, a novel localization method for the micro-polishing robot operating within a bounded working space is presented. On the basis of the pinhole camera model, a new mathematical model of vision-based localization of the polishing robot is established; the model rests on the distance constraints among the feature points, and a method for solving it is discussed.
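     As a point of reference for the pinhole model mentioned above, the following sketch (not taken from the thesis) shows how a world point is projected to pixel coordinates through an intrinsic matrix and a rigid-body pose; the numerical intrinsics and pose are illustrative placeholders only.

```python
import numpy as np

def project_point(K, R, t, X_w):
    """Project a 3-D world point X_w to pixel coordinates with the
    pinhole model: x ~ K [R | t] X_w (homogeneous, then normalize)."""
    X_c = R @ X_w + t            # world frame -> camera frame
    x = K @ X_c                  # camera frame -> image plane
    return x[:2] / x[2]          # perspective division -> (u, v)

# Illustrative intrinsics: focal lengths and principal point in pixels.
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 512.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                    # assume camera axes aligned with world axes
t = np.array([0.0, 0.0, 500.0])  # target plane 500 mm in front of the camera

print(project_point(K, R, t, np.array([10.0, -5.0, 0.0])))
```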
     According to the perspective projection principle of the camera, the mathematical model of the localization system is established and solved. The theoretical background of the model is discussed; the Perspective-n-Point problem is analyzed in terms of its statement, its two definitions and its present research status. The arrangement of the six optical feature points on the micro-polishing robot is then discussed, and the feasibility and reliability of the arrangement are demonstrated both in theory and in experiments. The mathematical model of the localization system is established according to the definition of PnP and the corresponding center coordinates of the image points. The optimal solution based on total least squares is derived by singular value decomposition, and the position and orientation of the robot in the working space are finally determined.
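     The SVD-based total-least-squares step can be illustrated with a generic sketch. The thesis derives its own linear system from the distance constraints of the six coplanar points; purely for illustration, the homogeneous system below is the standard DLT formulation for coplanar points, and the solution is the right singular vector associated with the smallest singular value.

```python
import numpy as np

def tls_homogeneous_solution(A):
    """Total-least-squares solution of A p = 0: the right singular vector
    of A associated with the smallest singular value."""
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]

def homography_dlt(world_xy, pixels):
    """Stack the classic DLT equations for coplanar points (Z = 0) and
    recover the 3x3 homography, up to scale, via the SVD/TLS step."""
    rows = []
    for (X, Y), (u, v) in zip(world_xy, pixels):
        rows.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        rows.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    H = tls_homogeneous_solution(np.asarray(rows, dtype=float)).reshape(3, 3)
    return H / H[2, 2]
```

     With six coplanar points this yields twelve equations for the nine homography entries; given the camera intrinsics K, the pose (R, t) can then be decomposed from K⁻¹H up to scale.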
     The calibration accuracy of the camera parameters is the key to high-accuracy optical measurement. On the basis of an analysis of various calibration methods, a planar-grid calibration method for a nonlinear camera model including radial and tangential lens distortion is presented after the mathematical model is established, and the concrete solution procedure is deduced. The subsequent experiments show that the calibration method has high precision and meets the technical requirements of robot localization.
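     A minimal sketch of planar-grid calibration under a nonlinear model with radial and tangential distortion is given below, expressed with OpenCV's calibrateCamera rather than the thesis's own solver; the board geometry (9 × 6 inner corners, 25 mm pitch) and the image folder name are illustrative assumptions.

```python
import glob
import cv2
import numpy as np

# Assumed planar grid: 9x6 inner corners, 25 mm square pitch (illustrative).
pattern, square = (9, 6), 25.0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, image_size = [], [], None
for path in glob.glob("calib/*.png"):          # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (5, 5), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)
        image_size = gray.shape[::-1]

# Nonlinear model: intrinsics K plus radial (k1, k2, k3) and tangential (p1, p2) terms.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, image_size, None, None)
print("reprojection RMS (pixels):", rms)
```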
     The presented localization method is based on distance constraints among the feature points, whose images on the image plane are approximately elliptical. On the basis of existing sub-pixel center-localization methods, an adaptive-threshold image segmentation method and a gray-square-weighted centroid algorithm with bilinear interpolation are presented. According to the given positions of the six optical points, the point-matching algorithm is discussed. A rapid localization method based on the Canny operator and the principle of a sub-pixel edge-detection algorithm based on Zernike orthogonal moments are presented, and a center-localization method for near-circular markers based on Canny-Zernike elliptical curve fitting is given. Finally, the validity and precision of the Canny-Zernike ellipse-fitting algorithm are studied by computer simulation and experiment, and the sub-pixel edge-localization results are compared with those of the gray-weighted algorithm. The results indicate that the center-localization precision of circular markers based on the Canny-Zernike orthogonal-moment edge-detection operator is much better than that of traditional edge-detection operators, with a positioning precision better than 0.05 pixels. The experiments also show that the gray-square-weighted centroid algorithm with bilinear interpolation meets the requirement for accurate localization of the image-point centers.
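     The gray-square-weighted centroid idea can be sketched as follows, assuming an already-segmented spot patch: the patch is bilinearly up-sampled, each pixel is weighted by the square of its gray value (which emphasizes the bright core of the spot), and the weighted centroid is taken; the up-sampling factor is an illustrative assumption, and the small half-pixel offset of the resampling grid is ignored for brevity.

```python
import numpy as np
from scipy.ndimage import zoom

def gray_square_weighted_centroid(patch, upsample=4):
    """Sub-pixel centroid of a bright spot: bilinearly interpolate the
    patch, then weight each pixel by the square of its gray value."""
    fine = zoom(patch.astype(float), upsample, order=1)   # bilinear interpolation
    w = fine ** 2                                          # gray-square weights
    ys, xs = np.indices(fine.shape)
    cy = (w * ys).sum() / w.sum()
    cx = (w * xs).sum() / w.sum()
    return cx / upsample, cy / upsample                   # back to original pixel units
```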
     In order to improve the accuracy of the system, the error factors of the measurement system are analyzed, the systematic error caused by the angular displacement of the feature points is studied, and an error-compensation method is proposed. The inherent defect of a single-camera system, namely the large error along the Z axis, is analyzed both quantitatively and qualitatively, and a hyperplane-fitting method is presented to improve the measuring accuracy in the Z direction. First, based on the LED construction and imaging principle, the refraction and reflection factors are analyzed; the refraction compensation function is then established by theoretical analysis, the reflection compensation function is fitted from experimental tests, and a comprehensive compensation function is built from the two. Finally, on the basis of the predictor-corrector method, the probe center coordinate is obtained by correcting the angular-displacement error of the feature points in real time, and the feasibility of the method is verified by accuracy-measurement experiments. The resolution of the imaging system, the geometric distortion of the projected image, the position deviation of the optical feature points, the center-coordinate error of the feature points and the quantization error are also analyzed systematically.
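     The predictor-corrector compensation loop described above can be sketched as below, assuming a compensation function has already been fitted from calibration data; solve_pose, view_angle_of and comp_poly are hypothetical placeholders for the pose solver, the computation of the angle between the optical axes, and the fitted compensation polynomial.

```python
import numpy as np

def correct_centers(centers, angle_deg, comp_poly):
    """Apply an angle-dependent offset to the measured spot centers.
    comp_poly is a hypothetical polynomial fitted from calibration data."""
    offset = np.polyval(comp_poly, angle_deg)   # offset in pixels
    return centers + offset

def predictor_corrector_pose(centers, solve_pose, view_angle_of, comp_poly, iters=2):
    """Predict the viewing angle from an uncorrected pose, correct the
    measured centers, and re-solve the pose a fixed number of times."""
    pose = solve_pose(centers)                  # predictor: pose from raw centers
    for _ in range(iters):
        angle = view_angle_of(pose)             # angle between camera and LED axes
        pose = solve_pose(correct_centers(centers, angle, comp_poly))  # corrector
    return pose
```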
     Finally, the overall experimental plan and procedure are discussed. The experimental results show that the presented model and solution method are correct and of high precision, and that the method is also suitable for localizing a mobile robot within a workspace. Many error sources exist in the measuring system, so establishing a rational error model and further improving the localization precision of the feature points are targets of future research.
     The main innovations of this paper are as follows:
     1. Computer-vision coordinate measurement technology is applied to the spatial localization of an autonomous mobile robot in order to improve the autonomy of robot localization and the robustness of the localization algorithm, while reducing the effect of non-systematic errors on localization accuracy and offering an effective synchronous localization method for multi-robot systems.
     2. A mathematical model of mobile-robot spatial localization based on six optical feature points is presented; the optimal solution based on total least squares is derived by singular value decomposition, and the position and orientation of the robot in the working space are determined. The overall localization experiments show that the method improves both the autonomy and the real-time performance of robot localization.
     3. A hyperplane-fitting error-compensation method is presented to improve the measuring accuracy in the Z direction. The method improves the stability and accuracy of the center coordinates of the feature image points, so that the measuring accuracy of the localization system can be improved.
     4. When the angle between the camera optical axis and the optical axis of a feature point changes gradually, an error arises between the real and the computed center coordinates of the feature image point because of reflection from the inner wall of the optical feature point. The causes of this error are analyzed, an error-compensation theory is developed, and a mathematical model of the error compensation is established. The experiments show that the method increases the accuracy of the localization system.