Research on Three-Dimensional Weld Seam Information Acquisition for Arc Welding Robots Based on Binocular Vision
Abstract
Existing welding robot systems are essentially first-generation teach-and-playback systems, with some belonging to the second-generation offline-programming type; neither class of robot in industrial service can adapt to changes in the welding environment or working conditions. Enabling a welding robot to adjust and plan autonomously on the basis of sensor information is therefore of great practical significance for realizing autonomous robotic welding. To strengthen the autonomy and adaptability of arc welding robots and raise their level of intelligence, this dissertation imitates a welder's ability to observe and adapt to the environment during welding: two CCD cameras are mounted on the robot end-effector to observe the welding environment over a relatively large field, and the principle of binocular stereo vision is used to recognize and guide the robot to the initial welding position and to acquire the three-dimensional (3D) spatial information of the weld seam, laying a foundation for autonomous robotic welding.
Locating the initial welding position and acquiring 3D seam information by visual sensing is a process of visual 3D reconstruction, and binocular stereo vision is one of its principal methods. In this work the two CCD cameras are mounted on the robot end-effector at a fixed angle to each other, so that the welding environment lies within their common field of view. Images of the macroscopic environment are captured, the initial welding position is found autonomously over a relatively large area, the 3D coordinates of the entire seam are computed in the robot coordinate system, and the robot is driven to the initial welding position and moved along the seam path before welding.
Image recognition is the basis of visual localization. Considering the characteristics of the welding environment, this dissertation proposes a set of algorithms for recognizing the initial welding position and the weld seam in a macroscopic scene. The algorithm first recognizes the seam, then localizes the initial welding position in the recognized image. Seam recognition comprises removal of specular reflection regions, image enhancement, edge detection, and post-processing. The proposed Adaptive Regional Fuzzy Image Enhancement (ARFIE) algorithm partitions the image adaptively according to a normalized relative fuzzy contrast and enhances each region in fuzzy space according to its level, yielding good enhancement for images of widely differing contrast. The overall recognition procedure identifies workpiece images well across different materials and welding environments. For localizing the initial welding position, an algorithm is proposed that takes the intersection of the seam with the workpiece edge as an initial estimate and then extracts a corner within a local window around it, giving accurate localization.
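ARFIE itself is the dissertation's contribution and is not reproduced here. For orientation, the classical Pal-King fuzzy enhancement that regional fuzzy methods build on can be sketched as follows (a minimal global version in NumPy; the fuzzifier parameters fd and fe are conventional names, not values taken from the dissertation):

```python
import numpy as np

def fuzzy_enhance(img, fd=128.0, fe=2.0, iterations=1):
    """Classical Pal-King fuzzy contrast enhancement (global, not regional).

    img: 2-D uint8 grey image; fd, fe: the exponential fuzzifier parameters.
    """
    x = img.astype(np.float64)
    x_max = x.max()
    # Fuzzification: map grey levels to membership values in (0, 1].
    mu = (1.0 + (x_max - x) / fd) ** (-fe)
    # Contrast intensification (the INT operator), applied iteratively:
    # memberships above 0.5 are pushed up, those below are pushed down.
    for _ in range(iterations):
        mu = np.where(mu <= 0.5, 2.0 * mu ** 2, 1.0 - 2.0 * (1.0 - mu) ** 2)
    # Defuzzification: invert the membership function back to grey levels.
    mu = np.clip(mu, 1e-12, 1.0)
    x_new = x_max - fd * (mu ** (-1.0 / fe) - 1.0)
    return np.clip(x_new, 0, 255).astype(np.uint8)
```

ARFIE differs from this sketch in that the image is first partitioned into regions by the normalized relative fuzzy contrast, and the enhancement is then driven by each region's level rather than by a single global setting.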
Calibration is the bridge from 2D image information to 3D spatial information. The binocular vision system is calibrated, including the intrinsic and extrinsic parameters of each camera, the relative pose between the two cameras, and the hand-eye relationship. Once the system configuration is fixed, the camera intrinsics no longer change, but the hand-eye relationship may change during operation; this dissertation therefore calibrates the hand-eye relationship directly from workpiece image information.
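Once the hand-eye transform is calibrated, using it amounts to chaining homogeneous transforms: a point measured in the camera frame is carried through the hand-eye transform to the end-effector flange and then, via the pose reported by the robot controller, to the robot base. A minimal sketch with hypothetical frame names (not the dissertation's code):

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def cam_point_to_base(p_cam, T_flange_base, T_cam_flange):
    """Map a 3-D point from the camera frame into the robot base frame.

    T_cam_flange is the calibrated hand-eye transform (camera -> flange);
    T_flange_base is the current robot pose (flange -> base).
    """
    p = np.append(p_cam, 1.0)                      # homogeneous coordinates
    return (T_flange_base @ T_cam_flange @ p)[:3]  # drop the scale component
```

Because the cameras ride on the end-effector, T_flange_base changes with every robot motion, while T_cam_flange stays fixed unless a collision disturbs the mounting, which is why the dissertation recalibrates the hand-eye relationship from workpiece images when needed.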
Stereo matching is one of the key difficulties in stereo vision. An invariant-optimized rectification algorithm for cameras in general configuration is proposed, which rectifies an arbitrarily configured binocular system to the ideal parallel configuration. The algorithm estimates the theoretical projection region of the rectified images and designs an optimized rectification mapping from the information in that region. Experiments show that it reduces distortion and information loss in the rectified images, improves image resolution, and greatly raises the quality of the rectified pair. For the rectified stereo pair, a Coarse-To-Fine Multi-Information Matching Method (CTFMIMM) is proposed: special points in the welding scene, such as the initial welding position, define the matching search range and candidate set; the strength and direction of edge features serve as constraints; and the grey-level information of the original images is exploited through a proposed grey-level similarity parameter, DOGs, to select the single correct match. The algorithm is a cooperative matching method combining structured edge features, interest points, and grey-level correlation.
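The grey-level correlation step in such a matcher can be illustrated with a plain normalized cross-correlation (NCC) search along the epipolar line of a rectified pair, where corresponding points lie on the same image row. This is a generic sketch of that one ingredient, not the DOGs similarity parameter or the CTFMIMM constraint set:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / d if d > 0 else 0.0

def match_along_row(left, right, x, y, half=3, max_disp=30):
    """For a feature at (x, y) in the rectified left image, find the best
    match on the same row of the right image (the epipolar constraint)."""
    tpl = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
    best_x, best_score = -1, -1.0
    for d in range(0, max_disp + 1):
        xr = x - d                      # positive disparity shifts the point left
        if xr - half < 0:
            break
        win = right[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float64)
        s = ncc(tpl, win)
        if s > best_score:
            best_score, best_x = s, xr
    return best_x, best_score
```

CTFMIMM narrows `max_disp` drastically by anchoring the search range at known scene points such as the initial welding position, and prunes the candidate set with edge strength and direction before the grey-level comparison, which is what makes the matching both fast and unambiguous.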
The seam information is then reconstructed in 3D and transformed into the robot coordinate system so that the robot motion can be controlled. The influence of robot errors on the 3D visual computation is analysed, including the robot's repeat positioning accuracy and the TCP control point. Experiments show that the standard error contributed by repeat positioning accuracy is no more than 0.3 mm, and that the TCP must be recalibrated once its maximum calibration error exceeds 1 mm. The influence of the vision-system configuration on the visual computation is analysed in detail, and experimental tests confirm the derived results.
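Given a matched pixel pair and the calibrated projection matrices of the two cameras, recovering the 3D point is standard linear (DLT) triangulation: each image observation contributes two linear equations, and the point is the null vector of the stacked system. A self-contained sketch, assuming generic 3x4 projection matrices rather than the dissertation's particular calibration:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point from a calibrated stereo pair.

    P1, P2: 3x4 projection matrices; uv1, uv2: matched pixel coordinates.
    Returns the 3-D point in the reference (first) camera frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view gives two rows of the homogeneous system A X = 0.
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector = smallest singular vector
    return X[:3] / X[3]        # dehomogenize
```

The triangulated point is expressed in the camera frame; in the dissertation's pipeline it must still be carried through the hand-eye transform and the current robot pose into the robot base frame before it can drive the motion controller.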
After introducing the experimental system, experiments on initial-position guidance and 3D seam information acquisition for several typical seam types, conducted with the methods of this dissertation, are presented. The pure visual computation error, excluding robot motion error, is below 0.56 mm. For guidance to the initial welding position of a spatial seam, the maximum error in the x, y and z directions is below 1.1 mm. For the 3D information of a whole seam, the maximum in-plane distance error and height error are 1.2 mm and 1.3 mm respectively for a planar seam, and 1.2 mm and 1.6 mm respectively for a spatial seam. A modular programming scheme separates the hardware-dependent code from the 3D visual computation, so the algorithms port easily between different robots and after the vision equipment is replaced.
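The modular separation described above is, in essence, programming the vision computation against a narrow hardware interface. A minimal sketch of that design, with hypothetical class and method names (the dissertation does not specify its module layout):

```python
from abc import ABC, abstractmethod

class RobotInterface(ABC):
    """Hardware-dependent layer: each robot model implements this interface,
    while the 3-D vision computation only ever calls these methods."""

    @abstractmethod
    def get_pose(self):
        """Return the current end-effector pose [x, y, z, rx, ry, rz]."""

    @abstractmethod
    def move_to(self, pose):
        """Command a move to the given end-effector pose."""

class SimulatedRobot(RobotInterface):
    """Stand-in implementation; a real port would wrap a vendor controller API."""

    def __init__(self):
        self.pose = [0.0] * 6

    def get_pose(self):
        return list(self.pose)

    def move_to(self, pose):
        self.pose = list(pose)
```

Swapping robots or cameras then only requires a new implementation of the interface; the seam recognition, matching, and triangulation modules remain untouched.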
Autonomous guidance to the initial welding position and acquisition of 3D seam information are the technical foundation of intelligent welding. The approach adapts well to changes in the environment and can replace current online teach-and-playback and CAD-based offline programming, which is especially significant for the welding of critical workpieces and for welding in hazardous environments.
