A Binocular Active Vision Monitoring Platform for Parallel Robots and Its Theoretical Study
Abstract
Machine vision integrates visual sensing, image understanding, pattern recognition, and computer control; by using machines in place of human eyes for measurement and judgment, it is a key technology for making systems intelligent, automated, and information-driven. Applying machine vision can greatly improve the operating efficiency, accuracy, and reliability of equipment. At present, parallel robots cannot perceive changes in their working state or environment and in effect operate "blind". Starting from the structural characteristics of parallel mechanisms and the need for visual monitoring of the robot workspace, this dissertation proposes and builds a binocular active vision monitoring platform based on a circular guide rail. With the support of the National High Technology Research and Development Program of China (863 Program), an experimental prototype of the platform was developed, and the platform's mechanism kinematics, vision measurement theory, and key application technologies were studied in depth. The main research contents are as follows:
(1) A new binocular active vision monitoring platform for parallel robots is designed and a kinematic model of its mechanism is established. Departing from traditional monitoring platforms, in which the baseline between the two cameras is fixed, a vision mechanism in which both cameras move freely along a circular rail is proposed for the first time; it provides an adjustable baseline, adjustable optical axes, wide-range multi-angle observation, and flexible obstacle avoidance. For the experimental prototype, a mechanism model of the vision system is established, and forward and inverse solutions for position, velocity, and acceleration are derived; experimental tests verify the correctness of the model.
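To make the idea of an inverse solution concrete, the sketch below steers two cameras on a circular rail so that both optical axes pass through a given target point. It is a minimal planar simplification written for illustration only; the rail radius, the single pan joint per carriage, and all numerical values are assumptions rather than the thesis model.

    import numpy as np

    # Planar simplification (an assumption, not the thesis model): each camera
    # carriage sits on a circular rail of radius R at angle theta and carries
    # one pan joint. The inverse solution points both optical axes at a target.
    R = 0.5  # rail radius in metres (illustrative)

    def camera_position(theta):
        """Carriage position on the rail for rail angle theta (rad)."""
        return np.array([R * np.cos(theta), R * np.sin(theta)])

    def inverse_gaze(theta, target):
        """Pan angle, measured from the inward rail normal, that aims at target."""
        p = camera_position(theta)
        to_target = target - p
        inward = -p / np.linalg.norm(p)          # unit vector toward rail centre
        a = np.arctan2(to_target[1], to_target[0]) - np.arctan2(inward[1], inward[0])
        return np.arctan2(np.sin(a), np.cos(a))  # wrap to (-pi, pi]

    def baseline(theta_left, theta_right):
        """Camera baseline implied by the two rail angles."""
        return np.linalg.norm(camera_position(theta_left) - camera_position(theta_right))

    target = np.array([0.10, 0.05])
    print(inverse_gaze(np.deg2rad(150), target), inverse_gaze(np.deg2rad(30), target))
    print("baseline:", baseline(np.deg2rad(150), np.deg2rad(30)))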
(2) A two-level calibration scheme for the binocular active vision system of the parallel robot is established. A planar chessboard serves as the first-level target: a nonlinear camera calibration algorithm estimates the intrinsic and extrinsic parameters of the cameras and initializes the system. Because the robot's end effector and the workpiece can occlude the chessboard during operation, a second-level target, consisting of six groups of three-dimensional target blocks distributed uniformly along the inner side of the circular rail, is designed to meet the needs of dynamic calibration. Using this calibration scheme, the structural parameters, control equivalents, and accuracy of the binocular active vision mechanism are measured.
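A rough illustration of the first-level chessboard calibration is sketched below using OpenCV's standard nonlinear calibration routine; the board geometry and image paths are assumptions made for the example, and the second-level 3D-block calibration is not shown.

    import glob
    import cv2
    import numpy as np

    # First-level calibration sketch: detect chessboard corners in a set of
    # images and run OpenCV's nonlinear calibration to recover intrinsics,
    # distortion and per-view extrinsics. Board size and paths are assumed.
    BOARD = (9, 6)       # inner corners per row and column
    SQUARE = 0.025       # square size in metres

    objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

    obj_pts, img_pts, size = [], [], None
    for path in glob.glob("calib/left_*.png"):       # hypothetical image set
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, BOARD)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_pts.append(objp)
            img_pts.append(corners)

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    print("reprojection RMS:", rms, "\nintrinsics:\n", K)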
(3) Through coordinated control of its two chains, the binocular active vision monitoring platform achieves optimal binocular observation of the environment and the target. To this end, a front-view observation mode is defined, a chain control strategy is proposed for shifting each camera's point of attention, and a binocular coordinated motion control model is established for visual tracking in the front-view mode. Experimental results verify the effectiveness and feasibility of the observation mode and control strategy.
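The sketch below shows one way such an attention shift could be closed around image measurements: a proportional law turns the horizontal pixel error of the tracked target into pan corrections for both chains. The gain and intrinsics are illustrative assumptions, not the thesis's coordinated control model.

    import numpy as np

    # Image-driven gaze-shift sketch: drive each camera's pan joint until the
    # tracked feature lies on the principal point (the front-view condition).
    def pan_increment(u, cx, fx, gain=0.5):
        """Pan correction (rad) from the target's horizontal pixel error."""
        return gain * np.arctan2(u - cx, fx)

    def track_step(pan_left, pan_right, u_left, u_right, cam):
        """One coordinated update of the two chains from the two images."""
        return (pan_left + pan_increment(u_left, cam["cx"], cam["fx"]),
                pan_right + pan_increment(u_right, cam["cx"], cam["fx"]))

    cam = {"cx": 320.0, "fx": 800.0}     # illustrative intrinsics
    print(track_step(0.0, 0.0, 400.0, 250.0, cam))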
(4) For the application of the visual monitoring system to parallel robots, visual navigation and motion monitoring are studied. First, with the guidance of a parallel engraving robot's cutting tool toward the desired machining point as the background, a simplified method is proposed that computes the tool trajectory directly from the tool images captured by the two cameras. Second, to monitor the robot's motion, a pose detection method for the cutting tool based on a truncated-cone target and a motion parameter analysis method based on the extended Kalman filter are proposed; experiments verify that both methods are effective and feasible.
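As a minimal stand-in for the motion parameter analysis, the filter below tracks the tool tip's position and velocity from noisy triangulated measurements under a constant-velocity model. Because that model is linear, it is an ordinary Kalman filter rather than the thesis's EKF over pose, and the noise levels and frame rate are assumptions.

    import numpy as np

    # Constant-velocity filter over triangulated tool-tip positions.
    dt = 1.0 / 30.0                                  # assumed frame period
    F = np.eye(6); F[:3, 3:] = dt * np.eye(3)        # position integrates velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])     # only position is measured
    Q = 1e-4 * np.eye(6)                             # process noise (assumed)
    R = 1e-3 * np.eye(3)                             # measurement noise (assumed)

    def filter_step(x, P, z):
        """One predict/update cycle; x = [position, velocity], z = measured position."""
        x, P = F @ x, F @ P @ F.T + Q                        # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
        x = x + K @ (z - H @ x)                              # update
        P = (np.eye(6) - K @ H) @ P
        return x, P

    x, P = np.zeros(6), np.eye(6)
    for k in range(5):                                       # synthetic measurements
        x, P = filter_step(x, P, np.array([0.01 * k, 0.0, 0.20]))
    print("estimated velocity:", x[3:])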
(5) An overall software architecture for the binocular active vision system of the parallel robot is proposed. The software consists of a foreground electrical control system and a background vision service system: the foreground system is responsible for controlling the platform's electrical devices and image acquisition, and the background system is responsible for image analysis and control strategy generation. The foreground and background computers exchange information through shared formatted text files.
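The sketch below illustrates what such a shared-text-file exchange could look like; the record layout (a timestamp plus rail and pan angles) and the file path are assumptions made for the example, not the thesis's actual format.

    import time
    from pathlib import Path

    SHARED = Path("exchange/command.txt")            # hypothetical shared file

    def write_command(theta_l, theta_r, pan_l, pan_r):
        """Background vision service: publish the next motion command."""
        SHARED.parent.mkdir(exist_ok=True)
        record = f"{time.time():.3f} {theta_l:.4f} {theta_r:.4f} {pan_l:.4f} {pan_r:.4f}\n"
        tmp = SHARED.with_suffix(".tmp")
        tmp.write_text(record)
        tmp.replace(SHARED)          # atomic swap, so the reader never sees a partial record

    def read_command():
        """Foreground electrical control: poll the latest command, if any."""
        if not SHARED.exists():
            return None
        ts, *vals = SHARED.read_text().split()
        return float(ts), [float(v) for v in vals]

    write_command(2.62, 0.52, 0.05, -0.03)
    print(read_command())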
