Research on Key Techniques of Color Omnidirectional Vision for Soccer Robots
Abstract
Robot soccer is both a research focus and a difficult problem in robot control, machine vision, artificial intelligence, and multi-robot coordination. The Middle-Size League, a major event in the RoboCup robot soccer world cup, provides a standard test environment for research on color omnidirectional vision. This dissertation investigates key techniques for applying color omnidirectional vision systems to Middle-Size League robots, namely color classification, feature extraction, and sensor modeling, and validates the results experimentally.
     Firstly, a color catadioptric omnidirectional vision system was designed and implemented to meet the needs of the RoboCup Middle-Size League. The system combines a constant-scale ("distortionless") composite mirror with a digital color camera to capture color digital panoramic images of the competition environment effectively. The dissertation presents the design principles, the characteristics of the main components, the measurement of the main parameters, and the effective range of the system's distance measurement.
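The defining property of a constant-scale mirror is that, inside the effective range, ground distance grows linearly with image radius. As an illustration only (the dissertation's actual calibration procedure is not reproduced here, and the calibration points are invented), the scale factor can be fitted by least squares from a few measured pairs:

```python
import numpy as np

def fit_scale(pixel_radii, ground_distances):
    # Least-squares fit of d = k * r with no intercept: k = (r . d) / (r . r)
    r = np.asarray(pixel_radii, dtype=float)
    d = np.asarray(ground_distances, dtype=float)
    return float(r @ d) / float(r @ r)

def pixel_to_distance(r, k, r_min, r_max):
    # The linear model only holds inside the measured effective range
    if not (r_min <= r <= r_max):
        return None                 # outside the calibrated range
    return k * r
```

A point outside the fitted radius interval is rejected rather than extrapolated, matching the idea of an explicitly measured effective range.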
     Secondly, to improve color classification of panoramic images, the dissertation proposes a mixed color space CLUT (color look-up table) classification method based on linear classifiers. Existing CLUT methods struggle to separate similar colors because their discrimination depends on the choice of color space and thresholds. The proposed method applies linear classifiers from pattern recognition to construct the CLUT mapping, and uses the HSI and YUV color spaces jointly to sharpen the separation of similar colors. Experiments show that the method is simple to set up and its effect is easy to inspect, that it separates the similar colors found in the competition environment reliably, and that it meets the real-time requirements of panoramic image classification in RoboCup.
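The scheme can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: it uses a nearest-class-mean rule (whose decision boundaries are linear) over concatenated HSI and YUV features, bakes the decisions into a look-up table indexed by quantized RGB, and the color prototypes and noise level are invented for the example.

```python
import numpy as np

def rgb_to_yuv(rgb):
    # BT.601 RGB -> YUV (components assumed in [0, 1])
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return np.stack([y, 0.492 * (b - y), 0.877 * (r - y)], axis=-1)

def rgb_to_hsi(rgb):
    # HSI features; hue is encoded as (cos h, sin h) to avoid the 0/2*pi wrap
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0
    s = 1.0 - np.minimum(np.minimum(r, g), b) / np.maximum(i, 1e-9)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-9
    h = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b > g, 2.0 * np.pi - h, h)
    return np.stack([np.cos(h), np.sin(h), s, i], axis=-1)

def mixed_features(rgb):
    # Joint HSI + YUV feature vector, so one linear rule can use both spaces
    return np.concatenate([rgb_to_hsi(rgb), rgb_to_yuv(rgb)], axis=-1)

class NearestMeanClassifier:
    # Minimum distance to the class mean is a linear classifier, since
    # argmin ||x - m||^2  ==  argmax (x . m - 0.5 * ||m||^2)
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        scores = X @ self.means_.T - 0.5 * (self.means_ ** 2).sum(axis=1)
        return self.classes_[np.argmax(scores, axis=-1)]

def build_clut(clf, bins=32):
    # Classify every quantized RGB bin center once; runtime lookup is O(1)
    centers = (np.arange(bins) + 0.5) / bins
    r, g, b = np.meshgrid(centers, centers, centers, indexing="ij")
    grid = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    return clf.predict(mixed_features(grid)).reshape(bins, bins, bins)

def classify(lut, rgb, bins=32):
    idx = np.minimum((np.asarray(rgb) * bins).astype(int), bins - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
```

In practice the training samples would come from labeled image regions, and classifying a panoramic pixel then costs a single table lookup, which is what makes the approach real-time.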
     Thirdly, the dissertation studies feature detection in color panoramic images of the RoboCup environment. It analyzes how illumination that is non-uniform in both space and time affects feature detection on the competition field. By detecting color transitions, the white field lines are detected reliably independently of the color classification result, and the contact points between the blue or yellow goals and the field are detected reliably even when color classification is poor. The key idea is to exploit the relative positions of the different colors in color space: a scan-line method detects changes in luminance (for the white lines) or hue (for the blue/yellow goals), and structural knowledge of the field colors determines the nature of each change. Decision criteria and algorithms are given for both field features, and experiments confirm their effectiveness and robustness.
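A toy version of the scan-line idea for white lines, assuming only that the field color is markedly darker than the line: walk along a scan line and report a line point wherever luminance rises sharply and falls back within a few samples. The thresholds and the synthetic scan are invented for this sketch; the dissertation's actual criteria also use the structural color information of the field.

```python
def luminance(rgb):
    # BT.601 luma from an (r, g, b) tuple in [0, 1]
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def detect_white_line_points(scan, jump=0.25, max_width=6):
    """Return indices of candidate white-line centers along a scan line:
    places where luminance jumps up and drops back within max_width samples,
    i.e. a narrow bright band on a darker (field-colored) background."""
    ys = [luminance(p) for p in scan]
    hits = []
    i = 0
    while i < len(ys) - 1:
        if ys[i + 1] - ys[i] > jump:                  # rising edge: field -> line
            for j in range(i + 1, min(i + 1 + max_width, len(ys) - 1)):
                if ys[j] - ys[j + 1] > jump:          # falling edge: line -> field
                    hits.append((i + 1 + j) // 2)     # center of the bright band
                    i = j
                    break
        i += 1
    return hits
```

Because only luminance differences along the line are tested, the detector does not depend on a prior color classification of the pixels, which is the point of the transition-based approach.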
     Finally, building on the color classification and feature detection results, the dissertation develops a model of the color omnidirectional vision sensor for use with particle filter localization in the RoboCup environment. The vision system serves as the environment observer of the particle filter, with the distances from the robot to specific color transition points as the observations. The form of the model is analyzed, and its mathematical description and the computation of its main parameters are given. By comparing the position and orientation obtained with the sensor model against the robot's true pose, the main factors affecting the model's accuracy and practical performance are identified. Experiments show that the model tolerates partial failure of the observations and is therefore robust; a double look-up table scheme further speeds up the computation of particle weights, improving the model's practicality.
