Optical-Flow-Based Motion Estimation for the Autonomous Landing of an Unmanned Helicopter
Abstract
The autonomous landing of an unmanned helicopter is an important stage of its flight and places high demands on both navigation and control. The commonly used INS/GPS navigation can no longer satisfy the accuracy required for this task, so applying computer vision has become the current development trend. This paper surveys the computer vision techniques that have been studied and applied to the autonomous landing of unmanned helicopters at home and abroad, and analyzes their respective advantages and disadvantages.
     On the hypothesis that the landing scene is a plane, a mathematical model of the relative motion between the helicopter and the planar scene is built, together with the perspective imaging model of the camera. Based on these models, a motion estimation method for the unmanned helicopter is proposed: an image sequence of the landing scene is captured with a monocular onboard camera, and the 3D motion of the helicopter and camera is recovered from the 2D motion in the image plane by using the relative motion model and the projection relation between the camera and the scene.
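The recovery of 3D motion from 2D image motion can be made concrete with the classical instantaneous motion field equations; the following is a sketch under the planar-scene assumption, with symbols chosen here for illustration rather than taken from the thesis. For a calibrated camera with translational velocity $\mathbf{t}=(t_x,t_y,t_z)^\top$ and angular velocity $\boldsymbol{\omega}=(\omega_x,\omega_y,\omega_z)^\top$, a scene point at depth $Z$ projecting to normalized image coordinates $(x,y)$ has image velocity $(u,v)$:

```latex
u = \frac{-t_x + x\,t_z}{Z} + x y\,\omega_x - (1 + x^2)\,\omega_y + y\,\omega_z ,\qquad
v = \frac{-t_y + y\,t_z}{Z} + (1 + y^2)\,\omega_x - x y\,\omega_y - x\,\omega_z .
```

For a planar scene $\mathbf{n}^\top\mathbf{P}=d$, the inverse depth is linear in the image coordinates, $1/Z=(n_x x+n_y y+n_z)/d$, so the flow field reduces to a quadratic polynomial in $(x,y)$ with eight independent parameters, from which $(\mathbf{t}/d,\boldsymbol{\omega})$ can be estimated from sampled flow vectors by least squares.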
     The 2D motion in the image plane can be obtained by computing the optical flow between successive image frames. Because the Horn-Schunck and Lucas-Kanade methods commonly used for optical flow computation perform poorly when the pixel motion is large, this thesis introduces a multi-resolution, hierarchical iterative Lucas-Kanade method based on an image pyramid, which achieves good results when estimating fast motion.
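The coarse-to-fine iteration can be sketched for the simplest case of a single global translation between two frames (the windowed, per-feature variant used for feature tracking follows the same recursion). This is an illustrative NumPy sketch, not the thesis's implementation; the synthetic periodic test pattern and all names are invented for the example:

```python
import numpy as np

def shift(img, dx, dy):
    """Bilinearly shift a periodic image by (+dx, +dy) with wrap-around."""
    x0, y0 = int(np.floor(dx)), int(np.floor(dy))
    fx, fy = dx - x0, dy - y0
    r = lambda ix, iy: np.roll(np.roll(img, iy, axis=0), ix, axis=1)
    return ((1 - fx) * (1 - fy) * r(x0, y0) + fx * (1 - fy) * r(x0 + 1, y0)
            + (1 - fx) * fy * r(x0, y0 + 1) + fx * fy * r(x0 + 1, y0 + 1))

def downsample(img):
    """Average 2x2 blocks: one pyramid level."""
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def lk_step(I, J, d0, iters=25):
    """Iterative Lucas-Kanade refinement of a global translation d
    such that J ~= shift(I, d), starting from the initial guess d0."""
    Iy, Ix = np.gradient(I)
    G = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    d = np.asarray(d0, dtype=float).copy()
    for _ in range(iters):
        It = shift(I, d[0], d[1]) - J                    # current residual
        b = np.array([np.sum(Ix * It), np.sum(Iy * It)])
        d += np.linalg.solve(G, b)                       # Gauss-Newton update
    return d

def pyramidal_lk(I, J, levels=4):
    """Coarse-to-fine (pyramidal) translation estimate."""
    pI, pJ = [I], [J]
    for _ in range(levels - 1):
        pI.append(downsample(pI[-1]))
        pJ.append(downsample(pJ[-1]))
    d = np.zeros(2)
    for Ii, Ji in zip(reversed(pI), reversed(pJ)):
        d = lk_step(Ii, Ji, 2.0 * d)   # propagate estimate to finer level
    return d

# Synthetic demo: a smooth periodic pattern shifted by a known large motion.
n = 128
t = np.arange(n) * (2 * np.pi / n)
ys, xs = np.meshgrid(t, t, indexing="ij")
I = np.sin(2 * xs) + np.cos(3 * ys) + 0.5 * np.sin(xs + 2 * ys)
J = shift(I, 6.3, -4.1)          # motion far beyond single-level L-K range
d = pyramidal_lk(I, J)
print(d)                         # close to [6.3, -4.1]
```

The point of the pyramid is visible in the demo: a 6-pixel motion is far outside the linearization range of the gradient equations at full resolution, but at the coarsest level it shrinks below one pixel, so each level only has to refine an already-good initial guess.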
     The Vega Prime visual simulation environment is adopted as the experiment platform to simulate the helicopter, the landing scene, and the relative motion between them. The motion parameters are estimated with the optical-flow algorithm, and the influence of factors such as the image spectrum, the helicopter's altitude, and the camera's field of view on the algorithm is analyzed. On this basis, the real-time performance, accuracy, and reliability of the algorithm are verified, and the shortcomings of the algorithm and the experiment design are pointed out. Finally, some improvements are suggested and future work is discussed.
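As a concrete illustration of why the helicopter's altitude enters the analysis: over a flat scene viewed by a downward-looking camera, the metric velocity implied by a given pixel flow scales linearly with height above the ground, so the same flow measurement error translates into a larger velocity error at higher altitude. A toy calculation with hypothetical numbers (all values invented for illustration):

```python
# Hypothetical parameters, not taken from the thesis.
f_px = 800.0               # focal length in pixels (assumed)
h = 20.0                   # altitude above the planar scene, metres
dt = 1.0 / 25.0            # frame interval at 25 fps
flow_u, flow_v = 4.0, -2.5 # mean optical flow, pixels per frame

# For a downward-looking camera over a flat scene, a pixel displacement
# maps to a ground displacement of (pixels * h / f_px); dividing by the
# frame interval gives the horizontal velocity estimate.
vx = flow_u * h / f_px / dt
vy = flow_v * h / f_px / dt
print(vx, vy)  # 2.5 m/s, -1.5625 m/s
```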