Research on UAV Formation Flight Technology Based on Bionic Vision
Abstract
Modern Unmanned Aerial Vehicle (UAV) technology has matured considerably over decades of development, plays an increasingly important role in military and civil applications, and is drawing broad attention from many countries. Basic, interdisciplinary research on UAVs, aimed at catching up with the advanced level abroad, therefore has great practical significance.
     Coordinated Formation Flight (CFF) of multiple UAVs is a new idea proposed abroad in recent years; its goals are to raise the efficiency with which UAVs complete missions and to widen their range of application. Vision-based UAV formation flight draws on the biomimetic mechanisms of bird flocks, and advances in computer-vision methods and in image-processing hardware and software have made a vision-based implementation of UAV formation flight feasible. The main work of this thesis is as follows:
     1) For extracting feature points from UAV formation video sequences, image-preprocessing methods are studied, including image conversion, noise filtering, morphological erosion and dilation, and threshold segmentation.
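The preprocessing chain in 1) can be sketched in a few lines of numpy; this is a minimal illustration, not the thesis implementation, assuming an 8-bit grayscale frame and a 3x3 structuring element:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold of an 8-bit grayscale image: the gray
    level that maximizes the between-class variance of the histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def erode(binary, k=3):
    """k x k binary erosion: a pixel survives only if its whole
    neighbourhood is 1, which removes specks before blob extraction."""
    pad = k // 2
    padded = np.pad(binary, pad, mode="constant")
    out = np.ones_like(binary)
    for dy in range(k):
        for dx in range(k):
            out &= padded[dy:dy + binary.shape[0], dx:dx + binary.shape[1]]
    return out
```

Thresholding the frame with the Otsu level and then eroding (and, symmetrically, dilating) the binary mask isolates bright LED blobs whose centroids serve as feature points.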
     2) The LHM algorithm for obtaining the leader's position and attitude from LED marker lights is discussed in depth. The method marks the leader with LED lights; by comparing the color, number, and placement of the LEDs, their influence on UAV formation flight is analyzed. On this basis, an improved LHM algorithm is proposed to remedy the poor real-time performance of LHM, and, by incorporating the idea of least-squares image matching, an LHM-LSIM algorithm is proposed.
     3) Building on the LED-marker method, a monocular-vision approach is proposed for measuring the relative motion and the distance between two aircraft. The principle of visual measurement and the influence of camera calibration on the measurement are analyzed; a dynamic model of the two-aircraft formation problem is established, together with the visual state and measurement equations. Distance measurement is verified by simulation on a hardware platform.
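The core of the monocular range measurement in 3) is the pinhole similar-triangles relation: two LEDs a known distance apart on the leader subtend an image length inversely proportional to range. A sketch with hypothetical camera parameters, assuming the LED baseline is roughly perpendicular to the optical axis:

```python
def monocular_range(focal_px, marker_sep_m, pixel_sep):
    """Pinhole range estimate. focal_px: focal length in pixels (from
    calibration); marker_sep_m: true LED separation in meters;
    pixel_sep: measured separation of the LED centroids in pixels."""
    return focal_px * marker_sep_m / pixel_sep
```

For example, with an 800 px focal length, LEDs 2 m apart, and a 40 px image separation, the leader is estimated to be 40 m away; calibration error in `focal_px` enters the range estimate linearly, which is why the thesis analyzes the influence of camera calibration.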
     4) In multi-UAV formation flight, an aircraft must determine its position not only relative to the leader but also relative to the other aircraft; a conventional camera, however, has a limited field of view, so blind spots and loss of information can occur. Guided by ideas from bionics, a hyperbolic panoramic bionic-eye vision system is therefore designed; its mechanism is studied and verified by simulation.
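The hyperbolic panoramic eye in 4) can be illustrated with the standard unified central catadioptric model: a scene point is mapped onto the unit viewing sphere and then perspectively projected from a point offset by a mirror parameter xi (0 < xi < 1 for hyperbolic mirrors). All parameter values below are hypothetical:

```python
import numpy as np

def catadioptric_project(P, xi=0.8, f=300.0, cx=320.0, cy=240.0):
    """Unified central catadioptric projection (sketch). P = (X, Y, Z)
    in the mirror/camera frame; xi is the mirror parameter, f the focal
    length in pixels, (cx, cy) the principal point."""
    X, Y, Z = P
    rho = np.sqrt(X * X + Y * Y + Z * Z)   # distance to the viewpoint
    denom = Z + xi * rho                   # projection from offset xi
    return f * X / denom + cx, f * Y / denom + cy
```

Because the projection stays finite for all points with Z > -xi * rho, directions well beyond 90 degrees off the optical axis are still imaged, which is precisely how the panoramic mirror removes the blind spots of a conventional camera.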
