Research on Vision-Based Pose Estimation and Navigation for a Micro Quadrotor Flying Robot
Abstract
The micro quadrotor is currently a focal research topic in the unmanned aerial vehicle (UAV) field, with important theoretical significance and broad application prospects. Motion and pose estimation, localization, and navigation for micro quadrotors are among its most challenging fundamental and key research problems. To date, no vision-based localization and navigation system for micro quadrotors satisfies practical application requirements. This dissertation works with a micro quadrotor platform of limited payload, power, and computational resources, and studies vision-led, multi-sensor-fusion motion and pose estimation, localization, and navigation for such a platform.
     Motion and pose estimation is the foundation and key to all missions and applications of a micro quadrotor. Vision-based approaches fall roughly into two classes: external (off-board) vision and onboard vision. External-vision methods are usually more reliable and accurate but work only within a limited region; they suit autonomous take-off and landing and fine control. Onboard-vision methods free the quadrotor from a confined environment, offer greater flexibility, are a current research focus, and apply to a wider range of more complex missions.
     Many practical applications require a micro quadrotor to operate effectively over a fairly large area. Building on the technical groundwork above, we therefore extend the limited-region vision-based pose and motion estimation algorithms to larger regions and study onboard-vision localization and navigation for micro quadrotors. The main work and contributions of this dissertation are as follows:
     (1) We propose and experimentally validate a robust, accurate external-vision pose estimation method for micro quadrotors. Unlike most existing systems, which work only indoors, ours works effectively outdoors. A distinctive feature is that we use the quadrotor's own four rotor motors as visual features; under outdoor illumination their detection and localization are more stable and reliable than colored markers, including LED markers. We further propose EMRPP, a fast and accurate pose estimation algorithm for the coplanar-point problem: it obtains an initial pose with the EPnP algorithm, then refines it with an RPP algorithm modified to exploit the preliminary vision result. Moreover, when the quadrotor's visual features are only partially observed, a case that existing methods do not consider and cannot solve correctly, we combine vision with onboard IMU data and propose the IMU+3P and IMU+2P pose estimation algorithms, which handle incomplete features and yield accurate poses. Combining these methods, the resulting pose estimation system supports autonomous take-off and landing, maneuvering, and other fine control of micro quadrotors.
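The abstract states that known attitude angles from the IMU reduce the number of unknowns in the pose problem, but gives no derivation. A minimal numpy sketch of that idea, assuming a ZYX (yaw-pitch-roll) Euler convention (the thesis's actual convention is not stated here): factoring out the IMU-known roll and pitch leaves only a rotation about the gravity axis (yaw) plus translation, which is why as few as two point correspondences can suffice in the IMU+2P case.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Full rotation in the ZYX (yaw-pitch-roll) convention.
roll, pitch, yaw = 0.1, -0.2, 0.7
R = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

# With roll and pitch known from the IMU, factor them out:
# what remains is a pure rotation about the gravity axis (yaw),
# so only yaw and the 3-DOF translation are left to estimate.
R_yaw_only = R @ (rot_y(pitch) @ rot_x(roll)).T
yaw_recovered = float(np.arctan2(R_yaw_only[1, 0], R_yaw_only[0, 0]))
print(yaw_recovered)  # ≈ 0.7
```

With 4 remaining unknowns (yaw plus translation), two image points contribute 4 constraints, matching the minimal IMU+2P setting described above.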
     (2) For the specific application of landing a micro quadrotor at a designated position, we propose EIRPP, a planar-landmark-based pose estimation algorithm. It feeds the partial attitude angles provided by the IMU into the EPnP algorithm as initial parameter estimates, yielding a more accurate initial pose, and reduces the number of unknowns in the RPP iterative refinement. Experiments show that the algorithm delivers fast, accurate pose estimates.
     (3) Exploiting the flight characteristics of the micro quadrotor and combining onboard vision with the onboard IMU, we propose a fast motion estimation algorithm based on BRISK matching of natural environment features. Current quadrotor hovering is mostly implemented with optical flow, which yields only velocity information and suffers from hover-point drift. Our feature-matching-based fast motion estimation achieves fast, stable spot hovering. For the general case in practical applications, we also propose an onboard IMU+3P pose estimation algorithm that uses natural environment features together with onboard vision and IMU data. It works in both planar and non-planar scenes, solves the initialization and absolute-scale problems of monocular vision, uses the partial attitude angles from the IMU to reduce the dimensionality of the pose estimation problem, and delivers fast, reliable pose estimates.
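BRISK descriptors are binary strings compared by Hamming distance, which is what makes the matching step above fast. The thesis's matching code is not reproduced here; a toy brute-force matcher over synthetic 512-bit descriptors, with an invented distance threshold, illustrates the mechanism:

```python
import random

def hamming(a: int, b: int) -> int:
    # Hamming distance of two binary descriptors:
    # the popcount of their XOR.
    return bin(a ^ b).count("1")

def match(query, train, max_dist=100):
    # Brute-force nearest neighbour under a distance threshold
    # (threshold value is illustrative only).
    matches = []
    for qi, q in enumerate(query):
        dists = [hamming(q, t) for t in train]
        ti = min(range(len(train)), key=lambda i: dists[i])
        if dists[ti] <= max_dist:
            matches.append((qi, ti, dists[ti]))
    return matches

random.seed(0)
train = [random.getrandbits(512) for _ in range(50)]
# Queries: the first 10 train descriptors with one bit flipped,
# simulating small appearance noise between frames.
query = [t ^ (1 << random.randrange(512)) for t in train[:10]]
m = match(query, train)
print(len(m))  # 10: each noisy query matches its source descriptor
```

Unrelated random descriptors differ in about 256 of 512 bits, so the one-bit-flipped queries match their sources unambiguously; real BRISK matching adds scale/rotation-invariant sampling on top of this distance test.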
     (4) Under limited payload, power, and computational resources, we discuss the construction of a micro quadrotor platform and propose a monocular visual localization and navigation scheme fusing an onboard monocular camera, an IMU, and a sonar. The system targets GPS-denied environments without markers or prior knowledge. Unlike most keyframe-based systems, ours uses both keyframes and keypoints. To balance accuracy and speed, keypoints are selected with a GPU-accelerated SURF algorithm. To update keyframes and keypoints promptly and accurately, we propose an update scheme based on fast motion estimation and multilevel motion judgment, which effectively reduces error accumulation over long runs. Monocular vision lacks absolute metric scale; we use the sonar's range information to complete initialization and solve the absolute-scale estimation problem. Finally, feature selection and ranking, RANSAC, local bundle adjustment, and related techniques reduce systematic and accumulated errors, realizing monocular-vision-led localization and navigation for the micro quadrotor.
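The abstract says sonar range data supplies the metric scale that monocular vision lacks. One common way to do this, sketched here with invented toy data rather than the thesis's actual procedure, is a least-squares fit of a single scale factor between the up-to-scale visual altitude and the metric sonar altitude:

```python
import numpy as np

def metric_scale(visual_alt, sonar_alt):
    # Least-squares scale s minimizing ||s * z_vis - z_sonar||:
    # s = <z_vis, z_sonar> / <z_vis, z_vis>.
    z = np.asarray(visual_alt, dtype=float)
    h = np.asarray(sonar_alt, dtype=float)
    return float(z @ h / (z @ z))

# Up-to-scale visual altitudes vs. metric sonar readings
# (toy data: true scale 2.5 with small sonar noise).
z_vis = [0.4, 0.8, 1.2, 1.6]
h_sonar = [1.0, 2.01, 2.99, 4.0]
s = metric_scale(z_vis, h_sonar)
print(round(s, 2))  # ≈ 2.5
```

Multiplying all monocular translations and map points by `s` then puts the whole trajectory in metric units, which is what the initialization step needs.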
Research on quadrotors has attracted much attention in the unmanned aerial vehicle (UAV) field; it has great theoretical significance and broad application prospects. Vision-based motion and pose estimation, localization, and navigation for a quadrotor are fundamental and key research problems. To date, no monocular visual localization and navigation system for quadrotors satisfies practical application requirements. The main work of this dissertation is research on vision-led, multi-sensor-fusion motion and pose estimation, localization, and navigation for a quadrotor with limited payload, power, and computational resources.
     Motion and pose estimation of the quadrotor is the foundation and key for its various missions. Vision-based approaches are mainly either external-vision or onboard-vision algorithms. External-vision algorithms are more reliable and accurate but confined to a limited region; they can be used for autonomous take-off and landing or precision control of a quadrotor. Onboard-vision approaches attract much attention because of their flexibility and freedom from a confined region, and can be used for more varied and complex missions.
     In real applications, a quadrotor may have to work over a large-scale region. Building on our earlier algorithms, we extend the existing limited-region approaches to larger regions and study vision-based localization and navigation for the quadrotor. The main work and contributions of this dissertation are as follows:
     (1) We present an external-vision-based robust and accurate pose estimation system for the quadrotor and validate it in real experiments. Our system works well in outdoor environments, while most existing systems only perform indoors. A key characteristic is that we use only the quadrotor's own four rotors as visual features, which are more reliable and robust outdoors than colored blobs or LEDs. When all four rotors are observed correctly, we apply EMRPP, a fast and accurate pose estimation algorithm for the coplanar-point problem: it obtains an initial pose guess from the non-iterative EPnP algorithm and, using the preliminary position computed in that vision step, refines it with a modified RPP algorithm to obtain fast and accurate pose estimates. When the four rotors are only partially observed, a case most existing approaches do not address and cannot solve correctly, we combine the vision data with the onboard IMU and propose the IMU+3P and IMU+2P algorithms, which resolve this case and produce fast, correct pose estimates. Taking full advantage of these algorithms, the pose estimation system can be used for autonomous take-off and landing or precision control of a quadrotor.
     (2) For pinpoint landing of the quadrotor, we present EIRPP, a landmark-based fast and accurate pose estimation algorithm. It uses onboard vision and IMU data both to obtain an initial guess in EPnP and to improve the RPP refinement, yielding fast and accurate pose estimates.
     (3) Making the best of the quadrotor's omni-directional flight characteristics, we propose a BRISK-based fast motion estimation algorithm that uses natural features and the onboard IMU. For hovering flight, most existing approaches use optical flow, which provides only velocity information and lets the hover point drift over time. Using our BRISK-based fast motion estimation, we realize fast spot hovering of the quadrotor. For general conditions, we present a natural-feature-based fast and accurate pose estimation algorithm for a quadrotor with limited payload and computational resources, using the onboard camera, IMU, and sonar. It works for both planar and non-planar scenes, effectively solves the initialization and metric-scale estimation problems of the monocular system, and, by using IMU data, simplifies the pose estimation problem and obtains faster, more accurate pose estimates.
     (4) Considering the limited payload, power, and computational resources, we discuss the hardware platform construction of the quadrotor and present a multi-sensor-fusion monocular visual localization and navigation system. With an IMU, a sonar, and a down-looking monocular camera as its main sensors, the system works well in GPS-denied and markerless environments. Unlike common keyframe-based systems, ours is based on both keyframes and keypoints. To balance accuracy and computation time, GPU-based SURF performs feature detection and matching. A fast motion estimation algorithm and a multilevel motion judgment rule update the keyframes and keypoints; this benefits hovering or near-hovering conditions and effectively reduces error accumulation. Monocular visual systems usually lack metric scale; using sonar data, we solve the metric-scale estimation problem and obtain a good initialization of the navigation system. Good-feature selection, RANSAC, local bundle adjustment, and other measures reduce error accumulation and optimize the results. In the end, we realize a monocular localization and navigation system for the quadrotor.
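The multilevel motion judgment rule for updating keyframes and keypoints is not spelled out in the abstract. One plausible reading, with purely illustrative thresholds (not the thesis's actual values), classifies inter-frame motion into three levels and updates accordingly:

```python
def motion_level(translation_m, rotation_deg,
                 t_small=0.05, t_large=0.30,
                 r_small=2.0, r_large=10.0):
    """Classify inter-frame motion into three levels.

    Thresholds are invented for illustration; the thesis's
    actual rule and values may differ.
    """
    if translation_m < t_small and rotation_deg < r_small:
        return "hover"      # near-hovering: keep the current keyframe
    if translation_m < t_large and rotation_deg < r_large:
        return "keypoints"  # moderate motion: refresh keypoints only
    return "keyframe"       # large motion: insert a new keyframe

# Example decisions for three motion magnitudes.
for t, r in [(0.01, 0.5), (0.10, 5.0), (0.50, 1.0)]:
    print(t, r, motion_level(t, r))
```

Deferring keyframe insertion while hovering, as the "hover" branch does, is one way such a rule could avoid accumulating redundant keyframes and the drift they bring during long near-stationary flight.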
