Calibration of Omnidirectional Vision Sensors
Abstract
The omnidirectional vision sensors discussed in this dissertation comprise a passive sensor, the omnidirectional camera, and an active sensor, the omnidirectional lidar. Owing to their large field of view, such sensors are widely used for environment perception on autonomous ground platforms, and their special imaging characteristics have made their calibration a long-standing fundamental problem in computer vision.
     This dissertation studies calibration methods for omnidirectional cameras and omnidirectional lidars, focusing on three aspects: calibration of omnidirectional cameras, their self-calibration, and extrinsic calibration of lidar-camera systems. The geometric properties of a checkerboard calibration target on the unit viewing sphere are exploited for accurate omnidirectional camera calibration, improving the precision of the results; the idea of recovering low-rank textures via sparse representation is applied to omnidirectional camera self-calibration, conveniently yielding reliable results; and geometric structure constraints together with motion estimation are employed to solve the joint calibration of an omnidirectional camera and an omnidirectional lidar.
     The main contributions and novelties of this dissertation are:
     1. An omnidirectional camera calibration method based on the unit viewing sphere is proposed to provide accurate 2D-3D correspondences. The method exploits the geometric properties, on the unit viewing sphere, of the two mutually orthogonal sets of parallel lines in a checkerboard target, and derives closed-form solutions for the intrinsic and extrinsic parameters. Compared with most existing calibration methods, the more accurate intrinsic and extrinsic estimates further reduce the uncertainty of the calibration results.
     2. A sparse-representation-based self-calibration method for omnidirectional cameras is proposed, allowing the sensor to be quickly checked with a simple calibration scene. The method self-calibrates the camera from a single image by recovering the low-rank texture of a spatial plane in that image; in accordance with the imaging characteristics of omnidirectional cameras, an effective projection is defined for describing low-rank textures over a large spherical field of view. Compared with most self-calibration methods, this method does not depend on local features such as edges and corners, is less affected by occlusion, blur, and illumination, and produces more reliable results.
     3. Two methods for extrinsic calibration of a lidar-camera system from natural scenes are proposed. Compared with a stereo camera, a system composed of an omnidirectional lidar and a camera offers lower computational complexity, higher accuracy, and less sensitivity to the environment when reconstructing scenes, and extrinsic calibration of the system is a prerequisite for effectively fusing the two sensors' data. The methods define a reference world coordinate frame from a trihedron in the scene, and use trihedral structure constraints and inter-image motion estimation to solve for the poses of the lidar and camera frames relative to the reference frame, from which the extrinsic parameters between the two sensors follow. Compared with most existing methods, these are more flexible: they require no special calibration object, depend little on manual input, and need only two frames of data to obtain fairly accurate results.
The omnidirectional vision sensors discussed in this dissertation include a passive vision sensor, the omnidirectional camera, and an active vision sensor, the omnidirectional lidar. With their large field of view, such sensors are widely used for environment perception on autonomous ground platforms. Due to their special imaging characteristics, their calibration has long been a fundamental problem in computer vision.
     In this dissertation, we study the calibration of omnidirectional cameras and lidars, focusing on three aspects: calibration of omnidirectional cameras, their self-calibration, and extrinsic calibration of a lidar-camera system. To achieve precise omnidirectional camera calibration, we propose a robust method based on the viewing sphere that improves the accuracy of the results. We apply the idea of sparse-representation-based low-rank texture recovery to the self-calibration of omnidirectional cameras and obtain reliable results. Geometric constraints and motion estimation are adopted to solve the joint calibration of an omnidirectional lidar and a camera.
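The viewing sphere referred to above is that of the standard unified central projection model, in which a 3D point is first projected onto a unit sphere and then re-projected perspectively from a center shifted along the optical axis by a mirror parameter ξ. As a minimal illustrative sketch (the intrinsic matrix `K` and the value of `xi` below are hypothetical placeholders, not values from the dissertation):

```python
import numpy as np

def project_unified(X, xi, K):
    """Unified sphere model: project X onto the unit viewing sphere,
    then perspectively from a center shifted by xi along the optical
    axis, then apply the intrinsic matrix K."""
    Xs = X / np.linalg.norm(X)                 # point on the unit sphere
    m = np.array([Xs[0], Xs[1], Xs[2] + xi])   # shift projection center
    return K @ (m / m[2])                      # homogeneous pixel coords

# hypothetical intrinsics for illustration only
K = np.array([[300.0,   0.0, 320.0],
              [  0.0, 300.0, 240.0],
              [  0.0,   0.0,   1.0]])
p = project_unified(np.array([0.0, 0.0, 1.0]), xi=1.0, K=K)
```

For `xi = 0` this reduces to ordinary perspective projection; larger values of `xi` model the stronger distortion of fisheye and catadioptric cameras.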
     The main contributions are outlined as follows:
     1. To provide accurate correspondences between image and spatial information, we propose an omnidirectional camera calibration method based on the viewing sphere. The geometric properties, on the viewing sphere, of two mutually orthogonal sets of parallel lines yield a closed-form solution for the intrinsic and extrinsic parameters. Benefiting from the relatively precise estimation of these parameters, the method further reduces the uncertainty of the calibration results compared with most state-of-the-art methods.
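The geometric property at work can be sketched as follows: a space line projects to a great circle on the viewing sphere, and the normals of the great circles of parallel lines are all perpendicular to the lines' common direction, so that direction (the vanishing direction) is the right singular vector of the stacked normals with the smallest singular value. A hedged sketch on synthetic normals, not the dissertation's full closed-form derivation:

```python
import numpy as np

def vanishing_direction(normals):
    """Direction most nearly orthogonal to every great-circle normal:
    the right singular vector with the smallest singular value."""
    _, _, Vt = np.linalg.svd(np.asarray(normals))
    d = Vt[-1]
    return d / np.linalg.norm(d)

# synthetic normals: the first set is orthogonal to the x-axis, the
# second to the y-axis, mimicking the two orthogonal line families
# of a checkerboard target
d1 = vanishing_direction([[0.0, 1.0, 0.0], [0.0, 0.3, 0.954]])
d2 = vanishing_direction([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
```

Because the checkerboard's two line families are orthogonal, the two recovered vanishing directions must satisfy `d1 @ d2 ≈ 0`, which is the kind of constraint that pins down the intrinsic parameters.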
     2. We propose an omnidirectional camera self-calibration method based on sparse representation, with which the sensor can be quickly calibrated in a simple scene. The method calibrates the camera by recovering the low-rank texture in a single image. Furthermore, we define a projection function for spherical, large-field-of-view low-rank textures that matches the imaging characteristics of omnidirectional cameras. Unlike most self-calibration methods, ours does not rely on low-level features such as edges and corners, and is only weakly affected by external factors such as illumination and shadow, so more reliable results can be obtained.
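The low-rank texture recovery underlying this self-calibration rests on decomposing an observed matrix into a low-rank part plus a sparse error, min ||A||_* + λ||E||_1 subject to D = A + E. A minimal inexact augmented-Lagrangian sketch of that decomposition follows; the parameter choices are common defaults rather than values from the dissertation, and the full method additionally optimizes the camera parameters that warp D:

```python
import numpy as np

def svt(M, tau):
    # singular value thresholding: proximal operator of the nuclear norm
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(M, tau):
    # soft thresholding: proximal operator of the l1 norm
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca(D, n_iter=100):
    """Inexact augmented-Lagrangian sketch of D = A (low rank) + E (sparse)."""
    lam = 1.0 / np.sqrt(max(D.shape))
    Y = np.zeros_like(D)
    E = np.zeros_like(D)
    mu = 1.25 / np.linalg.norm(D, 2)
    for _ in range(n_iter):
        A = svt(D - E + Y / mu, 1.0 / mu)
        E = shrink(D - A + Y / mu, lam / mu)
        Y = Y + mu * (D - A - E)
        mu = min(mu * 1.5, 1e7)
    return A, E

# synthetic example: a rank-1 "texture" corrupted by two sparse spikes
rng = np.random.default_rng(0)
L = rng.standard_normal((20, 1)) @ rng.standard_normal((1, 20))
E0 = np.zeros((20, 20))
E0[3, 4], E0[11, 7] = 5.0, -7.0
D = L + E0
A, E = rpca(D)
```

The recovered `A` is (nearly) the clean rank-1 matrix `L`; in the self-calibration setting, the camera parameters are the ones that make the unwarped image patch admit such a decomposition.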
     3. We put forward two methods for extrinsic calibration of a lidar-camera system based on natural scenes. Compared with a stereo camera system, an omnidirectional lidar-camera system offers low computational complexity, high accuracy, and robustness to the environment when reconstructing 3D scenes. To fuse the lidar and camera data effectively, the extrinsic parameters of the system must first be calibrated. By defining a reference world coordinate frame from a trihedron in the scene, we use geometric constraints or matched features of the trihedron to estimate the rigid motions between the lidar or camera frames and the world frame; once these motions are known, the extrinsic parameters between the lidar and camera follow directly. The method is flexible and needs no specific calibration object; furthermore, it depends little on manual input, and only two frames of data suffice to obtain reliable results.
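The chaining step at the heart of both methods can be sketched directly: once the world-to-lidar and world-to-camera rigid transforms are known, the lidar-to-camera extrinsics follow by composition. A small sketch, where the coordinate conventions (X_l = R_lw X_w + t_lw, X_c = R_cw X_w + t_cw) are assumptions made for illustration:

```python
import numpy as np

def lidar_to_camera_extrinsics(R_lw, t_lw, R_cw, t_cw):
    """Chain world->lidar and world->camera transforms into the
    lidar->camera extrinsics: X_c = R_cl X_l + t_cl."""
    R_cl = R_cw @ R_lw.T
    t_cl = t_cw - R_cl @ t_lw
    return R_cl, t_cl

def rotz(a):
    # rotation about the z-axis, used only to build a toy example
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# toy poses of both sensors relative to a trihedron-defined world frame
R_lw, t_lw = rotz(0.3), np.array([1.0, 2.0, 3.0])
R_cw, t_cw = rotz(-0.5), np.array([0.0, 1.0, 0.0])
R_cl, t_cl = lidar_to_camera_extrinsics(R_lw, t_lw, R_cw, t_cw)

# any world point must map consistently through either path
X_w = np.array([1.0, 0.0, 2.0])
X_l = R_lw @ X_w + t_lw
X_c = R_cw @ X_w + t_cw
```

The consistency check `R_cl @ X_l + t_cl == X_c` holds for every point, which is what makes the trihedron-based reference frame sufficient to recover the two-sensor extrinsics.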
