Research on Depth Estimation in Unmanned Vehicle Using Fisheye Binocular Lens (无人车鱼眼双目深度提取研究)
  • Authors: YUWEN Xuan (宇文旋); ZHAO Mingming (赵明明); CHEN Long (陈龙)
  • Keywords: unmanned vehicle; binocular vision; fisheye binocular vision; image interpolation; environmental perception
  • Journal: 中国机械工程 (China Mechanical Engineering)
  • Journal code: ZGJX
  • Affiliations: School of Mechanical and Electrical Engineering, University of Electronic Science and Technology of China; School of Data and Computer Science, Sun Yat-sen University
  • Publication date: 2019-07-08
  • Year: 2019
  • Volume/Issue: Vol. 30, 2019 No. 13 (cumulative No. 517)
  • Funding: National Natural Science Foundation of China (Grant No. 61773414)
  • Language: Chinese
  • Record ID: ZGJX201913010
  • Pages: 69-76 (8 pages)
  • CN: 42-1294/TH
Abstract
The model of a fisheye binocular environmental perception system for unmanned vehicles was studied, and the conversion relationship between the fisheye binocular model and the pinhole binocular model was analyzed. To cope with the large distortion of fisheye images, a cosine-similarity interpolation algorithm was proposed for rectifying fisheye camera images: the algorithm uses cosine similarity to measure how close the point to be interpolated is to the surrounding points whose grey values are known, and thereby effectively fills the pixels left missing by reprojection. Finally, the model and the algorithm were verified in scene depth extraction experiments. The results show that, compared with the traditional binocular model, the fisheye binocular model obtains depth information over a wider range, and that for the interpolation step of the fisheye binocular model the proposed algorithm handles interpolation details better and thus extracts scene depth information more effectively.
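The abstract describes two technical steps: remapping the fisheye images to a pinhole-equivalent geometry, and filling the pixels that the remapping leaves empty with a cosine-similarity interpolation. Under the commonly used equidistant fisheye model (an assumption here; the paper may use a different projection, e.g. the generic model of reference [2]), a ray at angle θ from the optical axis lands at image radius r_f = f·θ, while a pinhole camera places it at r_p = f·tan θ, so rectification maps a fisheye radius r_f to a pinhole radius f·tan(r_f/f) and inevitably leaves gaps between the stretched pixels. The sketch below is one plausible reading of the cosine-similarity interpolation, not the authors' implementation: every missing pixel is estimated from its known neighbours, each neighbour weighted by the cosine similarity between intensity vectors sampled around the missing pixel and around that neighbour. The function names, the window radius, and the plain-mean fallback are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's implementation) of filling
# pixels left empty by fisheye-to-pinhole reprojection with a
# cosine-similarity weighted interpolation.
import numpy as np


def cosine_similarity(a, b, eps=1e-8):
    """Cosine of the angle between two 1-D intensity vectors."""
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    if na < eps or nb < eps:
        return 0.0
    return float(np.dot(a, b) / (na * nb))


def cosine_fill(img, mask, radius=2):
    """Fill the pixels flagged in ``mask`` (True = missing).

    For each missing pixel p, every known neighbour q inside a
    (2*radius+1)^2 window is weighted by the cosine similarity between
    the intensities sampled around p and around q at offsets that are
    known for both; the filled value is the weighted mean of the
    neighbour intensities (plain mean as a fallback).
    """
    out = img.astype(float)
    h, w = img.shape
    known = ~mask
    offsets = [(dy, dx)
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if (dy, dx) != (0, 0)]

    def known_offsets(y, x):
        """Offsets whose target pixel lies inside the image and is not missing."""
        return [(dy, dx) for dy, dx in offsets
                if 0 <= y + dy < h and 0 <= x + dx < w and known[y + dy, x + dx]]

    for y, x in zip(*np.nonzero(mask)):
        p_offs = known_offsets(y, x)               # known pixels around p
        num = den = 0.0
        for dy, dx in p_offs:                      # each known neighbour q
            qy, qx = y + dy, x + dx
            # offsets known around BOTH p and q define the two feature vectors
            common = [o for o in p_offs
                      if 0 <= qy + o[0] < h and 0 <= qx + o[1] < w
                      and known[qy + o[0], qx + o[1]]]
            if not common:
                continue
            vp = np.array([out[y + o[0], x + o[1]] for o in common])
            vq = np.array([out[qy + o[0], qx + o[1]] for o in common])
            wgt = max(cosine_similarity(vp, vq), 0.0)
            num += wgt * out[qy, qx]
            den += wgt
        if den > 0.0:
            out[y, x] = num / den
        elif p_offs:                               # no usable similarity: plain mean
            out[y, x] = np.mean([out[y + o[0], x + o[1]] for o in p_offs])
    return out
```

For instance, if the remapped image marks holes with a negative sentinel value, `filled = cosine_fill(remapped, remapped < 0)` would fill them before stereo matching; this calling convention is likewise an assumption made for illustration.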
References
[1] YING Xianghua, HU Zhanyi. Fisheye Lens Distortion Correction Using Spherical Perspective Projection Constraint[J]. Chinese Journal of Computers, 2003, 26(12): 1702-1708.
    [2] KANNALA J, BRANDT S S. A Generic Camera Model and Calibration Method for Conventional, Wide-angle, and Fisheye Lenses[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, 28(8): 1335-1340.
    [3] CHU Guangyu. Study on Fisheye Binocular Vision Imaging and Positioning Technology[D]. Qinhuangdao: Yanshan University, 2016.
    [4] XIONG Wenli. The Research of Stereo Matching Algorithms for the Binocular Vision Based on Fisheye Lens[D]. Qinhuangdao: Yanshan University, 2016.
    [5] HENG L, LEE G H, SIZOV A, et al. Real-time Direct Dense Matching on Fisheye Images Using Plane-Sweeping Stereo[C]//International Conference on 3D Vision. Tokyo, 2014: 57-64.
    [6] ZHAO H, AGGARWAL J K. 3D Reconstruction of an Urban Scene from Synthetic Fisheye Images[C]//IEEE Southwest Symposium on Image Analysis and Interpretation. Austin, 2000: 219.
    [7] BOUTTEAU R, SAVATIER X, BONARDI F, et al. Road-line Detection and 3D Reconstruction Using Fisheye Cameras[C]//International IEEE Conference on Intelligent Transportation Systems. Netherlands, 2013: 1083-1088.
    [8] CAO Zuoliang, HALL E L. Omnivision-based Autonomous Mobile Robotic Platform[C]//Proceedings of SPIE: the International Society for Optical Engineering. Boston, 2008: 51-60.
    [9] DRULEA M, SZAKATS I, VATAVU A, et al. Omnidirectional Stereo Vision Using Fisheye Lenses[C]//IEEE International Conference on Intelligent Computer Communication and Processing. Cluj-Napoca, 2014: 251-258.
    [10] ZHANG Yujin. Image Engineering (Volume 2)[M]. Beijing: Tsinghua University Press, 2012.
