Research on Motion Parameter Measurement Techniques Based on Linear Array Optical Images and Their Applications
Abstract
Motion parameter measurement is an important problem in computer vision, close-range photogrammetry, remote sensing, and related fields, and it is widely needed in the detection, tracking, and recognition of all kinds of moving targets. Measurement methods based on optical images require only simple equipment, achieve high accuracy, and are underpinned by mature photogrammetry and image analysis techniques, giving them advantages over radar and other measurement methods that cannot be replaced.
     At present, almost all optical-image-based analysis of target motion uses images acquired by area-array sensors. In recent years, with the rapid development of linear-array CCD and CMOS sensors, research on linear array optical images has steadily deepened; in particular, almost all remote sensing and mapping satellites now acquire data with push-broom linear array sensors, so the application and study of linear array optical images has become a research hotspot. Owing to the unique scanning imaging principle of the linear array sensor, a single linear array image records both the temporal and the spatial information of a target's motion, which makes it possible to measure and estimate the target's motion parameters from such an image. Research in this direction is still scarce, and the advantages and potential of linear array optical images for moving-target analysis have yet to be demonstrated.
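     To make the coupling of time and space concrete: in a line-scan (push-broom) image each successive scan occupies one column (or row) of the image, so the column index of an image point also encodes when that point was observed. Under the simplifying assumption of a constant line rate (an illustrative assumption, not a value taken from this work), the mapping is

t(c) = t_0 + c / f_L ,

where c is the column index, t_0 is the acquisition time of the first scan line, and f_L is the number of lines scanned per second. It is this per-column time stamp that lets a single image constrain velocity as well as position.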
     Starting from the imaging principle and imaging model of the linear array sensor, this dissertation studies techniques for measuring target motion parameters from linear array optical images and their applications. The main research results are as follows:
     (1) The imaging model of the linear array sensor and the calibration methods for its parameters are systematically analyzed, providing a theoretical foundation for analyzing moving targets with linear array optical images. Drawing on the imaging model of the area-array camera, the dissertation studies the imaging models of the linear array sensor for stationary imaging and for imaging in motion, and it surveys and studies the calibration methods for the sensor's interior and exterior orientation elements from the two perspectives of stationary imaging and scanning imaging;
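     For orientation, a commonly used form of the push-broom imaging model applies the collinearity equations line by line with time-dependent exterior orientation; the version below is a standard textbook formulation given only as a sketch of the kind of model analyzed in (1), not the dissertation's exact equations:

x(t) = -f \frac{r_{11}(t)(X - X_S(t)) + r_{12}(t)(Y - Y_S(t)) + r_{13}(t)(Z - Z_S(t))}{r_{31}(t)(X - X_S(t)) + r_{32}(t)(Y - Y_S(t)) + r_{33}(t)(Z - Z_S(t))} \approx 0

y(t) = -f \frac{r_{21}(t)(X - X_S(t)) + r_{22}(t)(Y - Y_S(t)) + r_{23}(t)(Z - Z_S(t))}{r_{31}(t)(X - X_S(t)) + r_{32}(t)(Y - Y_S(t)) + r_{33}(t)(Z - Z_S(t))}

Here f is the focal length, (X, Y, Z) is the object point, (X_S(t), Y_S(t), Z_S(t)) and R(t) = [r_{ij}(t)] are the projection centre and rotation matrix of the sensor at scan time t, y(t) is the image coordinate along the sensor line, and the condition x(t) \approx 0 expresses that a point is imaged only when it lies in the instantaneous scan plane.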
     (2) A feature-point-based principle for measuring target motion parameters from linear array optical images is proposed, and application algorithms for measuring the motion parameters of vehicles and airplanes are designed. The principle requires at least five feature points on the target whose corresponding image points can be identified in the linear array image of the moving target. Using the imaging geometry relating the spatial positions of these feature points to their image points, a system of equations containing the target's motion parameters is established, and the motion parameters are obtained by solving it. The procedure is as follows: first, nine unknown parameters are introduced, namely the position (3) and attitude (3) of the linear array sensor when imaging the moving target and the target's three-dimensional velocity (3); assuming the target moves in a straight line at constant speed while passing the linear array sensor, the position of each feature point at the moment it is scanned can be expressed in terms of these nine unknowns and the time information associated with its image point; then, for each pair of a feature point and its image point, the sensor imaging model yields one equation in the nine unknowns in the x direction and one in the y direction, giving a system of no fewer than ten equations; finally, the system is solved by nonlinear least squares to obtain the target's three-dimensional motion parameters. Based on this principle, two application algorithms are designed, for vehicle and airplane targets respectively. The first measures vehicle motion parameters from a linear array image of the license plate: because the geometric size of the plate is known and the plate is generally perpendicular to the direction of travel, the number of unknowns is reduced, and the four corners of the plate together with their image points suffice to establish, from the sensor imaging model, a system of equations in the vehicle's position, direction of motion, and speed, whose solution gives an accurate measurement of the driving direction and speed. The second obtains the motion parameters of an airplane from a single high-resolution push-broom optical satellite image: starting from the sensor imaging model, the nose, tail, and wing-tip points and their image points are used to build a system of equations in the airplane's position, direction of motion, and speed, whose solution yields the airplane's motion parameters. This algorithm overcomes the limitation of existing methods that rely on the imaging delay between multiple sensors to estimate motion, and it can handle linear array images acquired by a single sensor (such as WorldView-1), giving it broader applicability;
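     The estimation step in (2) amounts to a small nonlinear least-squares problem: roughly ten imaging equations in nine unknowns. The Python code below is a minimal sketch under stated assumptions (a simple time-parameterized pinhole-style projection standing in for the line-scan imaging model, Euler-angle attitude, synthetic data); it is not the dissertation's implementation, and every name in it (residuals, pts, obs, times, truth) is hypothetical.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, pts, obs, times, f):
    """Stack the x- and y-direction imaging residuals of all feature points.

    params : [X0, Y0, Z0, omega, phi, kappa, vx, vy, vz] -- the nine unknowns
    pts    : (N, 3) feature-point coordinates on the target at time zero
    obs    : (N, 2) observed image coordinates of the corresponding points
    times  : (N,)   scan time of each point, read from its image column
    f      : focal length, assumed known from calibration
    """
    cam = params[0:3]                                        # sensor position
    R = Rotation.from_euler("xyz", params[3:6]).as_matrix()  # sensor attitude
    v = params[6:9]                                          # target velocity
    res = []
    for p, (xo, yo), t in zip(pts, obs, times):
        P = p + v * t                   # uniform straight-line motion assumption
        d = R @ (P - cam)               # feature point in the sensor frame
        res += [-f * d[0] / d[2] - xo,  # x-direction imaging equation
                -f * d[1] / d[2] - yo]  # y-direction imaging equation
    return np.asarray(res)

# Synthetic demonstration with five hypothetical feature points (10 equations).
rng = np.random.default_rng(0)
truth = np.array([0.0, 0.0, 50.0, 0.0, 0.0, 0.0, 5.0, 0.5, 0.0])
pts = rng.uniform(-1.0, 1.0, (5, 3))
times = np.linspace(0.0, 0.2, 5)
f = 1000.0
obs = residuals(truth, pts, np.zeros((5, 2)), times, f).reshape(5, 2)
sol = least_squares(residuals, x0=truth + 0.1, args=(pts, obs, times, f))
print(sol.x)  # estimated position, attitude and velocity parameters

The point of the sketch is only the structure of the problem: each feature point contributes two equations through its known scan time, and scipy.optimize.least_squares performs the nonlinear solution.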
     (3) A principle for measuring target motion parameters from linear array optical images based on a three-dimensional geometric model is proposed, and measurement algorithms for ball targets and range projectile targets are designed on this basis. For balls, bodies of revolution, and similar targets it is difficult to find enough feature points whose corresponding image points can be distinguished in the linear array image, so the feature-point-based method is hard to apply. To solve this problem, the dissertation recasts the measurement of target motion parameters in a linear array image as a model-based optimal image matching problem. The main idea is as follows: first, a three-dimensional digital model of the target is constructed from its known geometric parameters; then, a theoretical simulated image of the digital model is generated by computer simulation for given sensor and motion parameters; finally, an optimization model is built that takes the target's motion parameters as input, and by adjusting the input motion parameters the difference in contour gradient between the simulated image and the target's actual image is progressively reduced, with the motion parameters at which the simulated and real images are optimally matched taken as the final solution. Based on this principle, two application algorithms are designed, for ball targets and for range projectile targets. The first measures the position and velocity of a ball target from a linear array optical image: following the proposed principle, a ball-model-based optimization model for parameter measurement is constructed and solved to obtain an accurate measurement of the ball's motion parameters. The second responds to the current trend of digitizing the traditional film-based streak (slit) photography technique: a digital streak measurement algorithm is proposed that uses a line-scan camera to measure the speed, attitude, angle of attack, and spin rate of a projectile, in which a single line-scan camera combined with a mirror device performs a stereo intersection measurement of the moving projectile, recording line-scan views of the projectile from two directions in a single image. Following the proposed principle, an optimization model for stereo measurement of the motion parameters is constructed from the projectile's digital model, and its solution gives accurate measurements of the projectile's motion parameters, greatly improving measurement accuracy and stability.
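     As an illustration of the model-based matching idea in (3), the Python sketch below shows the optimization loop. It assumes a hypothetical render_target(motion_params) routine that simulates the line-scan image of the 3D digital model for given motion parameters (the sensor parameters are taken as fixed), uses a Sobel gradient magnitude as a rough stand-in for the contour gradient, and uses a derivative-free optimizer as a stand-in for whatever solver the dissertation actually employs.

import numpy as np
from scipy.ndimage import sobel
from scipy.optimize import minimize

def contour_gradient(img):
    """Gradient-magnitude map used here as a simple proxy for the contour gradient."""
    return np.hypot(sobel(img, axis=0), sobel(img, axis=1))

def matching_cost(motion_params, real_img, render_target):
    """Discrepancy between the simulated and the real image in gradient space."""
    sim_img = render_target(motion_params)   # simulate line-scan imaging of the model
    diff = contour_gradient(sim_img) - contour_gradient(real_img)
    return float(np.sum(diff ** 2))

def estimate_motion(real_img, render_target, params0):
    """Adjust the motion-parameter inputs until the simulated image best matches
    the real one; the minimizer's solution is taken as the measurement."""
    result = minimize(matching_cost, params0,
                      args=(real_img, render_target),
                      method="Nelder-Mead")   # derivative-free search
    return result.x

The same loop applies to both applications described above: only the 3D digital model behind render_target changes (a sphere built from the ball's prior geometry for the ball target, a projectile model plus the mirror geometry for the digital streak measurement).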
Motion parameter measurement is an important problem in fields such as computer vision, close-range photogrammetry, and remote sensing, and it is widely needed in target detection, tracking, and recognition. Compared with other measurement techniques, such as radar, optical-image-based techniques have distinct advantages in the simplicity of the equipment and the accuracy of the measurement, and they are supported theoretically by mature photogrammetric and image processing techniques.
     Currently, almost all optical-image-based methods for motion estimation use images collected by area-array sensors. With the rapid development of linear CCD and CMOS sensors, research on linear array images has deepened in recent years. Moreover, almost all optical remote sensing and mapping satellites collect image data with linear array sensors, which makes the application and study of linear array images a new highlight. Because of its unique scanning imaging principle, a linear array image simultaneously records the temporal and spatial information of a target's motion, which makes it possible to measure and estimate the target's motion parameters from a single image. Until now, few works have focused on this subject, and the advantages and potential of linear array optical images for target motion analysis have not attracted enough attention.
     In this dissertation, techniques for estimating target motion parameters from linear array images and their applications are studied, starting from the imaging principle and imaging model of the linear array sensor. The main contributions are listed as follows:
     (1) The imaging model of the linear array sensor and the calibration methods for its parameters are analyzed, which provides a theoretical foundation for moving-target analysis in linear optical images. Drawing on the imaging model of the area-array sensor, the dissertation studies the stationary and moving imaging models of the linear array sensor respectively. The calibration methods for the model parameters are summarized and studied from the viewpoints of stationary and scanning imaging.
     (2) A key-point-based principle for measuring motion parameters is presented, and application algorithms for vehicles and airplanes are designed. At least five key points whose corresponding image points can be located on the linear array image are needed. A system of equations containing the motion parameters is established from the imaging relationships between the spatial positions of the key points and the coordinates of their image points, and the motion parameters are obtained by solving this system. The procedure is as follows: first, 9 unknown parameters are set up, namely 3 position parameters, 3 attitude parameters, and 3 velocity parameters; assuming the target moves across the linear array sensor at a constant velocity, the position of each key point at its imaging time can be expressed in terms of the 9 unknowns and the temporal information carried by the linear array image; second, each pair of a key point and its image point yields, according to the imaging model, one equation in the 9 unknowns in the x direction and one in the y direction, so that at least 10 equations are obtained; finally, the 3D motion parameters are obtained by solving this nonlinear system in the least-squares sense. Based on this principle, two application algorithms, for vehicles and airplanes, are designed. A motion parameter measurement algorithm using the linear array image of the vehicle license plate is presented: because the size of the license plate is known and the velocity direction is generally perpendicular to the license plate plane, the four vertices of the plate and their image points suffice to establish a system of equations in the position, direction, and speed parameters, from which the vehicle's moving direction and speed are estimated. A motion measurement algorithm for airplanes from a single high-resolution push-broom optical satellite image is also presented: based on the imaging model, the motion parameters of the airplane are obtained by solving the system of equations established from the nose, tail, and wing-tip points and their image points, with the position, direction, and speed parameters as unknowns. The algorithm overcomes the limitation of existing methods that depend on the time lag between multiple sensors on the satellite platform, and it works with linear array images collected by a single sensor (such as WorldView-1), so it has wider applicability;
     (3) A 3D-geometric-model-based principle for measuring motion parameters is presented, and measurement algorithms for ball and projectile targets based on this principle are designed. The key-point-based method is difficult to apply to targets such as balls and axisymmetric bodies, for which enough key points with identifiable image points on the linear array image cannot be found. To solve this problem, the dissertation transforms the measurement of the motion parameters into a model-based image optimization matching problem. The main scheme is as follows: first, a 3D digital model of the target is established from its known geometry; second, a theoretical simulated image of the 3D digital model is generated for preset motion and sensor parameters; finally, an optimization model is established that takes the motion parameters as input values, the difference between the contour gradients of the simulated image and the real image is reduced by adjusting the input motion parameters, and the parameters that make the two images match best are taken as the solution. Based on this principle, two application algorithms, for ball and projectile targets, are designed. A new algorithm for estimating the position and velocity of a ball target is presented: it establishes a ball-model-based optimization model according to the principle and obtains an accurate estimate of the ball's motion from the solution of the optimization. In line with the trend toward digitizing the traditional film streak technique, a digital streak algorithm is proposed that uses a line-scan camera to measure the speed, attitude, angle of attack, and spin rate of a projectile. The algorithm performs a stereo measurement with a single camera and a mirror, so that the projectile is recorded from two directions in one linear array image. Based on the principle, an optimization model is established using the 3D digital model of the projectile and the imaging model of the measurement system, and the motion parameters are obtained from the solution of the optimization model, which greatly improves the accuracy and stability of the measurement.
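     Stated as an equation, the matching criterion described in (3) can be summarized, given here only as one plausible formalization of the stated idea and not the dissertation's exact objective, as choosing the motion parameters p that minimize the contour-gradient discrepancy between the simulated and the real image:

\hat{p} = \arg\min_{p} \sum_{x \in \Omega} \left\| \nabla I_{\mathrm{sim}}(x; p) - \nabla I_{\mathrm{real}}(x) \right\|^{2}

where I_{\mathrm{sim}}(x; p) is the simulated line-scan image of the 3D digital model under motion parameters p (with the sensor parameters fixed by calibration), I_{\mathrm{real}} is the actual image, and \Omega is the image region around the target contour.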
