High-precision visual navigation for space debris based on satellite formation (基于卫星编队的空间碎片视觉高精度导航方法)
  • Authors: YANG Bo (杨博); WANG Haofan (王浩帆); MIAO Jun (苗峻); ZHAO Xiaotao (赵晓涛)
  • Affiliations: School of Astronautics, Beihang University; Beijing Institute of Control Engineering
  • Keywords: space debris; satellite formation; stereo vision; observability; high-precision navigation
  • Journal: Chinese Space Science and Technology (中国空间科学技术); CNKI journal code: ZGKJ
  • Online publication date: 2018-11-20
  • Year: 2019
  • Volume/Issue: Vol. 39, No. 230 (Issue 1)
  • Language: Chinese
  • Pages: 44-52 (9 pages)
  • Article ID: ZGKJ201901007
  • CN: 11-1859/V
Abstract
Visual methods are widely used for the navigation of non-cooperative targets such as space debris, with which the observing satellites have no communication link. To address the position uncertainty caused by the aberration errors of visual sensors during observation, a stereo-vision navigation method based on satellite formation is proposed. First, a long-baseline stereo visual sensor is constructed from the satellite formation, and the observability of the system is analyzed and verified with the Fisher information matrix. Second, an error analysis of the stereo sensor is performed, and the fusion of the measurements from multiple satellites is optimized by arranging the optimal parallax angle. Finally, the proposed method is applied to the navigation of space debris in a simulation study. The results show that the formation-based visual navigation method significantly reduces the observation error, reaching an accuracy on the order of 0.1 m, while the formation configuration remains simple and easy to implement in engineering practice.
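The abstract compresses several steps: building a long-baseline stereo sensor from the formation, checking observability with the Fisher information matrix, and choosing the parallax angle that makes the fused estimate most accurate. The sketch below is not the authors' implementation; it is a minimal illustration of those standard bearings-only ideas, in which two formation satellites each measure a line-of-sight (LOS) direction to a debris object, the position is recovered by least-squares triangulation, and the Monte Carlo error is compared with the Cramér-Rao bound from the Fisher information matrix as the baseline grows. The debris range (50 km), LOS noise (1e-5 rad), and baseline values are purely hypothetical.

```python
# Minimal numerical sketch (illustrative only, not the paper's algorithm):
# long-baseline stereo triangulation from a two-satellite formation, plus the
# Fisher-information / Cramer-Rao view of how accuracy depends on parallax.
import numpy as np

def triangulate(sat_pos, los_dirs):
    """Least-squares intersection of LOS rays p_i + t*u_i:
    minimize sum_i ||(I - u_i u_i^T)(x - p_i)||^2 over x."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for p, u in zip(sat_pos, los_dirs):
        P = np.eye(3) - np.outer(u, u)          # projector orthogonal to the ray
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

def fisher_info(sat_pos, target, sigma):
    """Fisher information of the target position for bearings-only measurements
    with isotropic angular noise sigma: J = sum_i (I - u_i u_i^T) / (sigma^2 r_i^2)."""
    J = np.zeros((3, 3))
    for p in sat_pos:
        d = target - p
        r = np.linalg.norm(d)
        u = d / r
        J += (np.eye(3) - np.outer(u, u)) / (sigma**2 * r**2)
    return J

rng = np.random.default_rng(0)
debris = np.array([0.0, 50_000.0, 0.0])          # debris ~50 km away (assumed)
sigma = 1e-5                                     # LOS angular noise, rad (assumed)

for half_baseline in (50.0, 500.0, 5_000.0):     # metres; widening the formation
    sats = [np.array([-half_baseline, 0.0, 0.0]),
            np.array([+half_baseline, 0.0, 0.0])]

    # Cramer-Rao lower bound from the Fisher information (full rank <=> observable).
    J = fisher_info(sats, debris, sigma)
    crlb_rms = np.sqrt(np.trace(np.linalg.inv(J)))

    # Monte Carlo triangulation with noisy LOS directions.
    errs = []
    for _ in range(500):
        los = []
        for p in sats:
            u = (debris - p) / np.linalg.norm(debris - p)
            n = rng.normal(scale=sigma, size=3)
            u = u + n - (n @ u) * u              # keep the perturbation transverse
            los.append(u / np.linalg.norm(u))
        errs.append(np.linalg.norm(triangulate(sats, los) - debris))

    parallax = np.degrees(2 * np.arctan(half_baseline / np.linalg.norm(debris)))
    print(f"parallax {parallax:7.3f} deg | CRLB {crlb_rms:8.2f} m | "
          f"Monte-Carlo RMS {np.sqrt(np.mean(np.square(errs))):8.2f} m")
```

With a single observer the Fisher information matrix is rank-deficient (the range direction is unobservable); adding a second formation satellite with a non-zero parallax angle makes it full rank, and widening the baseline raises its smallest eigenvalue, which is the geometric reason a long-baseline formation can push the position error down, consistent with the abstract's observability and accuracy claims.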
