Semantic Key-point Extraction Based Space Target Pose Estimation from Optical Images via an Hourglass Network

Abstract: Pose estimation of space targets is an essential prerequisite for a wide range of space missions. For pose estimation from space-based optical imagery, the key step is to quickly and accurately establish the mapping between two-dimensional feature points in the observed image and the three-dimensional structure of the target. Traditional methods decompose this task into two sequential steps: feature extraction followed by feature association. In real optical observation scenarios, however, highly dynamic illumination changes and the relatively high-speed motion of the target significantly degrade the reliability of image feature extraction, which lowers the correctness of the subsequent feature matching and ultimately reduces the accuracy of the pose estimate. To address this problem, this paper proposes a space target pose estimation method based on semantic key-point extraction from optical images. An Hourglass network is used to extract key points carrying semantic information in an end-to-end manner, directly establishing the association between two-dimensional feature points in the image and the three-dimensional structure of the target; the target pose is then solved with the EPnP algorithm. Experimental results show that the proposed method achieves a good balance between accuracy and efficiency: its minimum pose estimation error on a simulated data set is 0.83°, and under data degradation its average error remains better than that of traditional methods.
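To make the two-stage pipeline concrete, the sketch below illustrates the final pose-solving step described in the abstract: once the network has produced *semantic* key points, the 2D-3D correspondence is known by construction, so EPnP can be applied directly. This is a minimal illustration, not the paper's implementation; all key-point coordinates and camera intrinsics are hypothetical placeholders, and OpenCV's `SOLVEPNP_EPNP` flag stands in for the EPnP solver.

```python
import cv2
import numpy as np

# Hypothetical 3D semantic key points in the target body frame (metres).
object_points = np.array([
    [0.0, 0.0, 0.0],   # e.g. body centre
    [1.0, 0.0, 0.0],   # e.g. solar-panel tip
    [0.0, 1.0, 0.0],   # e.g. antenna base
    [0.0, 0.0, 1.0],   # e.g. docking port
    [1.0, 1.0, 0.5],
], dtype=np.float32)

# Matching 2D detections in the optical image (pixels), e.g. the peaks of
# the per-key-point heat maps predicted by the Hourglass network.
image_points = np.array([
    [320.0, 240.0],
    [410.0, 235.0],
    [325.0, 150.0],
    [318.0, 300.0],
    [415.0, 160.0],
], dtype=np.float32)

# Hypothetical pinhole intrinsics; a real system would use calibrated values.
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
], dtype=np.float32)
dist_coeffs = np.zeros(4, dtype=np.float32)  # assume an undistorted image

# Semantic key points make the 2D-3D association explicit, so no separate
# feature-matching stage is needed before solving the PnP problem.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs,
                              flags=cv2.SOLVEPNP_EPNP)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)  # axis-angle -> 3x3 rotation matrix
    print("Rotation:\n", rotation)
    print("Translation:", tvec.ravel())
```

EPnP needs at least four correspondences; using more (as above) lets the solver average out per-point detection noise, which is the practical reason the method remains usable when image quality degrades.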
