CN110617802A - Satellite-borne moving target detection and speed estimation method - Google Patents

Satellite-borne moving target detection and speed estimation method

Info

Publication number
CN110617802A
Authority
CN
China
Prior art keywords
connected domain
satellite
coordinate system
domains
moving target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910684818.2A
Other languages
Chinese (zh)
Inventor
刘宇宸
徐卿
赵春晖
刘鲁
朱琦
瞿涵
张聪
谢鸣宇
雷拥军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Control Engineering
Original Assignee
Beijing Institute of Control Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Control Engineering filed Critical Beijing Institute of Control Engineering
Priority to CN201910684818.2A priority Critical patent/CN110617802A/en
Publication of CN110617802A publication Critical patent/CN110617802A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/36Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G01P3/38Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light using photographic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Power Engineering (AREA)
  • Image Analysis (AREA)

Abstract

A spaceborne moving-target detection and velocity estimation method comprises the following steps. First, for each frame of a continuously captured multi-frame image sequence: (1) process the original image to obtain a background-removed image; (2) apply adaptive threshold segmentation to the background-removed image and store the coordinates and gray values of all pixels above the threshold; (3) extract potential targets from the stored pixels by four-connected-domain labelling; (4) reject false targets using connected-domain area and aspect-ratio thresholds and compute the centroid of each remaining connected domain; (5) project each connected-domain centroid into the GPS coordinate system using the coordinate transformation matrices provided by the onboard computer and the camera attitude information. Then, for the connected domains extracted from all frames, false targets are rejected and the velocity magnitude and direction of the real moving targets are computed in the GPS coordinate system. The method effectively suppresses background interference, enables real-time on-board detection of moving targets, and provides the geographic coordinates and velocity of the target.

Description

A Spaceborne Moving Target Detection and Velocity Estimation Method

Technical Field

The present invention relates to an image processing method for multi-frame detection of moving targets, and in particular to a method for on-orbit real-time moving-target detection and velocity estimation with a spaceborne early-warning camera, belonging to the technical field of moving-target image processing.

Background Art

On-orbit real-time detection and tracking of moving targets of interest within the atmosphere is one of the important tasks of Earth-observation and remote-sensing satellites. While the spaceborne camera continuously images the ground, multiple consecutive frames are used to extract the moving target of interest and to compute its position and the magnitude and direction of its velocity; the results are then used to track and image the target over a long period, complete the analysis of the target, and report the target information to the ground in time, achieving rapid detection and tracking.

Because of the long imaging distance, the target occupies only a few to a dozen pixels in the image, has no internal texture features and little external shape information, and is strongly affected by background and noise. The target signal-to-noise ratio is therefore usually low, and the false-alarm rate of detection algorithms is generally high. A typical algorithm first preprocesses each single frame to raise the signal-to-noise ratio of suspicious targets, then performs threshold segmentation and extracts connected domains, and finally detects moving targets from the processing results of multiple frames.

Traditional moving-target processing algorithms are designed under ideal conditions: they ignore satellite motion and camera attitude changes and directly perform multi-frame matching of image-plane coordinates on the preprocessed images, so they cannot be applied in a real spaceborne environment. In addition, most methods, such as bilateral filtering and wavelet transforms, are computationally expensive and cannot run in real time on orbit.

Because multi-frame target detection is performed directly in image-plane coordinates, traditional methods set the velocity-magnitude and velocity-direction thresholds inaccurately. Most methods estimate the velocity magnitude directly from the image-plane displacement of the same connected domain across frames and estimate the velocity direction from the displacement direction vector. Such methods use only the theoretical pixel resolution of the camera and ignore the camera's optical-axis direction and the influence of the satellite attitude on the camera's viewing angle and on the effective pixel resolution, so the detection false-alarm rate of traditional methods is generally high.

Summary of the Invention

The technical problem solved by the present invention is: to overcome the deficiencies of the prior art and provide a spaceborne moving-target detection and velocity estimation method, solving the problem that a spaceborne camera cannot detect ground moving targets in real time or provide the magnitude and direction of the target's velocity.

The technical solution of the present invention is a spaceborne moving-target detection and velocity estimation method with the following steps:

(1) Perform background suppression on the continuously captured multi-frame spaceborne Earth-observation images to obtain background-removed images;

(2) Perform threshold segmentation on the background-removed images of step (1), then extract connected domains and reject some false-target connected domains to obtain effective connected domains;

Specifically, an adaptive threshold method is used to segment the background-removed image, and the coordinates and gray values of all pixels above the threshold are stored; connected domains are extracted from the stored pixels by the four-connectivity method, and some false-target connected domains are rejected according to area and aspect-ratio thresholds of the extracted connected domains to obtain the effective connected domains.

(3) Extract the centroid of each effective connected domain of step (2) and compute the precise coordinates of the centroid on the image plane of the camera.

(4) Using the coordinate transformation matrices and the camera optical-axis pointing parameters, convert the precise image-plane coordinates of each centroid into the GPS coordinate system to obtain the effective connected-domain centroid coordinates in the GPS coordinate system;

(5) From the centroid coordinates of the effective connected domains in the GPS coordinate system, obtain the geographic position corresponding to each centroid and compute the corresponding velocity magnitude and direction; extract the connected domains corresponding to real moving targets by multi-frame connected-domain comparison, finally realizing moving-target detection and velocity estimation.

Preferably, the continuously captured multi-frame spaceborne Earth-observation images of step (1) are taken by an early-warning camera installed on an orbiting satellite at an orbital altitude of 500 km, with a field of view of 4°.

Preferably, the background suppression of step (1), applied to the continuously captured multi-frame spaceborne Earth-observation images to obtain the background-removed images, is specifically: a template-based morphological Top-Hat algorithm is applied to each frame to obtain the background-removed image.

Preferably, the connected-domain extraction is specifically: all potential target connected domains are coarsely extracted by the four-connectivity method; the area and the maximum aspect ratio of each connected domain are computed and compared with preset fixed thresholds, the false-target connected domains that do not meet the requirements are rejected, and the remaining connected domains are the effective connected domains.

Preferably, the conversion of the connected-domain centroid from image-plane coordinates to GPS coordinates is specifically: using the transformation matrix C from the camera body coordinate system to the GPS coordinate system, the representation in the GPS coordinate system of the displacement vector between the satellite and the Earth's centre at that moment, and the Earth radius R at the corresponding position, the direction vector, in the camera body coordinate system, of the optical path corresponding to the precise image-plane coordinates of the connected domain is first computed; this vector is then transformed to obtain the direction vector, in the GPS coordinate system, of the optical path corresponding to the pixel. With the direction vector, in the GPS coordinate system, from the Earth's centre to the object corresponding to the connected domain, the altitude h of the target, and the distance r from the satellite to the object corresponding to the connected domain, r and the expression of the direction vector in the GPS coordinate system are solved.

Preferably, the method of extracting real targets by multi-frame connected-domain comparison is specifically:

(1) Sort the connected domains extracted from each frame by connected-domain area and energy. Specifically, sort first by connected-domain area with the largest placed at the front of the queue; for connected domains of equal area, compare the sums of their pixel gray values and sort by that value;

(2) Starting from the second frame, match the connected domains of the current frame one by one, frame by frame, against the connected domains of all previous frames, and store each successfully matched connected-domain combination in the corresponding connected domain of the current frame;

Preferably, the connected-domain matching is specifically: extract the connected domains of the current frame one by one and test connectivity against each matched-against connected domain (i.e., all result connected domains from before the current frame) one by one, specifically including:

(1) When the length of the matched-against connected domain is 1, compute the velocity magnitude of the corresponding target from the displacement between the two connected domains in the GPS coordinate system and the difference between the frame index of the current connected domain and that of the matched-against connected domain, and compare it with the velocity threshold. If it is outside the threshold range, the match fails and the next connected-domain test is performed. If the velocity is within the threshold range, the two connected domains are merged and stored at the position of the current connected domain, the connected-domain length is increased by 1, the corresponding velocity magnitude is stored, the motion direction vector is computed from the displacement of the two in the GPS coordinate system and stored, and the data of the matched-against connected domain at its original position is deleted.

(2) When the length of the matched-against connected domain is greater than 1, a method similar to (1) is used to compute the velocity magnitude and motion direction vector between the current connected domain and the last-frame connected domain of the matched-against connected domain, and it is checked whether the velocity magnitude satisfies the threshold condition; the velocity-direction threshold test is used to determine whether the velocity direction satisfies the threshold condition. If the two conditions are not satisfied simultaneously, the next connected-domain test is performed. If both are satisfied, the velocity magnitude and motion direction vector are combined by a weighted computation based on the length of the matched-against connected domain and stored in the current connected-domain information, the two connected domains are merged and stored at the position of the current connected domain, the connected-domain length becomes the length of the matched-against connected domain plus 1, and the data of the matched-against connected domain at its original position is deleted.

Preferably, the velocity-direction threshold test is: compute the dot product of the velocity direction vector between the current frame and the matched-against connected domain and the corresponding velocity direction vector stored in the matched-against connected domain; if the result is greater than the threshold, the test passes; otherwise it fails.

Compared with the prior art, the advantages of the present invention are:

(1) The present invention designs the template specifically for the size of the moving target on the image plane, so the algorithm accuracy is high;

(2) The present invention converts the image-plane coordinates of the connected-domain centroids into the GPS coordinate system using the transformation matrix from the camera body coordinate system to the GPS coordinate system and the satellite-to-Earth vector provided by the onboard computer, so the magnitude and direction of the target velocity can be computed more accurately;

(3) The present invention uses both velocity-magnitude and velocity-direction thresholds for connected-domain matching and switches the matching method according to the connected-domain length, which greatly reduces the false-alarm rate;

(4) All the algorithms used in the present invention have low computational cost and can run in real time under the hardware constraints of a spaceborne camera, so the method has high practical value;

(5) The present invention accurately extracts suspicious target regions at low computational cost and provides more accurate velocity-magnitude and direction results, thereby reducing the false-alarm rate and improving the detection success rate.

Brief Description of the Drawings

Fig. 1 is a flow chart of a spaceborne moving-target detection and velocity estimation method;

Fig. 2 shows the specific template of the Top-Hat algorithm;

Fig. 3 is a schematic diagram of solving the direction vector, relative to the Earth's centre, of the object corresponding to a connected domain;

Fig. 4 is a schematic diagram of the connectivity-judgment principle for connected domains.

Detailed Description of the Embodiments

The present invention is described in further detail below with reference to the accompanying drawings.

The spaceborne moving-target detection and velocity estimation method of the present invention comprises the following steps. First, for each frame of the continuously captured multi-frame image sequence: (1) process the original star map (i.e., the original image, one of the continuously captured multi-frame spaceborne Earth-observation images) to obtain a background-removed image; (2) apply adaptive threshold segmentation to the background-removed image and store the coordinates and gray values of all pixels above the threshold; (3) extract potential targets from the stored pixels by four-connected-domain labelling; (4) reject false targets using connected-domain area and aspect-ratio thresholds and compute the centroid of each remaining connected domain; (5) project each connected-domain centroid into the GPS coordinate system using the coordinate transformation matrices provided by the onboard computer and the camera attitude information. Second, for the connected domains extracted from all frames, reject false targets by multi-frame comparison based on velocity magnitude and direction, and compute the velocity magnitude and direction of the real moving targets in the GPS coordinate system. The method effectively suppresses background interference, enables real-time on-board detection of moving targets, and provides the geographic coordinates and velocity of the target.

The present invention can be used in a spaceborne early-warning camera to achieve on-orbit real-time moving-target detection and velocity estimation and to provide suspicious-target information to the satellite in real time. The payload cameras of existing reconnaissance remote-sensing satellites produce large volumes of image data that cannot be processed in real time on orbit, so the suspicious moving targets they capture cannot be reported in real time; by the time the data have been downlinked and processed on the ground, the target's behaviour and intent at the time of imaging usually can no longer be recovered. The present invention realizes on-orbit detection of moving targets and provides timely information, enhancing the early-warning capability of reconnaissance satellites. The specific steps of the present invention are as follows:

(1) Perform background suppression on the continuously captured multi-frame spaceborne Earth-observation images (i.e., the original images) to obtain background-removed images: while the camera continuously images the ground, each current frame is preferably processed to suppress background and noise and to improve target contrast, and this background suppression is applied to every frame of the continuously captured sequence.

As shown in Fig. 2, the background suppression preferably uses the template-based Top-Hat transform TH(f) = f − ((f ⊕ A) Θ B_i), with negative values set to 0, where TH(f) is the result (background-removed) image matrix, f is the original image matrix, A is the ring structuring element, B_i is the inner flat structuring element, ⊕ denotes dilation and Θ denotes erosion; the construction is detailed in the four steps below.

The details are as follows:

Step 1: compute the maximum image: apply the morphological dilation f ⊕ A to the original image with the ring structuring element A, where ⊕ is the dilation operator. Specifically, the dilation replaces the gray value of each pixel with the maximum gray value of the pixels inside the annular region of structuring element A centred on that pixel.

The preferred construction of structuring element A is: let the inner structuring element B_i be a 3×3 flat structuring element (i.e., a 3×3 matrix with every element equal to 1) and the outer structuring element B_0 be a 5×5 flat structuring element (i.e., a 5×5 matrix with every element equal to 1); aligning their centres, perform the pseudo-subtraction A = B_0 − B_i. The result A is a ring structuring element whose structure is shown in Fig. 2: its size is 5×5 and its inner 3×3 region is empty, forming a ring.

Step 2: compute the minimum image: apply the morphological erosion with the flat structuring element B_i to the maximum image obtained in step 1, where Θ is the erosion operator. Specifically, the erosion replaces the gray value of each pixel with the minimum gray value of the pixels inside the 3×3 region of structuring element B_i centred on that pixel.

Step 3: compute the result image: subtract the minimum image obtained in step 2 from the original image, i.e., subtract the gray values of each pair of corresponding pixels.

Step 4: compute the background-removed image: set to 0 the gray value of every pixel of the result image of step 3 whose gray value is less than 0; this gives the background-removed image.
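
A minimal sketch of steps 1–4 in Python (NumPy/SciPy). The 5×5 ring element A and the 3×3 flat element B_i follow the preferred design above; the function name and the use of scipy.ndimage are illustrative assumptions, not part of the patent.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def background_suppression(f):
    """Template-based Top-Hat: TH(f) = f - ((f dilated by A) eroded by Bi), clamped at 0."""
    A = np.ones((5, 5), dtype=bool)
    A[1:4, 1:4] = False                            # ring element A = B0 - Bi (inner 3x3 removed)
    Bi = np.ones((3, 3), dtype=bool)               # inner 3x3 flat structuring element

    f = f.astype(np.int32)
    max_img = grey_dilation(f, footprint=A)        # step 1: maximum image
    min_img = grey_erosion(max_img, footprint=Bi)  # step 2: minimum image
    residual = f - min_img                         # step 3: original minus minimum image
    return np.clip(residual, 0, None)              # step 4: set negative gray values to 0
```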

(2) Perform threshold segmentation on the background-removed image, then extract connected domains. The details are as follows:

The background-removed image is segmented by the adaptive threshold method, and the image-plane coordinates and gray values of the pixels whose gray value exceeds the threshold are saved. The adaptive threshold method computes a global threshold V_th = μ + α·σ, where μ is the mean gray value of the background-removed image, σ is the global gray-value standard deviation, and α is a preset fixed coefficient that determines the sensitivity (i.e., detection rate) of target extraction. Considering both sensitivity and false-alarm-rate requirements, α is preferably 2 to 5, most preferably α = 3, to balance the detection rate and the false-alarm rate.

Four-connected-domain extraction is then applied, using a clustering method, to the pixels that passed the threshold test; preferably, all connected domains with area greater than 2 and less than 10 pixels are kept (because of the long imaging distance the moving target is effectively a point target, and because of slight defocus its size lies within the set thresholds, which the user can adjust according to the parameters of the camera's optical system). These are the effective connected domains.
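
The adaptive threshold and four-connected-domain screening just described might be sketched as follows (Python/NumPy with SciPy labelling). The value α = 3 and the 2–10 pixel area window follow the preferred values; the dictionary layout of the output is an illustrative assumption reused in the later sketches.

```python
import numpy as np
from scipy.ndimage import label

def extract_effective_domains(th_img, alpha=3.0, min_area=3, max_area=9):
    """Threshold at V_th = mu + alpha*sigma, then keep 4-connected domains with 2 < area < 10."""
    v_th = th_img.mean() + alpha * th_img.std()
    mask = th_img > v_th

    four_conn = np.array([[0, 1, 0],
                          [1, 1, 1],
                          [0, 1, 0]], dtype=bool)      # 4-connectivity (no diagonals)
    labels, n = label(mask, structure=four_conn)

    domains = []
    for k in range(1, n + 1):
        rows, cols = np.nonzero(labels == k)
        if min_area <= rows.size <= max_area:          # area greater than 2 and less than 10
            domains.append({"rows": rows, "cols": cols, "gray": th_img[rows, cols]})
    return domains
```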

(3) Extract the centroid of each effective connected domain and compute its precise coordinates on the image plane of the camera. The details are as follows:

In the present invention the camera is preferably installed on a satellite at an orbital altitude of 500 km above the ground and is connected to the satellite through a gimbal that can rotate about three axes relative to the satellite, so there is a transformation matrix C_cs from the camera body coordinate system to the satellite body coordinate system. The camera field of view is θ, preferably 4°. The image-plane coordinates of a centroid are (x_i, y_i), i.e., its precise coordinates on the camera image plane, where x_i is the column coordinate and y_i is the row coordinate.
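
The patent does not spell out how the sub-pixel centroid is formed; since the gray values of the pixels above the threshold are stored, a gray-weighted centroid is a natural reading, sketched here purely as an assumption (the dictionary fields match the earlier sketch).

```python
import numpy as np

def weighted_centroid(domain):
    """Gray-value-weighted centroid of one connected domain, in image-plane coordinates."""
    w = domain["gray"].astype(np.float64)
    x_i = float(np.sum(domain["cols"] * w) / np.sum(w))   # column coordinate x_i
    y_i = float(np.sum(domain["rows"] * w) / np.sum(w))   # row coordinate y_i
    return x_i, y_i
```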

(4) Using the transformation matrices between the coordinate systems and the direction vector of the camera optical path, convert the image-plane coordinates of each centroid into the GPS coordinate system to obtain the effective connected-domain centroid coordinates in the GPS coordinate system. The details are as follows:

Given the transformation matrix C from the camera body coordinate system to the GPS coordinate system and the image-plane coordinates (x_i, y_i) of each connected-domain centroid obtained in (3), the coordinates of each connected domain in the GPS coordinate system are solved. The transformation matrix C from the camera body coordinate system to the GPS coordinate system is composed of three transformation matrices: the transformation matrix C_Ie from the Earth-centred inertial coordinate system to the GPS coordinate system, the transformation matrix C_sI from the satellite body coordinate system to the Earth-centred inertial coordinate system, and the transformation matrix C_cs from the camera body coordinate system to the satellite body coordinate system. The camera optical axis points along the positive z-axis of the camera body coordinate system, the image-plane row direction is parallel to the x-axis of the camera body coordinate system, the image-plane column direction is parallel to the y-axis, the camera field of view is θ, and the camera resolution is u×v. Further, the preferred coordinate conversion method is as follows:

Step 1: from the transformation matrices obtained from the satellite attitude sensor system, compute the required transformation matrix from the camera body coordinate system to the GPS coordinate system, C = C_Ie · C_sI · C_cs;

Step 2: from the image-plane coordinates of the connected domain and the camera field of view, compute the coordinates, in the camera body coordinate system, of the direction vector of the camera optical path corresponding to the connected domain. The preferred computation uses the following quantities:

Here θ_x is the angle between the z-axis and the projection of the optical-path direction vector onto the XoZ plane, computed as θ_x = (x_i − u/2)/u; similarly, θ_y is the angle between the z-axis and the projection of the optical-path direction vector onto the YoZ plane, computed as θ_y = (y_i − v/2)/v;

Step 3: using the optical-path direction vector computed in step 2 and the transformation matrix computed in step 1, obtain the coordinates of the optical-path direction vector in the GPS coordinate system;

Step 4: using the result of step 3 and the coordinates, in the GPS coordinate system, of the vector from the satellite centre of mass to the Earth's centre provided by the satellite, solve the resulting equations, where R + h is the distance from the position of the moving target to the Earth's centre, R is the Earth radius at the corresponding location, and h is the altitude of the target, a preset value that can be set differently for different targets. The unknowns are the distance r from the camera to the target and the direction vector, in the GPS coordinate system, between the target and the Earth's centre. The geometry is shown in Fig. 3, where the variables correspond to those in the text. Specifically, the law-of-cosines formula is used:

(R + h)^2 = r^2 + l^2 − 2·r·l·cos(α_i)

where α_i is the angle between the reverse of the camera-to-target vector and the vector from the satellite centre of mass to the Earth's centre; since both of these quantities are known, α_i can be computed directly. The direction vector between the target and the Earth's centre in the GPS coordinate system, l_to, can then be obtained, and finally the coordinate vector of the target in the GPS coordinate system is l_i = (R + h) · l_to.
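
A hedged numerical sketch of steps 1–4 of the coordinate conversion (Python/NumPy). The text gives θ_x = (x_i − u/2)/u and θ_y = (y_i − v/2)/v; here those normalized offsets are additionally scaled by the field of view θ so that they are angles, and r is found by solving ‖l_sat + r·d_e‖ = R + h, which is equivalent to the law-of-cosines relation above. Both choices, and all names and default values, are assumptions for illustration only.

```python
import numpy as np

def pixel_to_gps(x_i, y_i, C, l_sat, R=6371e3, h=10e3,
                 theta=np.deg2rad(4.0), u=1024, v=1024):
    """Project an image-plane centroid (x_i, y_i) to a GPS-frame position vector.

    C     : 3x3 camera-body -> GPS transformation matrix (C = C_Ie @ C_sI @ C_cs)
    l_sat : satellite position relative to the Earth's centre, in the GPS frame
            (the negative of the satellite-to-Earth-centre vector named in the text)
    R, h  : local Earth radius and preset target altitude (assumed values)
    """
    # Step 2: optical-path direction in the camera body frame (z = optical axis);
    # scaling by the field of view theta is an assumption, see the lead-in above.
    theta_x = (x_i - u / 2) / u * theta
    theta_y = (y_i - v / 2) / v * theta
    d_c = np.array([np.tan(theta_x), np.tan(theta_y), 1.0])
    d_c /= np.linalg.norm(d_c)

    # Step 3: optical-path direction in the GPS frame
    d_e = C @ d_c

    # Step 4: solve ||l_sat + r*d_e|| = R + h for the slant range r
    # (assumes the line of sight actually intersects the R + h shell)
    b = 2.0 * np.dot(l_sat, d_e)
    c = np.dot(l_sat, l_sat) - (R + h) ** 2
    r = (-b - np.sqrt(b * b - 4.0 * c)) / 2.0  # nearer intersection

    l_i = l_sat + r * d_e                      # target position vector in the GPS frame
    l_to = l_i / (R + h)                       # unit direction from Earth's centre to target
    return l_i, l_to, r
```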

(5) Based on target velocity-magnitude and direction thresholds, extract the connected domains of the real moving targets, compute the geographic position of each moving target from its GPS coordinates, compute the magnitude and direction of its velocity, set the velocity-magnitude and direction thresholds of the moving target, and judge the effective connected domains whose computed values satisfy the threshold conditions to be moving targets, while rejecting the connected domains that do not satisfy the conditions; this realizes moving-target detection and velocity estimation. The details are as follows:

Step 1: after the coordinates of the effective connected-domain centroids of each frame in the GPS coordinate system have been computed, sort the connected domains extracted from the current frame by connected-domain area and energy. First sort by connected-domain area, with the largest placed at the front of the queue; for connected domains of equal area compare the sums of their pixel gray values and sort by that value; if these are also equal, keep the current order. Store the coordinates, area, energy, etc. of each connected domain under its connected-domain index. (Depending on the size of the storage space, the connected-domain information of a fixed number of N frames is stored in a rolling manner; once this number of frames is exceeded, the information of the earliest frame is overwritten, and so on.) From the second frame onward, perform steps 2 to 5;
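
The area-then-energy ordering of step 1 can be written compactly; because Python's sort is stable, connected domains that tie on both keys keep their current order, matching the text. The dictionary fields are the illustrative ones from the earlier sketches.

```python
def sort_domains(domains):
    """Sort connected domains: largest area first, ties broken by total gray value (energy)."""
    return sorted(domains,
                  key=lambda d: (-d["gray"].size, -float(d["gray"].sum())))
```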

Step 2: based on the connected-domain information of the image captured in the current frame, test the connectivity of each connected domain of the current frame (the current connected domain) against the previously stored connected-domain information (the tested connected domains), frame by frame and connected domain by connected domain. Depending on the length of the tested connected domain, perform step 3 or step 4.

If the length of the tested connected domain is 1, perform step 3: judge only from the velocity between the two connected domains; if the length of the tested connected domain is greater than 1, perform step 4: judge from both the velocity and the direction of motion between the two connected domains;

Step 3: when the length of the tested connected domain is only 1, the velocity between the two connected domains is the only criterion available to judge their connectivity. The displacement between the two is obtained from the difference of their recorded coordinates in the GPS coordinate system; the frame index of the tested connected domain is subtracted from that of the current frame, the time interval between the two frames is obtained from the exposure interval T, and the corresponding velocity between the two connected domains is computed. The computed velocity is compared with the preset target velocity threshold. (For example, the velocity range of an aircraft is preferably 200 m/s to 800 m/s; introducing an error tolerance, preferably set at 90% to 110%, the upper limit of the final target velocity threshold is preferably 880 m/s and the lower limit preferably 180 m/s.) If the velocity is not within the threshold interval, jump back to step 2 and continue the loop; if the velocity is within the interval, preferably increase the length of the current connected domain by 1, transfer all information of the tested connected domain to the position of the current connected domain and delete everything stored in the tested connected domain, save the computed velocity, and compute and store the motion direction vector between the two connected domains. Jump back to step 2, change the current connected domain to the next connected domain, and enter a new loop.
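
A sketch of the velocity criterion of step 3, assuming each stored record carries a GPS-frame centroid position and a frame index; the 180–880 m/s window follows the aircraft example in the text, and the helper name is an assumption.

```python
import numpy as np

def speed_between(pos_a, frame_a, pos_b, frame_b, T, v_min=180.0, v_max=880.0):
    """Velocity magnitude between two GPS-frame positions; returns (speed, within_threshold)."""
    dt = abs(frame_b - frame_a) * T              # frame difference times exposure interval T
    speed = float(np.linalg.norm(np.asarray(pos_b) - np.asarray(pos_a))) / dt
    return speed, (v_min <= speed <= v_max)
```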

Step 4: when the length of the tested connected domain is greater than 1, the connectivity of the two connected domains can be judged from both the velocity and the direction of motion. Extract the connected-domain information of the latest frame in the tested connected domain and, together with the current connected-domain information, compute the corresponding velocity and motion direction vector as in step 3. Compare the computed velocity with the preset target velocity threshold. When the velocity is within the threshold interval: compute the dot product of the computed motion direction vector and the motion direction vector stored in the tested connected domain to obtain the cosine of the angle between the two motion direction vectors, and compare it with the preset cosine threshold. (For example, assuming that the direction of motion of a moving target does not change much over a short time, the angle between motion directions across frames preferably should not exceed 20°, so the corresponding cosine value is preferably 0.94, i.e., the cosine threshold is preferably set to 0.94; when the dot product of the two motion direction vectors is less than this threshold, the current connected domain is considered not to be the target corresponding to the tested connected domain and the two are judged not connected.) If the comparison result is that the two are not connected, jump back to step 2 and continue the loop; if the result is that they are connected, the length of the current connected domain becomes the length of the tested connected domain plus 1, all information of the tested connected domain is transferred to the position of the current connected domain and everything stored in the tested connected domain is deleted, and the computed velocity and motion direction are combined with the velocity and motion direction stored in the tested connected domain by a weighted computation based on the tested connected-domain length and stored in the current connected domain. Jump back to step 2, change the current connected domain to the next connected domain, and enter a new loop. The connectivity-judgment principle is shown in Fig. 4, where in u_ij the indices i and j denote the j-th connected domain of the i-th frame, u denotes the x-axis coordinate of the corresponding connected-domain centroid on the image plane, and v denotes the y-axis coordinate of the corresponding connected-domain centroid on the image plane.
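
The direction test and track update of step 4, sketched with the speed_between helper from the previous sketch. It assumes a track record stores its latest GPS-frame position and frame index, a unit motion-direction vector, a velocity estimate and a length; the length-weighted averaging is one plausible reading of the "weighted computation based on the tested connected-domain length".

```python
import numpy as np

def try_extend_track(track, new_pos, new_frame, T, cos_threshold=0.94):
    """Extend a track (length > 1) with a new detection if speed and direction both pass."""
    speed, ok = speed_between(track["pos"], track["frame"], new_pos, new_frame, T)
    if not ok:
        return None                              # velocity outside the threshold interval
    step = np.asarray(new_pos) - np.asarray(track["pos"])
    direction = step / np.linalg.norm(step)
    if float(np.dot(direction, track["dir"])) < cos_threshold:
        return None                              # direction changed by more than ~20 degrees
    n = track["length"]
    new_dir = n * track["dir"] + direction
    return {"pos": new_pos,
            "frame": new_frame,
            "speed": (n * track["speed"] + speed) / (n + 1),   # length-weighted average
            "dir": new_dir / np.linalg.norm(new_dir),
            "length": n + 1}
```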

Step 5: after step 2 has finished, examine all connected domains stored in the current frame; preferably, when a connected-domain length is greater than 2, send a detection-success packet to the onboard computer. The packet includes the detection-success flag, the velocity of the detected target, its motion direction vector in the GPS coordinate system, the position coordinates of the target in the GPS coordinate system, and other information, for the onboard computer to evaluate.

The present invention has been tested with a prototype camera and the algorithm runs in real time; the results show that the method can accurately extract suspicious moving targets from the images in real time and compute the magnitude and direction of their velocity.

The parts of the present invention not described in detail belong to the common knowledge of those skilled in the art.

Claims (10)

1. A spaceborne moving-target detection and velocity estimation method, characterized by comprising the following steps:
(1) performing background suppression on a plurality of continuously captured spaceborne Earth-observation images to obtain background-removed images;
(2) performing threshold segmentation on the background-removed images of step (1), then extracting connected domains and removing some false-target connected domains to obtain effective connected domains;
specifically, threshold segmentation is performed on the background-removed image by an adaptive threshold method, and the coordinates and gray values of all pixels above the threshold are stored; connected domains are extracted from the stored pixels by the four-connectivity method, and some false-target connected domains are removed according to area and aspect-ratio thresholds of the extracted connected domains to obtain the effective connected domains;
(3) extracting the centroid of each effective connected domain of step (2) and computing the precise coordinates of the centroid on the image plane of the camera;
(4) converting the precise image-plane coordinates of each centroid into the GPS coordinate system by means of the coordinate transformation matrices and the camera optical-axis pointing parameters, to obtain the effective connected-domain centroid coordinates in the GPS coordinate system;
(5) obtaining, from the centroid coordinates of the effective connected domains in the GPS coordinate system, the geographic position corresponding to each centroid, computing the corresponding velocity magnitude and direction, extracting the connected domains corresponding to real moving targets by multi-frame connected-domain comparison, and finally realizing moving-target detection and velocity estimation of the moving target.
2. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the continuously captured multi-frame spaceborne Earth-observation images of step (1) are taken by an early-warning camera installed on an orbiting satellite at an orbital altitude of 500 km, the field of view of the early-warning camera being 4 degrees.
3. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the background suppression of step (1), performed on the continuously captured multi-frame spaceborne Earth-observation images to obtain the background-removed images, specifically comprises: processing each frame of image separately and performing background suppression to obtain the background-removed image.
4. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the connected-domain extraction specifically comprises: coarsely extracting all potential target connected domains by the four-connectivity method, computing the area and the maximum aspect ratio of each connected domain, comparing them with preset fixed thresholds, and removing the false-target connected domains that do not meet the requirements, the remaining connected domains being the effective connected domains.
5. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the conversion of the connected-domain centroid from image-plane coordinates to GPS coordinates specifically comprises: using the transformation matrix C from the camera body coordinate system to the GPS coordinate system, the representation in the GPS coordinate system of the displacement vector between the satellite and the Earth's centre at that moment, and the Earth radius R at the corresponding position;
computing, from the precise image-plane coordinates of the connected domain, the direction vector, in the camera body coordinate system, of the corresponding optical path, and from it the direction vector, in the GPS coordinate system, of the optical path corresponding to the pixel;
then, with the direction vector, in the GPS coordinate system, between the Earth's centre and the object corresponding to the connected domain, the altitude h of the target, and the distance r from the satellite to the object corresponding to the connected domain, solving for r and for the expression of the direction vector in the GPS coordinate system.
6. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the method of extracting real targets by multi-frame connected-domain comparison specifically comprises:
(1) sorting the connected domains extracted from each frame by connected-domain area and energy; specifically, sorting first by connected-domain area with the largest placed at the front of the queue, and for connected domains of equal area, comparing the sums of their pixel gray values and sorting by that value;
(2) starting from the second frame, matching the connected domains of the current frame one by one, frame by frame, against the connected domains of all previous frames, and storing each successfully matched connected-domain combination in the corresponding connected domain of the current frame.
7. The method according to claim 6, characterized in that: the connected domains extracted from each frame are sorted by connected-domain area and energy; specifically, they are sorted by connected-domain area, the one with the largest area is placed at the front of the queue, and for connected domains of equal area the sums of their pixel gray values are compared and sorted by size.
8. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the suspicious target region is accurately extracted at a low computational cost, and more accurate velocity-magnitude and direction results are provided, thereby reducing the false-alarm rate and improving the detection success rate.
9. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: the method is used in a spaceborne early-warning camera and realizes on-orbit real-time moving-target detection and velocity estimation.
10. The spaceborne moving-target detection and velocity estimation method according to claim 1, characterized in that: suspicious-target information is provided to the satellite in real time.
CN201910684818.2A 2019-07-26 2019-07-26 Satellite-borne moving target detection and speed estimation method Pending CN110617802A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910684818.2A CN110617802A (en) 2019-07-26 2019-07-26 Satellite-borne moving target detection and speed estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910684818.2A CN110617802A (en) 2019-07-26 2019-07-26 Satellite-borne moving target detection and speed estimation method

Publications (1)

Publication Number Publication Date
CN110617802A true CN110617802A (en) 2019-12-27

Family

ID=68921577

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910684818.2A Pending CN110617802A (en) 2019-07-26 2019-07-26 Satellite-borne moving target detection and speed estimation method

Country Status (1)

Country Link
CN (1) CN110617802A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111429479A (en) * 2020-03-26 2020-07-17 中国科学院长春光学精密机械与物理研究所 Space target identification method based on image integral mean value
CN112669297A (en) * 2020-12-31 2021-04-16 中国科学院长春光学精密机械与物理研究所 Target detection method
CN112883865A (en) * 2021-02-09 2021-06-01 北京深蓝长盛科技有限公司 Ball-bearing breakthrough event identification method and device, computer equipment and storage medium
CN116740332A (en) * 2023-06-01 2023-09-12 南京航空航天大学 Method for positioning center and measuring angle of space target component on satellite based on region detection

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825475A (en) * 2010-05-17 2010-09-08 哈尔滨工业大学 Image motion compensation method for space optical remote sensor
WO2015199502A1 (en) * 2014-06-26 2015-12-30 한국과학기술원 Apparatus and method for providing augmented reality interaction service
CN106709944A (en) * 2016-12-14 2017-05-24 上海微小卫星工程中心 Satellite remote sensing image registration method
CN106709914A (en) * 2017-01-05 2017-05-24 北方工业大学 SAR image ship detection false alarm eliminating method based on two-stage DEM sea-land reservoir
KR20170125716A (en) * 2016-05-04 2017-11-15 임재형 Apparatus for determining position information of object and method thereof
CN107504966A (en) * 2017-07-10 2017-12-22 北京控制工程研究所 There is the method that nautical star asterism extracts under cloud environment in a kind of daytime
US20180172822A1 (en) * 2016-12-21 2018-06-21 The Boeing Company Method and apparatus for multiple raw sensor image enhancement through georegistration
CN108876807A (en) * 2018-05-31 2018-11-23 长春博立电子科技有限公司 A kind of real-time piggyback satellite image motion object detection tracking
CN109146963A (en) * 2017-06-13 2019-01-04 南京鑫和汇通电子科技有限公司 One kind being based on the matched image position offsets detection method of swift nature

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825475A (en) * 2010-05-17 2010-09-08 哈尔滨工业大学 Image motion compensation method for space optical remote sensor
WO2015199502A1 (en) * 2014-06-26 2015-12-30 한국과학기술원 Apparatus and method for providing augmented reality interaction service
KR20170125716A (en) * 2016-05-04 2017-11-15 임재형 Apparatus for determining position information of object and method thereof
CN106709944A (en) * 2016-12-14 2017-05-24 上海微小卫星工程中心 Satellite remote sensing image registration method
US20180172822A1 (en) * 2016-12-21 2018-06-21 The Boeing Company Method and apparatus for multiple raw sensor image enhancement through georegistration
CN106709914A (en) * 2017-01-05 2017-05-24 北方工业大学 SAR image ship detection false alarm eliminating method based on two-stage DEM sea-land reservoir
CN109146963A (en) * 2017-06-13 2019-01-04 南京鑫和汇通电子科技有限公司 One kind being based on the matched image position offsets detection method of swift nature
CN107504966A (en) * 2017-07-10 2017-12-22 北京控制工程研究所 There is the method that nautical star asterism extracts under cloud environment in a kind of daytime
CN108876807A (en) * 2018-05-31 2018-11-23 长春博立电子科技有限公司 A kind of real-time piggyback satellite image motion object detection tracking

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李劲东: "《卫星遥感技术》", 31 March 2018 *
王苗苗,毛晓艳,魏春岭: "空间小目标的检测算法", 《空间控制技术与应用》 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111429479A (en) * 2020-03-26 2020-07-17 中国科学院长春光学精密机械与物理研究所 Space target identification method based on image integral mean value
CN111429479B (en) * 2020-03-26 2022-10-11 中国科学院长春光学精密机械与物理研究所 Space target identification method based on image integral mean value
CN112669297A (en) * 2020-12-31 2021-04-16 中国科学院长春光学精密机械与物理研究所 Target detection method
CN112669297B (en) * 2020-12-31 2022-05-27 中国科学院长春光学精密机械与物理研究所 Target detection method
CN112883865A (en) * 2021-02-09 2021-06-01 北京深蓝长盛科技有限公司 Ball-bearing breakthrough event identification method and device, computer equipment and storage medium
CN112883865B (en) * 2021-02-09 2024-01-19 北京深蓝长盛科技有限公司 Identification method and device for break-through event with ball, computer equipment and storage medium
CN116740332A (en) * 2023-06-01 2023-09-12 南京航空航天大学 Method for positioning center and measuring angle of space target component on satellite based on region detection
CN116740332B (en) * 2023-06-01 2024-04-02 南京航空航天大学 Method for positioning center and measuring angle of space target component on satellite based on region detection

Similar Documents

Publication Publication Date Title
CN110617802A (en) Satellite-borne moving target detection and speed estimation method
Leira et al. Automatic detection, classification and tracking of objects in the ocean surface from UAVs using a thermal camera
EP3678095B1 (en) Determination of position from images and associated camera positions
CN108734103A (en) The detection of moving target and tracking in satellite video
Najiya et al. UAV video processing for traffic surveillence with enhanced vehicle detection
CN103697855B (en) A measurement method of hull level attitude based on sea antenna detection
CN103149939A (en) Dynamic target tracking and positioning method of unmanned plane based on vision
Mansour et al. Automated vehicle detection in satellite images using deep learning
US10042047B2 (en) Doppler-based segmentation and optical flow in radar images
CN115285381B (en) Collision early warning method and device for space debris
CN111462182B (en) A three-dimensional trajectory estimation method for ballistic missiles based on infrared early warning images
CN103996027A (en) Space-based space target recognizing method
CN109708627B (en) Method for rapidly detecting space dynamic point target under moving platform
CN115856885A (en) Ship target continuous tracking method based on low-orbit SAR satellite constellation
CN108519083A (en) A Space Non-Cooperative Multi-Target Acquisition and Tracking Algorithm
CN112927294B (en) Satellite orbit and attitude determination method based on single sensor
KR20180127567A (en) System for unmanned aircraft image auto geometric correction
Long et al. Object detection research of SAR image using improved faster region-based convolutional neural network
CN117739972A (en) A UAV approach phase positioning method without global satellite positioning system
CN117095029A (en) Method and device for detecting small target in air flight
Xiao et al. Safe Mars landing strategy: Towards lidar-based high altitude hazard detection
Chen et al. Aerial robots on the way to underground: An experimental evaluation of VINS-mono on visual-inertial odometry camera
CN110047103A (en) Mixed and disorderly background is removed from image to carry out object detection
Del Prete et al. A deep learning-based crater detector for autonomous vision-based spacecraft navigation
Le et al. Human detection and tracking for autonomous human-following quadcopter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191227