CN108492324A - Aircraft method for tracing based on fully-connected network and Kalman filter - Google Patents

Aircraft method for tracing based on fully-connected network and Kalman filter

Info

Publication number
CN108492324A
Authority
CN
China
Prior art keywords: bounding box, Kalman filter, airplane, state, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810079824.0A
Other languages
Chinese (zh)
Other versions
CN108492324B (en)
Inventor
杨嘉琛
韩煜蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN201810079824.0A priority Critical patent/CN108492324B/en
Publication of CN108492324A publication Critical patent/CN108492324A/en
Application granted granted Critical
Publication of CN108492324B publication Critical patent/CN108492324B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/277 - Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/251 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20092 - Interactive image processing based on input by user
    • G06T 2207/20104 - Interactive definition of region of interest [ROI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention relates to an aircraft tracking method based on a fully-connected network and a Kalman filter, comprising the following steps: the video is detected frame by frame using the fully-connected network R-FCN to obtain the bounding box of the previous frame image for use in trajectory correction; a state vector describing the airplane motion track is constructed, which represents both the position of the center point of the target aircraft and the size and aspect ratio of the bounding box; a Kalman filter and an extended Kalman filter are combined, and sub-vectors are separated from the constructed state vector to describe the moving target; when the deviation of the detection result is large, the effective range is limited according to the size of the target object to improve the detection speed, and the position of the bounding box in adjacent frames is corrected, so as to realize the correction of the motion trajectory.

Description

Airplane tracking method based on full-connection network and Kalman filter
Technical Field
The invention belongs to the field of computer vision, and relates to a deep learning tracking method for an airplane in a video.
Background
Aircraft tracking is an important technology in fields such as aviation safety, and the use of scientific and technological means to strengthen safety measures in military reconnaissance has also gradually attracted national attention. Target tracking is an important field of computer vision, and it makes it possible to detect and even track aircraft in acquired video images.
In recent years, with the development of deep learning, machine learning algorithms have gradually been applied to various vision fields, and target detection and tracking technologies based on deep learning have developed rapidly. Compared with traditional methods, tracking performance is greatly improved. Popular object detection strategies fall into two categories: those based on region proposals and those that perform detection in a single regression step. However, the accuracy of aircraft tracking is mainly affected by complex environmental conditions, and visual tracking algorithms still face challenging problems such as sudden motion, attitude change, deformation, occlusion, background clutter, and illumination or viewpoint changes, all of which reduce tracking accuracy and may even cause tracking to fail. At present, no effective algorithm is available for solving the problem of airplane tracking.
Disclosure of Invention
The invention aims to establish a more accurate airplane tracking method. The aircraft tracking method provided by the invention comprises three main parts, namely an R-FCN-based detection model, a Kalman-filter-based state estimation model and an aircraft motion track correction module. The technical scheme is as follows:
An aircraft tracking method based on a fully-connected network and a Kalman filter comprises the following steps:
the first step is as follows: detecting the video frame by frame using the full connection network R-FCN to obtain a bounding box of the previous frame image for track correction;
the second step is that: constructing a state vector, describing an airplane motion track, wherein the state vector not only expresses the position of the center point of the target airplane, but also shows the size and the aspect ratio of the bounding box;
the third step: in order to avoid target drift caused by a failure of target detection on a certain frame, combining a Kalman filter and an extended Kalman filter, and separating sub-vectors from the constructed state vector to describe the moving target; the specific method comprises the following steps:
(1) the Kalman filter processes the linear part: a sub-vector representing the position of the center point of the airplane in the state vector is approximated by a linear model; the Kalman gain is then calculated from the uncertainty of the prediction result and the current observation result, and the prediction result and the observation result are weighted and averaged to obtain the state estimate at the current moment and the uncertainty of that estimate;
(2) the extended Kalman filter is used to fit the nonlinear part that is not suitable for the linear model: a state sub-vector representing the size and aspect ratio of the bounding box is created in the same way as in step (1), but the related state matrix and mapping matrix are no longer constant matrices, and the state estimate of the nonlinear part at the current moment and its uncertainty are obtained;
(3) adding a nonlinear part into a linear system to describe the motion state of the airplane;
the fourth step: when the deviation of the detection result is large, limiting an effective range according to the size of a target object to improve the detection speed, and correcting the position of a bounding box in an adjacent frame so as to realize the correction of the motion track; otherwise, the target is taken as the center, a bounding box is drawn and input into the detection network for training, and the correction formula is as follows:
wherein δ_a and δ_b respectively represent the confidences of the detection results of the previous and following frames, and the higher the confidence, the more accurate the model; w_a, w_b and w_c respectively represent the widths of the previous frame, the following frame and the corrected bounding box; h_a, h_b and h_c respectively represent the corresponding heights; (x_a, y_a), (x_b, y_b) and (x_c, y_c) respectively represent the horizontal and vertical coordinates of the corresponding center points;
the fifth step: collecting videos containing airplane motion and uniformly processing them into videos of fixed length to jointly form a training database, wherein the videos are randomly divided into two parts, 80% of the videos are input into the R-FCN as a training set, and the remaining 20% of the videos are predicted with the model obtained through training to obtain the airplane tracking result.
The specific airplane tracking algorithm provided by the invention is realized on the basis of an R-FCN and a Kalman filter. The R-FCN is the detection model used to obtain the position information of the airplane. In order to reduce the detection time, a specific area is also cut out of each frame according to the position of the bounding box in the previous frame. In addition, the size of the detection area can be changed depending on the size of the target. The Kalman filter is used as an estimation model to adjust the predicted motion trace. When the detection difference between adjacent frames is large, the bounding box of the next frame is adjusted according to the detection result in order to improve detection accuracy. In this way, the method can accurately detect and track the motion trail of the airplane.
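To make the cooperation of the three parts concrete, the following sketch outlines one possible per-frame tracking loop. It is an illustration only: `detect`, `predict`, `update`, `correct` and `crop` are hypothetical callables standing in for the R-FCN detection model, the Kalman-filter state estimation model and the track correction module described above; none of these names appear in the patent.

```python
def track(frames, detect, predict, update, correct, crop):
    """Per-frame loop combining detection, Kalman estimation and correction.

    All five callables are placeholders for the modules described in the patent.
    """
    prev_box, tracks = None, []
    for frame in frames:
        # Restrict detection to a region around the previous bounding box to save time.
        roi = crop(frame, prev_box) if prev_box is not None else frame
        box, confidence = detect(roi)        # R-FCN-based detection model
        box = update(predict(), box)         # Kalman-filter-based state estimation
        if prev_box is not None:
            box = correct(prev_box, box)     # motion-track correction module
        tracks.append(box)
        prev_box = box
    return tracks
```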
Drawings
FIG. 1: flow chart
FIG. 2: schematic diagram of correction method
Detailed Description
The invention provides an airplane tracking method based on the R-FCN [1] and the Kalman filter (KF) [2,3]. The method comprises three main parts, namely an R-FCN-based detection model, a KF-based state estimation model and an airplane motion track correction module. Specifically, the method can be represented as the following steps:
The first step is as follows: the bounding box of the previous frame is obtained through the R-FCN.
The specific method comprises the following steps:
(1) According to the R-FCN principle, the relative spatial position information of a region of interest (ROI) is encoded by creating position-sensitive score maps. The present invention generates a 4-dimensional vector v = (v_x, v_y, v_w, v_h) for each ROI by a bounding box regression method, for the subsequent calculation of bounding boxes, where v_x, v_y represent the x, y coordinates of the center point and v_w, v_h represent the width and height of the bounding box.
(2) The ROI is divided into four sub-regions (upper-left, lower-left, upper-right and lower-right), and these sub-regions are used as the score map. The bounding box vertices can be respectively denoted as B_tl, B_tr, B_bl, B_br, and their relationship with v_x, v_y, v_w, v_h is expressed as follows:
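The relation between v and the four vertices appears in the patent only as an image that is not reproduced here. As a hedge, the sketch below shows the conventional conversion from a center/size parameterization to corner coordinates, which is one plausible reading of that relation; the function name is hypothetical.

```python
def bbox_corners(v_x, v_y, v_w, v_h):
    """Assumed relation between v = (v_x, v_y, v_w, v_h) and the corners B_tl, B_tr, B_bl, B_br."""
    half_w, half_h = v_w / 2.0, v_h / 2.0
    b_tl = (v_x - half_w, v_y - half_h)  # top-left
    b_tr = (v_x + half_w, v_y - half_h)  # top-right
    b_bl = (v_x - half_w, v_y + half_h)  # bottom-left
    b_br = (v_x + half_w, v_y + half_h)  # bottom-right
    return b_tl, b_tr, b_bl, b_br
```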
The second step: constructing the state vector that describes the motion trail of the airplane.
The state vector contains, in one part, the position of the center point of the target and, in another part, the bounding box scale s and aspect ratio r; a black dot over a component denotes its time derivative. This step mainly processes the linear part, and the specific method is as follows:
(1) By approximating it with a linear model, a sub-vector is separated from the constructed state vector to describe the position of the moving object.
x_k = A x_{k-1} + B u_k + w_k    (2)
z_k = H x_k + v_k    (3)
where x represents the state vector of the system, z is the observed value, A is the state transition matrix, and B and u constitute the control part, which can be ignored in the uncontrolled system to which the invention is applied. H is the observation model, i.e. the observation matrix, which maps the true state into the sensing space. w and v represent the noise during state update and observation, respectively. Previous studies have shown that the above noise follows a Gaussian distribution. Equation (2) is referred to as the state equation, equation (3) as the observation equation, and the subscript k denotes the value at time k.
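As a concrete and purely illustrative reading of equations (2) and (3), the sketch below constructs a constant-velocity model for the center-point sub-vector [x, y, x_dot, y_dot]; the frame interval and the noise covariances Q and R are assumptions, not values given in the patent.

```python
import numpy as np

dt = 1.0  # assumed time step of one frame

# State transition matrix A of eq. (2); the control term B*u is ignored as stated above.
A = np.array([[1.0, 0.0, dt,  0.0],
              [0.0, 1.0, 0.0, dt ],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

# Observation matrix H of eq. (3): only the center-point position is measured.
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])

Q = np.eye(4) * 1e-2   # process noise covariance (w_k), illustrative value
R = np.eye(2) * 1.0    # observation noise covariance (v_k), illustrative value
```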
(2) From the KF principle:
x̂_k^- = A x̂_{k-1} + B u_k    (4)
P_k^- = A P_{k-1} A^T + Q    (5)
where P is the error covariance matrix between the predicted value and the true value, used to represent the uncertainty of the prediction result, and Q is the new uncertainty added in the prediction process. Equation (4) shows that the current state is predicted from the state at the previous time plus the external input. Equation (5) propagates the previously existing uncertainty and adds the new uncertainty Q introduced by the prediction process.
(3) The Kalman gain K and the k-th predicted value x̂_k can be calculated by the following formulas:
K_k = P_k^- H^T (H P_k^- H^T + R)^{-1}    (6)
x̂_k = x̂_k^- + K_k (z_k - H x̂_k^-)    (7)
Equation (6) computes the Kalman gain (weight) K from the uncertainty P_k^- of the prediction result and the uncertainty R of the observation. Equation (7) shows that the prediction result and the observation result are weighted and averaged to obtain the state estimate at the current time.
(4) Update P_k = (I - K_k H) P_k^-, which represents the uncertainty of the state estimate.
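The recursion described in sub-steps (2)-(4) is the standard Kalman predict/update cycle; the sketch below implements it with the illustrative matrices defined above. The function names are assumptions, not identifiers from the patent.

```python
import numpy as np

def kf_predict(x, P, A, Q):
    """Prediction step, eqs. (4)-(5): propagate the state and its uncertainty."""
    x_pred = A @ x                    # eq. (4); the control input B*u is ignored
    P_pred = A @ P @ A.T + Q          # eq. (5)
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    """Update step, eqs. (6)-(7), followed by the covariance update of sub-step (4)."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)                 # Kalman gain, eq. (6)
    x_est = x_pred + K @ (z - H @ x_pred)               # state estimate, eq. (7)
    P_est = (np.eye(len(x_pred)) - K @ H) @ P_pred      # updated uncertainty
    return x_est, P_est
```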
The third step: the nonlinear part, i.e. s and r, which does not fit the linear model, is fitted using the EKF. Sub-vectors are created in the same manner as in the second step, but the state matrix A and the mapping matrix H are no longer constant matrices; they are replaced by the Jacobians of the nonlinear state-transition function f and observation function h,
F_k = ∂f/∂x evaluated at the previous state estimate,    H_k = ∂h/∂x evaluated at the current prediction,
where the matrices F_k and H_k correspond to A and H in the KF, respectively.
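The patent states only that F_k and H_k are Jacobians without reproducing the formulas. The sketch below shows one generic way to obtain such Jacobians numerically for a nonlinear model of the (s, r) sub-vector; the transition function f and the step size eps are placeholders, not the patent's actual model.

```python
import numpy as np

def numerical_jacobian(func, x, eps=1e-6):
    """Finite-difference Jacobian of func at x; stands in for F_k = df/dx or H_k = dh/dx."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(func(x), dtype=float)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        J[:, i] = (np.asarray(func(x + step), dtype=float) - fx) / eps
    return J

def f(x):
    """Hypothetical nonlinear transition for the (s, r) sub-vector, purely illustrative."""
    s, r = x
    return np.array([s + 0.1 * s * r, r])

F_k = numerical_jacobian(f, [2.0, 0.5])   # 2x2 Jacobian evaluated at the current estimate
```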
The fourth step: the motion state of the airplane can be described by adding the nonlinear part into the linear system.
The fifth step: correcting the motion trail. It is determined whether the bounding box correctly detects the airplane in motion. If the overlap (IoU) between the bounding box and the window where the airplane is located is greater than a predefined threshold T, the position and size of the current detection box are modified according to the bounding box of the previous frame. Otherwise, a bounding box centered on the target is drawn and input into the detection network for training. The specific correction formula is as follows:
where δ_a and δ_b respectively denote the confidences of the detection results in the previous and following frames; the higher the confidence, the more accurate the model. w_a, w_b and w_c respectively denote the widths of the previous frame, the following frame and the corrected bounding box; h_a, h_b and h_c denote the corresponding heights. (x_a, y_a), (x_b, y_b) and (x_c, y_c) denote the horizontal and vertical coordinates of the corresponding center points.
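Because the correction formula itself appears in the patent only as an image, the sketch below pairs a standard IoU test with an assumed confidence-weighted blend of the two boxes; the threshold T and the weighting scheme are assumptions, not the patent's exact formula. Boxes are given as (x_center, y_center, w, h).

```python
def iou(box_a, box_b):
    """Intersection over union of two (x_center, y_center, w, h) boxes."""
    ax1, ay1 = box_a[0] - box_a[2] / 2, box_a[1] - box_a[3] / 2
    ax2, ay2 = box_a[0] + box_a[2] / 2, box_a[1] + box_a[3] / 2
    bx1, by1 = box_b[0] - box_b[2] / 2, box_b[1] - box_b[3] / 2
    bx2, by2 = box_b[0] + box_b[2] / 2, box_b[1] + box_b[3] / 2
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def correct_box(box_a, box_b, delta_a, delta_b, T=0.5):
    """Assumed correction: blend previous (a) and current (b) boxes weighted by confidence."""
    if iou(box_a, box_b) <= T:
        return box_b                       # keep the current detection (the patent retrains around the target here)
    w = delta_a / (delta_a + delta_b)      # weight of the previous frame's box
    return tuple(w * a + (1.0 - w) * b for a, b in zip(box_a, box_b))
```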
And a sixth step: videos containing airplane motion are collected and, for convenient training, processed into videos of fixed length, which together form a training database. The videos are randomly divided into two parts: 80% are input into the R-FCN as a training set, and the remaining 20% are predicted with the trained model to obtain the airplane tracking result.
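As a small illustration of the data preparation described in this step, the sketch below performs the stated random 80/20 split; the clip file names are hypothetical.

```python
import random

def split_dataset(clips, train_ratio=0.8, seed=0):
    """Randomly split video clips into a training set (80%) and a test set (20%)."""
    clips = list(clips)
    random.Random(seed).shuffle(clips)
    cut = int(len(clips) * train_ratio)
    return clips[:cut], clips[cut:]

train_clips, test_clips = split_dataset([f"clip_{i:03d}.mp4" for i in range(100)])
```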
Reference documents:
[1] Dai J, Li Y, He K, Sun J (2016) R-FCN: Object detection via region-based fully convolutional networks. In: Advances in Neural Information Processing Systems, pp 379-387.
[2] Kalman RE (1960) A new approach to linear filtering and prediction problems. Journal of Basic Engineering 82(1):35-45.
[3] Kalman RE, Bucy RS (1961) New results in linear filtering and prediction theory. Journal of Basic Engineering 83(1):95-108.

Claims (1)

1. An aircraft tracking method based on a fully-connected network and a Kalman filter comprises the following steps:
the first step is as follows: detecting the video frame by frame using the full connection network R-FCN to obtain a bounding box of the previous frame image for track correction;
the second step is that: constructing a state vector, describing an airplane motion track, wherein the state vector not only expresses the position of the center point of the target airplane, but also shows the size and the aspect ratio of the bounding box;
the third step: in order to avoid target drift caused by a failure of target detection on a certain frame, combining a Kalman filter and an extended Kalman filter, and separating sub-vectors from the constructed state vector to describe the moving target; the specific method comprises the following steps:
(1) the Kalman filter processes the linear part: a sub-vector representing the position of the center point of the airplane in the state vector is approximated by a linear model; the Kalman gain is then calculated from the uncertainty of the prediction result and the current observation result, and the prediction result and the observation result are weighted and averaged to obtain the state estimate at the current moment and the uncertainty of that estimate;
(2) the extended Kalman filter is used to fit the nonlinear part that is not suitable for the linear model: a state sub-vector representing the size and aspect ratio of the bounding box is created in the same way as in step (1), but the related state matrix and mapping matrix are no longer constant matrices, and the state estimate of the nonlinear part at the current moment and its uncertainty are obtained;
(3) adding a nonlinear part into a linear system to describe the motion state of the airplane;
the fourth step: when the deviation of the detection result is large, limiting an effective range according to the size of a target object to improve the detection speed, and correcting the position of a bounding box in an adjacent frame so as to realize the correction of the motion track; otherwise, the target is taken as the center, a bounding box is drawn and input into the detection network for training, and the correction formula is as follows:
wherein δ_a and δ_b respectively represent the confidences of the detection results of the previous and following frames, and the higher the confidence, the more accurate the model; w_a, w_b and w_c respectively represent the widths of the previous frame, the following frame and the corrected bounding box; h_a, h_b and h_c respectively represent the corresponding heights; (x_a, y_a), (x_b, y_b) and (x_c, y_c) respectively represent the horizontal and vertical coordinates of the corresponding center points;
the fifth step: collecting videos containing airplane motion and uniformly processing them into videos of fixed length to jointly form a training database, wherein the videos are randomly divided into two parts, 80% of the videos are input into the R-FCN as a training set, and the remaining 20% of the videos are predicted with the model obtained through training to obtain the airplane tracking result.
CN201810079824.0A 2018-01-27 2018-01-27 Airplane tracking method based on full-connection network and Kalman filter Active CN108492324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810079824.0A CN108492324B (en) 2018-01-27 2018-01-27 Airplane tracking method based on full-connection network and Kalman filter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810079824.0A CN108492324B (en) 2018-01-27 2018-01-27 Airplane tracking method based on full-connection network and Kalman filter

Publications (2)

Publication Number Publication Date
CN108492324A true CN108492324A (en) 2018-09-04
CN108492324B CN108492324B (en) 2021-05-11

Family

ID=63343824

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810079824.0A Active CN108492324B (en) 2018-01-27 2018-01-27 Airplane tracking method based on full-connection network and Kalman filter

Country Status (1)

Country Link
CN (1) CN108492324B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109684991A (en) * 2018-12-24 2019-04-26 北京旷视科技有限公司 Image processing method, device, electronic equipment and storage medium
CN109816071A (en) * 2019-02-12 2019-05-28 河南工程学院 A kind of indoor objects method for tracing based on RFID
CN110033050A (en) * 2019-04-18 2019-07-19 杭州电子科技大学 A kind of water surface unmanned boat real-time target detection calculation method
CN110070565A (en) * 2019-03-12 2019-07-30 杭州电子科技大学 A kind of ship trajectory predictions method based on image superposition
CN110490901A (en) * 2019-07-15 2019-11-22 武汉大学 The pedestrian detection tracking of anti-attitudes vibration
CN112464886A (en) * 2020-12-14 2021-03-09 上海交通大学 Aircraft identification tracking method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110288772A1 (en) * 2010-05-19 2011-11-24 Denso Corporation Current position detector for vehicle
CN103064086A (en) * 2012-11-04 2013-04-24 北京工业大学 Vehicle tracking method based on depth information
CN103716867A (en) * 2013-10-25 2014-04-09 华南理工大学 Wireless sensor network multiple target real-time tracking system based on event drive
CN104951084A (en) * 2015-07-30 2015-09-30 京东方科技集团股份有限公司 Eye-tracking method and device
CN106780542A (en) * 2016-12-29 2017-05-31 北京理工大学 A kind of machine fish tracking of the Camshift based on embedded Kalman filter

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110288772A1 (en) * 2010-05-19 2011-11-24 Denso Corporation Current position detector for vehicle
CN103064086A (en) * 2012-11-04 2013-04-24 北京工业大学 Vehicle tracking method based on depth information
CN103716867A (en) * 2013-10-25 2014-04-09 华南理工大学 Wireless sensor network multiple target real-time tracking system based on event drive
CN104951084A (en) * 2015-07-30 2015-09-30 京东方科技集团股份有限公司 Eye-tracking method and device
CN106780542A (en) * 2016-12-29 2017-05-31 北京理工大学 A kind of machine fish tracking of the Camshift based on embedded Kalman filter

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Dan Simon: "Training radial basis neural networks with the extended Kalman filter", Neurocomputing *
史阳春 et al.: "An interference suppression system based on neural networks", Journal of Wuhan University (Engineering Edition) *
唐新星 et al.: "Active visual tracking with an RBF neural network based on the extended Kalman filter algorithm", Manufacturing Automation *
曲仕茹 et al.: "Multi-target detection and tracking in video sequences using a Kalman-BP neural network", Infrared and Laser Engineering *
王化明 et al.: "3D visual tracking based on a CMAC neural network and a Kalman filter", Journal of Southeast University *
陈玲玲 et al.: "Target tracking fusing the Kalman filter and the TLD algorithm", 2015 27th Chinese Control and Decision Conference (CCDC) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109684991A (en) * 2018-12-24 2019-04-26 北京旷视科技有限公司 Image processing method, device, electronic equipment and storage medium
CN109684991B (en) * 2018-12-24 2021-10-01 北京旷视科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN109816071A (en) * 2019-02-12 2019-05-28 河南工程学院 A kind of indoor objects method for tracing based on RFID
CN110070565A (en) * 2019-03-12 2019-07-30 杭州电子科技大学 A kind of ship trajectory predictions method based on image superposition
CN110033050A (en) * 2019-04-18 2019-07-19 杭州电子科技大学 A kind of water surface unmanned boat real-time target detection calculation method
CN110033050B (en) * 2019-04-18 2021-06-22 杭州电子科技大学 Real-time target detection and calculation method for unmanned surface vehicle
CN110490901A (en) * 2019-07-15 2019-11-22 武汉大学 The pedestrian detection tracking of anti-attitudes vibration
CN112464886A (en) * 2020-12-14 2021-03-09 上海交通大学 Aircraft identification tracking method

Also Published As

Publication number Publication date
CN108492324B (en) 2021-05-11

Similar Documents

Publication Publication Date Title
CN108492324B (en) Airplane tracking method based on full-connection network and Kalman filter
CN113269098B (en) Multi-target tracking positioning and motion state estimation method based on unmanned aerial vehicle
CN110222581B (en) Binocular camera-based quad-rotor unmanned aerial vehicle visual target tracking method
CN107230218B (en) Method and apparatus for generating confidence measures for estimates derived from images captured by vehicle-mounted cameras
CN108573496B (en) Multi-target tracking method based on LSTM network and deep reinforcement learning
CN104200494B (en) Real-time visual target tracking method based on light streams
CN111932588A (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
CN110738690A (en) unmanned aerial vehicle video middle vehicle speed correction method based on multi-target tracking framework
CN108022254B (en) Feature point assistance-based space-time context target tracking method
CN106803265A (en) Multi-object tracking method based on optical flow method and Kalman filtering
CN107301657A (en) A kind of video target tracking method for considering target movable information
CN112052802A (en) Front vehicle behavior identification method based on machine vision
CN106599918B (en) vehicle tracking method and system
CN111680713A (en) Unmanned aerial vehicle ground target tracking and approaching method based on visual detection
CN111402303A (en) Target tracking architecture based on KFSTRCF
US11080562B1 (en) Key point recognition with uncertainty measurement
Shiba et al. A fast geometric regularizer to mitigate event collapse in the contrast maximization framework
CN118189959A (en) Unmanned aerial vehicle target positioning method based on YOLO attitude estimation
CN116977902B (en) Target tracking method and system for on-board photoelectric stabilized platform of coastal defense
Kang et al. Robust visual tracking framework in the presence of blurring by arbitrating appearance-and feature-based detection
Xiao et al. Research on uav multi-obstacle detection algorithm based on stereo vision
CN115100565B (en) Multi-target tracking method based on spatial correlation and optical flow registration
CN106934818B (en) Hand motion tracking method and system
CN114782484A (en) Multi-target tracking method and system for detection loss and association failure
CN109754412B (en) Target tracking method, target tracking apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant