CN107678024B - Light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection - Google Patents

Light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection

Info

Publication number
CN107678024B
CN107678024B (application CN201710992916.3A)
Authority
CN
China
Prior art keywords
target
radar
infrared
unmanned aerial
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710992916.3A
Other languages
Chinese (zh)
Other versions
CN107678024A (en)
Inventor
陈唯实
李敬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Academy of Civil Aviation Science and Technology
Original Assignee
China Academy of Civil Aviation Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Academy of Civil Aviation Science and Technology filed Critical China Academy of Civil Aviation Science and Technology
Priority to CN201710992916.3A priority Critical patent/CN107678024B/en
Publication of CN107678024A publication Critical patent/CN107678024A/en
Application granted granted Critical
Publication of CN107678024B publication Critical patent/CN107678024B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/66 — Radar-tracking systems; Analogous systems
    • G01S 13/72 — Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar


Abstract

The invention relates to a fusion tracking method for light and small unmanned aerial vehicles based on combined radar and infrared detection. In the method, a target motion model is established from radar detection data and the target state is estimated from this model; before new radar measurement data arrive, the target state is updated with infrared measurement data, and once new radar measurement data are acquired, the target motion state is updated with fused radar and infrared measurements, thereby realizing fusion tracking of light and small unmanned aerial vehicle targets.

Description

Light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection
Technical Field
The invention relates to a light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection. It belongs to the technical field of low-altitude airspace security monitoring and concerns target tracking and data fusion.
Background
With the rapid development of consumer-grade unmanned aerial vehicle technology, the number of light and small unmanned aerial vehicles, represented by 'Dajiang' (DJI) products, has grown rapidly, and 'black flight' (unauthorized flight) disturbance events at airports in China occur frequently for lack of effective supervision. In response to this severe situation, a series of regulatory measures targeting unmanned aerial vehicles has been introduced; in particular, the civil aviation authority requires owners of civil unmanned aerial vehicles to register under their real names in accordance with the management regulation. Advanced technical means are urgently needed for the supervision of domestic light and small unmanned aerial vehicles.
For 'cooperative' unmanned aerial vehicles that are networked, their flight information can be fed in real time into management systems such as 'UAV Cloud', and the supervision department can query and record unmanned aerial vehicles that stray into restricted areas. For 'cooperative' but non-networked unmanned aerial vehicles, the manufacturer DJI ('Dajiang') monitors the flight state of DJI, 'zero degree' (Zero Tech) and other brands of unmanned aerial vehicles through their flight-control protocols, covering more than 95% of current consumer-grade unmanned aerial vehicles.
For non-cooperative unmanned aerial vehicles, active detection technologies such as radar, electro-optical, acoustic and radio detection are currently the main means of detection and tracking. Radar offers long detection range and stable performance, but the data update rate of a general surveillance radar is limited and a certain proportion of false alarms exists; after a target is found, other detection means are needed for confirmation, so that complementary technical advantages are formed.
Disclosure of Invention
The invention aims to solve the above problems by providing a light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection.
A light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection comprises the following steps:
step 1, modeling target motion;
step 2, tracking the target of the unmanned aerial vehicle based on infrared measurement;
and 3, tracking the target of the unmanned aerial vehicle based on radar and infrared measurement.
The invention has the advantages that:
the light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection can make up the defect of low radar measurement data updating rate, and updates the estimated target state by using infrared measurement during the radar measurement updating interval, so that fusion tracking based on radar and infrared measurement is realized, and the target tracking precision is improved.
Drawings
FIG. 1 is a schematic diagram of a light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection;
FIG. 2 is a radar target tracking trajectory of a light and small unmanned aerial vehicle according to an embodiment of the present invention;
FIG. 3 is an infrared image of a light and small unmanned aerial vehicle according to an embodiment of the present invention;
FIG. 4 is a light and small unmanned aerial vehicle fusion tracking trajectory according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
The invention discloses a light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection, which, as shown in FIG. 1, comprises the following steps:
firstly, modeling target motion;
The target state at time k+1 is estimated from the unmanned aerial vehicle target state at time k and the radar target motion model, as follows:

{x_{k+1|k}, P_{k+1|k}} = f_p{x_k, P_k, Q_k}    (1)

where x_k is the target state, P_k is the target covariance, Q_k is the system process noise, x_{k+1|k} is the estimated target state, P_{k+1|k} is the estimated target covariance, and f_p{·} denotes the state-estimation part of the target motion model. The estimated target state is then updated with measurement data, as shown in the following equation:

{x_{k+1}, P_{k+1}} = f_u{x_{k+1|k}, P_{k+1|k}, z_{k+1}, R_{k+1}}    (2)

where z_{k+1} is the measurement data (a radar measurement, a data-converted infrared measurement, or a fused radar/infrared measurement), R_{k+1} is the measurement noise, and f_u{·} denotes the state-update part of the target motion model; x_{k+1} and P_{k+1} are the target state and target covariance at time k+1, respectively. The target motion model may be built with linear or nonlinear methods, including but not limited to Kalman filtering, extended Kalman filtering and interacting multiple models. The unmanned aerial vehicle target state is represented as

x_k = [x  y  ẋ  ẏ]    (3)

where [x  y] is the target position in a two-dimensional coordinate system and [ẋ  ẏ] is the target velocity in the same coordinate system.
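To make the roles of f_p{·} and f_u{·} concrete, the following is a minimal sketch (in Python, not part of the patent) of the prediction and update steps using an ordinary linear Kalman filter with a constant-velocity model over the state [x, y, ẋ, ẏ]; the sampling interval dt, the measurement matrix and all noise values are illustrative assumptions rather than values specified in the patent.

import numpy as np

def f_p(x, P, Q, dt=1.0):
    """State-estimation part of the motion model (equation (1)):
    constant-velocity Kalman prediction over the state [x, y, vx, vy]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def f_u(x_pred, P_pred, z, R):
    """State-update part of the motion model (equation (2)):
    Kalman update with a position measurement z = [x, y]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # position is measured, velocity is not
    S = H @ P_pred @ H.T + R                    # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

For example, with x_k = [0, 1000, 0, 10] and dt = 1, f_p predicts the position [0, 1010], which matches the first predicted state of the embodiment below.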
Step 2, tracking the target of the unmanned aerial vehicle based on infrared measurement;
the data update rate of radar systems is generally lower than that of infrared systems, for example, infrared systems can update measurement data once every moment, while radar systems can update measurement data once only at n moments. Therefore, the motion state of the target is updated by adopting infrared measurement at the time from k +1 to k + n-1, namely during two data updating intervals of the radar system. Data conversion of the infrared measurements is required before the target status is updated with the infrared data.
The radar monitoring system for the light and small unmanned aerial vehicle typically uses a two-coordinate radar, whose measurement data are expressed as

z_R = [x  y]    (4)

where z_R denotes the measurement data of the radar system. The radial distance D between the target and the radar is calculated as

D = sqrt(x^2 + y^2)    (5)
The infrared monitoring system of the light and small unmanned aerial vehicle can only obtain the azimuth of the target, and its measurement data are expressed as

z_I = θ    (6)

where z_I denotes the measurement data of the infrared system, with value θ. The measurement noise is

R_I = σ_θ    (7)

where R_I denotes the measurement noise of the infrared system, with value σ_θ. Combining the radial distance of the target, the infrared target measurement and noise used to update the target state after data conversion are

z_{I-R} = [D·cosθ  D·sinθ]    (8)

[Equation (9), shown only as an image in the original: conversion of the infrared noise R_I into the Cartesian measurement noise R_{I-R}.]

where z_{I-R} and R_{I-R} are the data-converted infrared target measurement and noise, respectively.
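A minimal sketch of the data conversion of equations (4) to (8) is given below. The noise conversion of equation (9) is not legible in the source, so the sketch replaces it with a first-order propagation of the bearing noise σ_θ (plus an assumed range uncertainty sigma_D, introduced here only so the resulting covariance is invertible); the function names and that substitution are assumptions for illustration, not the patent's formulas.

import numpy as np

def radial_distance(z_R):
    """Equation (5): radial distance D from the Cartesian position z_R = [x, y]."""
    x, y = z_R
    return np.hypot(x, y)

def convert_ir_measurement(theta, D):
    """Equation (8): map the bearing-only infrared measurement theta onto
    Cartesian coordinates using the radial distance D taken from the radar track."""
    return np.array([D * np.cos(theta), D * np.sin(theta)])

def convert_ir_noise(sigma_theta, theta, D, sigma_D=1.0):
    """Stand-in for equation (9) (illustrative assumption): propagate the bearing
    noise sigma_theta and an assumed range uncertainty sigma_D through
    [D*cos(theta), D*sin(theta)] by first-order linearisation."""
    J = np.array([[np.cos(theta), -D * np.sin(theta)],
                  [np.sin(theta),  D * np.cos(theta)]])   # Jacobian w.r.t. [D, theta]
    S = np.diag([sigma_D ** 2, sigma_theta ** 2])
    return J @ S @ J.T                                     # 2x2 noise covariance R_{I-R}

With the embodiment's numbers (D = 1010 and a bearing near 90 degrees), convert_ir_measurement returns a position close to [0, 1010], which can then be passed to f_u as the measurement.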
During the interval between radar measurement updates, the target state is estimated with equation (1):

{x_{k+i|k+i-1}, P_{k+i|k+i-1}} = f_p{x_{k+i-1}, P_{k+i-1}, Q_{k+i-1}},  i = 1, …, n-1    (10)

where the input covariance and process noise are those of the radar-based motion model. The target state is then updated with equation (2):

{x_{k+i}, P_{k+i}} = f_u{x_{k+i|k+i-1}, P_{k+i|k+i-1}, z_{I-R}, R_{I-R}}    (11)

where the input measurement and noise are the data-converted infrared measurement z_{I-R} and noise R_{I-R} of equations (8) and (9).
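Putting equations (10) and (11) together, the loop below sketches how the track could be propagated and corrected with infrared-only data between two radar updates; it reuses the illustrative f_p, f_u and conversion helpers sketched above and is not the patent's reference implementation.

def track_between_radar_updates(x, P, Q, ir_bearings, sigma_theta, dt=1.0):
    """Equations (10)-(11): for each time step without a radar report, predict
    with the motion model and update with the converted infrared bearing."""
    track = []
    for theta in ir_bearings:                          # times k+1 ... k+n-1
        x_pred, P_pred = f_p(x, P, Q, dt)              # equation (10)
        D = radial_distance(x_pred[:2])                # range taken from the predicted track
        z_ir = convert_ir_measurement(theta, D)        # equation (8)
        R_ir = convert_ir_noise(sigma_theta, theta, D) # stand-in for equation (9)
        x, P = f_u(x_pred, P_pred, z_ir, R_ir)         # equation (11)
        track.append(x.copy())
    return x, P, track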
Step 3, tracking the target of the unmanned aerial vehicle based on radar and infrared measurement;
At time k+n the radar system acquires new measurement data, and the radar and infrared measurements are fused to track the unmanned aerial vehicle target. First, the target state is estimated with equation (1):

{x_{k+n|k+n-1}, P_{k+n|k+n-1}} = f_p{x_{k+n-1}, P_{k+n-1}, Q_{k+n-1}}    (12)

where the input covariance and process noise are those of the radar-based motion model, and x_{k+n|k+n-1} and P_{k+n|k+n-1} are the estimated state and estimated covariance of the target at time k+n. The target state is then updated with equation (2):

{x_{k+n}, P_{k+n}} = f_u{x_{k+n|k+n-1}, P_{k+n|k+n-1}, z_{k+n}, R_{k+n}}    (13)

where x_{k+n} and P_{k+n} are the updated target state and target covariance at time k+n. The target measurement z_{k+n} at time k+n is the fusion of the radar measurement z_R and the data-converted infrared measurement z_{I-R} of equation (8):

[Equation (14), shown only as an image in the original: fusion of z_R and z_{I-R} into the fused measurement z_{k+n}.]

The noise matrix likewise fuses the radar measurement noise R_R and the data-converted infrared measurement noise R_{I-R}, converted as in equation (9):

[Equation (15), shown only as an image in the original: fusion of R_R and R_{I-R} into R_{k+n}.]
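The fusion formulas (14) and (15) are not legible in the source; a common choice, used below purely as an assumption, is an inverse-covariance (information-form) combination of the radar and converted-infrared position measurements. The sketch shows that assumed fusion feeding the fused measurement and noise into f_u at time k+n.

def fuse_measurements(z_R, R_R, z_ir, R_ir):
    """Assumed stand-in for equations (14)-(15): inverse-covariance-weighted
    fusion of the radar and converted infrared position measurements."""
    W_R = np.linalg.inv(R_R)
    W_I = np.linalg.inv(R_ir)
    R_fused = np.linalg.inv(W_R + W_I)                 # fused measurement noise
    z_fused = R_fused @ (W_R @ z_R + W_I @ z_ir)       # fused measurement
    return z_fused, R_fused

def radar_ir_update(x, P, Q, z_R, R_R, theta, sigma_theta, dt=1.0):
    """Equations (12)-(13): predict to time k+n, then update with the fused data."""
    x_pred, P_pred = f_p(x, P, Q, dt)                  # equation (12)
    D = radial_distance(z_R)
    z_ir = convert_ir_measurement(theta, D)
    R_ir = convert_ir_noise(sigma_theta, theta, D)
    z_fused, R_fused = fuse_measurements(z_R, R_R, z_ir, R_ir)
    return f_u(x_pred, P_pred, z_fused, R_fused)       # equation (13)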
Embodiment:
The light and small unmanned aerial vehicle fusion tracking method based on radar and infrared joint detection is described below with reference to the light and small unmanned aerial vehicle target tracking results obtained from radar and infrared data, shown in the accompanying drawings.
The invention relates to a light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection, which comprises the following steps:
step 1, modeling target motion;
The target state at time k+1 is estimated from the unmanned aerial vehicle target state at time k and the radar target motion model:

{x_{k+1|k}, P_{k+1|k}} = f_p{x_k, P_k, Q_k}    (1)

where x_k is the target state, P_k is the target covariance, Q_k is the system process noise, x_{k+1|k} is the estimated target state, P_{k+1|k} is the estimated target covariance, and f_p{·} denotes the state-estimation part of the target motion model. The estimated target state is then updated with measurement data:

{x_{k+1}, P_{k+1}} = f_u{x_{k+1|k}, P_{k+1|k}, z_{k+1}, R_{k+1}}    (2)

where z_{k+1} is the measurement data (a radar measurement, a data-converted infrared measurement, or a fused radar/infrared measurement), R_{k+1} is the measurement noise, and f_u{·} denotes the state-update part of the target motion model. In this example the target motion model is built with Kalman filtering. The unmanned aerial vehicle target state is

x_k = [x  y  ẋ  ẏ]    (3)

where [x  y] is the target position and [ẋ  ẏ] is the target velocity.
In this example, the target state of the unmanned aerial vehicle target at time k is

x_k = [0  1000  0  10]

and the estimated state at time k+1 is

x_{k+1|k} = [0  1010  0  10]

Based on the target motion model, when no radar measurement is available, the estimated target state at times k+1 to k+5 is computed with equation (1) alone and is not corrected with equation (2); the estimated state is taken as an approximation of the target state, giving the following target states at the successive times:

x_{k+1} = [0  1010  0.1  10]
x_{k+2} = [0.1  1020  -0.1  10.1]
x_{k+3} = [0  1030.1  -0.1  10.2]
x_{k+4} = [-0.1  1040.3  0  10.1]
x_{k+5} = [-0.1  1050.4  0  10.1]
The tracking trajectory of the radar target based on the above data is shown in fig. 2.
Step 2, tracking the target of the unmanned aerial vehicle based on infrared measurement;
the data update rate of the radar system is generally lower than that of the infrared system, in this example, the infrared system can update the measurement data once at each moment, and the radar system can update the measurement data once only at 5 moments. Therefore, at the time points k +1 to k +4, namely during the two data updating intervals of the radar system, the motion state of the target is updated by using infrared measurement, and the infrared measurement data at a certain time point is shown in fig. 3. Data conversion of the infrared measurements is required before the target status is updated with the infrared data.
The radar monitoring system for the light and small unmanned aerial vehicle typically uses a two-coordinate radar, whose measurement data are expressed as

z_R = [x  y]    (4)

In this example the radar cannot acquire a measurement at time k+1, so the position components of the estimated state x_{k+1|k} are used as an approximate substitute, i.e.

z_{R,k+1} ≈ [0  1010]

The radial distance D between the target and the radar is calculated as

D = sqrt(x^2 + y^2)    (5)

and in this example D_{k+1} = 1010.
the infrared monitoring system of the light and small unmanned aerial vehicle can only obtain the azimuth information of the target, and the measured data is expressed as
zI=θ (6)
In the present case, it is preferred that,
Figure BDA0001441806970000063
measuring the noise as
RI=σθ (7)
In the present case, it is preferred that,
Figure BDA0001441806970000071
combining the radial distance of the target, the infrared target measurement and the noise are used for updating the target state after data conversion, as follows
zI-R=[D·cosθ D·sinθ] (8)
Figure BDA0001441806970000072
In the present case, it is preferred that,
Figure BDA0001441806970000073
Figure BDA0001441806970000074
during the radar measurement data updating interval, the target state is estimated by using the formula (1) as follows
Figure BDA0001441806970000075
Wherein the input variables are the covariance of the radar system
Figure BDA0001441806970000076
And process noise
Figure BDA0001441806970000077
Then, the target state is updated by the following equation (2)
Figure BDA0001441806970000078
Wherein the input variable is measured by infrared after data conversion
Figure BDA0001441806970000079
And noise
Figure BDA00014418069700000710
See equations (8) and (9). In this example, the target state is updated using infrared measurements, and the target states at times k +1 to k +4 are as follows
xk+1=[8.5 1009.5 0.6 9.2]
xk+2=[9.2 1020.5 0.5 10.5]
xk+3=[7.1 1030.2 -0.4 9.8]
xk+4=[5.8 1039.9 -0.8 10]
Step 3, tracking the target of the unmanned aerial vehicle based on radar and infrared measurement;
At time k+n the radar system acquires new measurement data, and the radar and infrared measurements are fused to track the unmanned aerial vehicle target. First, the target state is estimated with equation (1):

{x_{k+n|k+n-1}, P_{k+n|k+n-1}} = f_p{x_{k+n-1}, P_{k+n-1}, Q_{k+n-1}}    (12)

where the input covariance and process noise are those of the radar-based motion model. The target state is then updated with equation (2):

{x_{k+n}, P_{k+n}} = f_u{x_{k+n|k+n-1}, P_{k+n|k+n-1}, z_{k+n}, R_{k+n}}    (13)

where the target measurement z_{k+n} is the fusion of the radar measurement z_R and the data-converted infrared measurement z_{I-R} of equation (8):

[Equation (14), shown only as an image in the original: fusion of z_R and z_{I-R} into z_{k+n}.]

In this example n = 5; the resulting fused measurement appears only as an image in the original.
The noise matrix likewise fuses the radar and infrared system noise:

[Equation (15), shown only as an image in the original: fusion of the radar measurement noise R_R and the data-converted infrared measurement noise R_{I-R} of equation (9) into R_{k+n}; the example value also appears only as an image.]
with equations (12) and (13), the state of the target at time k +5 is
xk+5=[0.8 1050.6 -0.7 10.1]
Based on the fusion tracking data in the step 3 and the step, the radar and infrared fusion tracking track of the light small unmanned aerial vehicle is shown in fig. 4.
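As an illustration of how the steps of the embodiment fit together, the sketch below runs the helpers defined earlier over one radar period with n = 5, starting from the embodiment's initial state x_k = [0, 1000, 0, 10]; the covariances, noise magnitudes, bearing sequence and radar measurement are made-up placeholder values, so the resulting numbers will not reproduce the embodiment's figures exactly.

import numpy as np

# Initial state from the embodiment; covariance and noise values are assumed.
x = np.array([0.0, 1000.0, 0.0, 10.0])      # [x, y, vx, vy]
P = np.eye(4) * 10.0
Q = np.eye(4) * 0.1
R_R = np.eye(2) * 4.0                        # assumed radar position noise
sigma_theta = 0.01                           # assumed infrared bearing noise (rad)

# Times k+1 ... k+4: infrared-only updates (placeholder bearings near 90 degrees).
bearings = [1.562, 1.565, 1.568, 1.571]
x, P, _ = track_between_radar_updates(x, P, Q, bearings, sigma_theta)

# Time k+5: a new radar report arrives and is fused with the converted infrared bearing.
z_R = np.array([0.0, 1050.0])                # placeholder radar measurement
x, P = radar_ir_update(x, P, Q, z_R, R_R, theta=1.571, sigma_theta=sigma_theta)
print("fused state at k+5:", x)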

Claims (1)

1. A light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection comprises the following steps:
firstly, modeling target motion;
estimating the target state at time k+1 based on the unmanned aerial vehicle target state at time k and the radar target motion model:

{x_{k+1|k}, P_{k+1|k}} = f_p{x_k, P_k, Q_k}    (1)

where x_k is the target state, P_k is the target covariance, Q_k is the system process noise, x_{k+1|k} is the estimated target state, P_{k+1|k} is the estimated target covariance, and f_p{·} denotes the state-estimation part of the target motion model; updating the estimated target state based on the measurement data:

{x_{k+1}, P_{k+1}} = f_u{x_{k+1|k}, P_{k+1|k}, z_{k+1}, R_{k+1}}    (2)

where z_{k+1} is the measurement data, R_{k+1} is the measurement noise, and f_u{·} denotes the state-update part of the target motion model; x_{k+1} and P_{k+1} are the target state and target covariance at time k+1, respectively;

the unmanned aerial vehicle target state being:

x_k = [x  y  ẋ  ẏ]    (3)

where [x  y] is the target position in a two-dimensional coordinate system and [ẋ  ẏ] is the target velocity in the two-dimensional coordinate system;
step 2, tracking the target of the unmanned aerial vehicle based on infrared measurement;
establishing the light and small unmanned aerial vehicle radar monitoring system with a two-coordinate radar, whose measurement data are:

z_R = [x  y]    (4)

where z_R denotes the measurement data of the radar system;

the radial distance D of the target from the radar being:

D = sqrt(x^2 + y^2)    (5)
the infrared monitoring system of the light and small unmanned aerial vehicle obtaining the azimuth information of the target, its measurement data being:

z_I = θ    (6)

where z_I denotes the measurement data of the infrared system, with value θ;

the measurement noise being:

R_I = σ_θ    (7)

where R_I denotes the measurement noise of the infrared system, with value σ_θ;

combining the radial distance of the target, the infrared target measurement and noise used to update the target state after data conversion being:

z_{I-R} = [D·cosθ  D·sinθ]    (8)

[Equation (9), shown only as an image in the original: conversion of the infrared noise into the Cartesian measurement noise R_{I-R}.]

where z_{I-R} and R_{I-R} are the data-converted infrared target measurement and noise, respectively;
during the interval between radar measurement updates, estimating the target state with equation (1):

{x_{k+i|k+i-1}, P_{k+i|k+i-1}} = f_p{x_{k+i-1}, P_{k+i-1}, Q_{k+i-1}},  i = 1, …, n-1    (10)

where the input covariance and process noise are those of the radar-based motion model;

then updating the target state with equation (2):

{x_{k+i}, P_{k+i}} = f_u{x_{k+i|k+i-1}, P_{k+i|k+i-1}, z_{I-R}, R_{I-R}}    (11)

where the input measurement and noise are the data-converted infrared measurement z_{I-R} and noise R_{I-R};
Step 3, tracking the target of the unmanned aerial vehicle based on radar and infrared measurement;
at time k+n, the radar system acquires new measurement data, the radar and infrared measurements are fused, and the unmanned aerial vehicle target is tracked; firstly, the target state is estimated with equation (1):

{x_{k+n|k+n-1}, P_{k+n|k+n-1}} = f_p{x_{k+n-1}, P_{k+n-1}, Q_{k+n-1}}    (12)

where the input covariance and process noise are those of the radar-based motion model, and x_{k+n|k+n-1} and P_{k+n|k+n-1} are the estimated state and estimated covariance of the target at time k+n;

then, the target state is updated with equation (2):

{x_{k+n}, P_{k+n}} = f_u{x_{k+n|k+n-1}, P_{k+n|k+n-1}, z_{k+n}, R_{k+n}}    (13)

where x_{k+n} and P_{k+n} are the updated target state and target covariance at time k+n; the target measurement z_{k+n} at time k+n is the fusion of the radar measurement z_R and the data-converted infrared measurement z_{I-R} of equation (8):

[Equation (14), shown only as an image in the original: fusion of z_R and z_{I-R} into z_{k+n}.]

and the noise matrix likewise fuses the radar measurement noise R_R and the data-converted infrared measurement noise R_{I-R} of equation (9):

[Equation (15), shown only as an image in the original: fusion of R_R and R_{I-R} into R_{k+n}.]
CN201710992916.3A 2017-10-23 2017-10-23 Light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection Active CN107678024B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710992916.3A CN107678024B (en) 2017-10-23 2017-10-23 Light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection


Publications (2)

Publication Number Publication Date
CN107678024A CN107678024A (en) 2018-02-09
CN107678024B true CN107678024B (en) 2020-12-29

Family

ID=61140964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710992916.3A Active CN107678024B (en) 2017-10-23 2017-10-23 Light and small unmanned aerial vehicle fusion tracking method based on radar and infrared combined detection

Country Status (1)

Country Link
CN (1) CN107678024B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113721642B (en) * 2021-02-25 2023-12-19 北京理工大学 Unmanned aerial vehicle countering control method integrating detection, tracking and treatment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07229960A * 1994-02-16 1995-08-29 Mitsubishi Heavy Ind Ltd All-round visible display apparatus for aircraft
JPH08211146A * 1995-02-02 1996-08-20 Mitsubishi Heavy Ind Ltd Radar unit
CN101661104A * 2009-09-24 2010-03-03 Beihang University Target tracking method based on radar/infrared measurement data coordinate conversion
JP2011047882A * 2009-08-28 2011-03-10 Toshiba Corp Target-tracking system
CN102831620A * 2012-08-03 2012-12-19 Nanjing University of Science and Technology Infrared dim target searching and tracking method based on multi-hypothesis tracking data association
CN104252178A * 2014-09-12 2014-12-31 Xidian University Strong-maneuver-based target tracking method
CN105739335A * 2015-12-29 2016-07-06 China Academy of Civil Aviation Science and Technology Airport bird detection, early warning and repelling linkage system
CN106204629A * 2016-08-17 2016-12-07 Xidian University On-orbit moving target detection method fusing space-based radar and infrared data
CN106546975A * 2016-10-14 2017-03-29 China Academy of Civil Aviation Science and Technology Method for classifying and identifying small unmanned aerial vehicles and birds based on radar data


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Data Fusion of Infrared and Radar for Target Tracking; Anfu Zhu et al.; IEEE; 2008-12-31; pp. 1-4 *
An infrared/radar dual-sensor fusion target tracking algorithm; Zheng Liyi; PLC & FA; 2005-02-28; pp. 106-108 *

Also Published As

Publication number Publication date
CN107678024A (en) 2018-02-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant