CN111693051B - Multi-target data association method based on photoelectric sensor - Google Patents


Info

Publication number
CN111693051B
CN111693051B CN202010482930.0A
Authority
CN
China
Prior art keywords
target
information
sensor
angle measurement
measurement information
Prior art date
Legal status
Active
Application number
CN202010482930.0A
Other languages
Chinese (zh)
Other versions
CN111693051A (en)
Inventor
Zhang Yan (张艳)
Chen Jintao (陈金涛)
Yang Xuerong (杨雪榕)
Wang Shuang (王爽)
Qu Chengzhi (曲承志)
Zhang Xin (张鑫)
Current Assignee
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN202010482930.0A
Publication of CN111693051A
Application granted
Publication of CN111693051B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a multi-target data association method based on photoelectric sensors, comprising the following steps: judging, from the angle measurement information acquired by different sensors, whether that information originates from the same target; once the information is judged to come from the same target, obtaining the three-dimensional space coordinate information of the target; and updating the track of each target with a weighted-sum-distance nearest-neighbor point-track-to-track association algorithm. With this method and device, multi-target tracking can be realized even when only measurements of the target azimuth and pitch angles are available. The method can be widely applied in the fields of multi-target tracking and multi-sensor information processing.

Description

Multi-target data association method based on photoelectric sensor
Technical Field
The invention relates to the field of multi-target tracking and multi-sensor information processing, in particular to a multi-target data association method based on a photoelectric sensor.
Background
In the field of multi-target tracking, data association is a key enabling technology, and how to use measurement information to perform data association is a core problem. In complex modern environments, a single sensor cannot provide comprehensive and accurate air situation information; combining the measurements of multiple sensors with information processing and fusion technology gives a better grasp of the real-time battlefield situation. However, a single sensor currently cannot realize multi-target tracking when only measurements of the target azimuth and pitch angles can be obtained.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a multi-target data association method based on a photoelectric sensor, which can achieve multi-target tracking under the condition that only measurement information of a target azimuth angle and a target pitch angle can be obtained.
The technical scheme adopted by the invention is as follows: a multi-target data association method based on photoelectric sensors comprises the following steps:
acquiring angle measurement information of a target through different sensors;
judging whether the angle measurement information obtained by different sensors is from the same target;
when the angle measurement information from different sensors at the current moment is judged to come from the same target, acquiring the three-dimensional space coordinate information of the target from the angle measurement information;
and obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining this with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with that actual point-trace information.
Further, obtaining the angle measurement information of the target through different sensors specifically comprises obtaining azimuth angle measurement information, pitch angle measurement information and a measurement model of the target through two sensors, the measurement model being expressed as follows:

A_n^i(k) = Ā_n^i(k) + ΔA;

E_n^i(k) = Ē_n^i(k) + ΔE;

where A_n^i(k) and E_n^i(k) are the observed values of the azimuth angle and pitch angle of target i by sensor n, Ā_n^i(k) and Ē_n^i(k) are the corresponding actual values, and ΔA and ΔE are the azimuth angle deviation and pitch angle deviation of the sensor at the current moment.
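A minimal sketch (not part of the patent text) of this additive measurement model, in which the observed angle equals the actual angle plus the sensor's current deviation; the function name is an illustrative assumption:

```python
def measure(actual_azimuth, actual_pitch, delta_a, delta_e):
    """Additive measurement model for a sensor observing a target.

    The observed azimuth/pitch equal the actual angles plus the
    sensor's current azimuth and pitch deviations (Delta A, Delta E).
    """
    observed_azimuth = actual_azimuth + delta_a
    observed_pitch = actual_pitch + delta_e
    return observed_azimuth, observed_pitch
```

In a simulation, delta_a and delta_e would typically be drawn from a zero-mean noise distribution at each sampling instant.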
Further, the step of determining whether the angle measurement information obtained by the different sensors is from the same target specifically includes:
preliminarily judging whether two pairs of angle measurement information from different sensors are from the same target or not according to the azimuth angle information;
and judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target.
Further, the step of preliminarily judging, according to the azimuth angle information, whether two pairs of angle measurement information from different sensors come from the same target is specifically: if the azimuth information satisfies any one of five conditions (given as equation images in the original publication and not reproduced here), it is preliminarily judged that the two pairs of angle measurement information come from the same target, where A_1^i is the azimuth information of target i received by the first sensor and A_2^j is the azimuth information of target j received by the second sensor.
Further, the step of judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target comprises the following steps:
determining rays related to the target according to azimuth angle information and pitch angle information of the target by taking coordinates of the two sensors as starting points respectively to obtain a first ray and a second ray;
obtaining, from the azimuth angle information, the x-axis and y-axis coordinates of the intersection point of the projection lines of the first ray and the second ray in the x-y plane;
obtaining, from the pitch angle information, the z-axis coordinates of the target along the first ray and the second ray in the space coordinate systems of the first sensor and the second sensor;
and judging whether the angle measurement information of the first sensor and the second sensor is from the same target or not according to the gating technology and the z-axis coordinate information.
Further, the three-dimensional space coordinate information of the target is obtained from the angle measurement information by the following expression:

[x_R, y_R, z_R] = [x, y_1, 0.5(z_1 + z_2)];

where [x_R, y_R, z_R] is the three-dimensional space coordinate information of target R.
Further, the method also comprises: when the angle measurement information from different sensors is judged not to originate from the same target, marking it as an impossible association pair and generating the three-dimensional space coordinates of a false target, specifically:

[x_0, y_0, z_0] = [0, 0, 0];

where [x_0, y_0, z_0] is the three-dimensional space coordinate information of the false target o.
Further, the step of obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining this with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with that actual point-trace information specifically comprises:
generating position information of the current moment according to the state information of the target at the previous moment to obtain a predicted trace;
obtaining an actual point trace according to the three-dimensional space coordinate information of the target;
and calculating a weighted sum distance according to the predicted trace point and the actual trace point, and selecting the actual trace point with the minimum weighted sum distance to update the track of the target.
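A minimal sketch (not part of the patent text) of this selection step; the default weight values a = b = 0.35, c = 0.3 follow the simulation parameters given later in the description, and the function names are illustrative:

```python
def weighted_sum_distance(pred, actual, a=0.35, b=0.35, c=0.30):
    """Weighted sum distance D_li between a predicted trace and an actual trace."""
    return (a * abs(actual[0] - pred[0])
            + b * abs(actual[1] - pred[1])
            + c * abs(actual[2] - pred[2]))

def nearest_actual_trace(pred, actual_traces, a=0.35, b=0.35, c=0.30):
    """Select the actual trace with the minimum weighted sum distance to pred."""
    return min(actual_traces,
               key=lambda p: weighted_sum_distance(pred, p, a, b, c))
```

The trace returned by nearest_actual_trace is the one used to update the target's track.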
Further, the step of generating the position information of the current moment from the state information of the target at the previous moment to obtain the predicted trace is performed according to the following expressions:

x_i(k+1) = x_i(k) + T·vx_i(k);

y_i(k+1) = y_i(k) + T·vy_i(k);

z_i(k+1) = z_i(k) + T·vz_i(k);

where T is the sensor sampling time interval, vx_i(k), vy_i(k) and vz_i(k) are the velocity components of target i at time k, and x_i(k+1), y_i(k+1) and z_i(k+1) represent the predicted trace of target i at time (k+1).
Further, the weighted sum distance calculated from the predicted trace and the actual trace is specifically:

D_li = a·|x_l(k+1) - x_i(k+1)| + b·|y_l(k+1) - y_i(k+1)| + c·|z_l(k+1) - z_i(k+1)|;

where D_li is the weighted sum distance between the predicted trace of target i at time (k+1) and point trace l at time (k+1), a, b and c are manually adjustable weight factors, and x_l(k+1), y_l(k+1) and z_l(k+1) represent the three-dimensional space coordinate information of the actual point trace at time (k+1).
The method has the following beneficial effects: the angle measurement information acquired by different sensors is judged to decide whether it comes from the same target; once it is judged to come from the same target, the three-dimensional space coordinate information of the target is obtained; and the track of the target is updated based on a weighted-sum-distance nearest-neighbor point-track-to-track association algorithm, thereby realizing multi-target tracking.
Drawings
FIG. 1 is a flow chart of the steps of a multi-target data association method based on a photoelectric sensor according to the present invention;
FIG. 2 is a diagram illustrating the tracking effect of the embodiment of the present invention on three targets moving along a straight line at a constant speed;
Detailed Description
The invention is described in further detail below with reference to the figures and the specific embodiments. The step numbers in the following embodiments are provided only for convenience of illustration, the order between the steps is not limited at all, and the execution order of each step in the embodiments can be adapted according to the understanding of those skilled in the art.
In recent years, target tracking has mostly been realized with radar equipment, but radar is unsuitable for some platforms, and other measurement sensors alone cannot realize the positioning and tracking of targets.
As shown in FIG. 1, the invention provides a multi-target data association method based on photoelectric sensors, which comprises the following steps:
s101, angle measurement information of the target is obtained through different sensors.
Specifically, the sensors may be photoelectric sensors, which can only acquire measurements of the target azimuth angle A and pitch angle E; these angle measurements are affected by the noise environment and by measurement error.
S102, judging whether the angle measurement information obtained by different sensors is from the same target;
Specifically, the two pairs of angle measurement information from different sensors at the current moment, (A_1^i, E_1^i) and (A_2^j, E_2^j), are combined: the azimuth angle measurements are used to preliminarily judge whether the two pairs may originate from the same target, and the pitch angle measurements are then used to judge again whether the two pairs come from the same target.
S103, judging that the angle measurement information from different sensors at the current moment comes from the same target, and acquiring three-dimensional space coordinate information of the target according to the angle measurement information;
Specifically, if the two pairs of angle measurement information from different sensors at the current moment are judged to come from the same target, the three-dimensional space coordinate information of the target is located by combining the two pairs of angle measurement information; if they are judged unlikely to come from the same target, the two pairs are marked as an impossible association pair.
And S104, obtaining the predicted position information of the target at the current moment according to the state of the target at the previous moment, and making a judgment by combining the actual track point information obtained according to the three-dimensional space coordinate information to update the target track with the actual track point information.
Specifically, the position of the target at the current moment is predicted from motion state attributes such as the position and velocity at the previous moment. This prediction is combined with the actual measured position of the target located from the sensors' angle measurements at the current moment, and the distance between the predicted point and each actual measured point is calculated with a weighted-sum-distance method. Following the idea of the nearest neighbor method, the actual target measurement point closest to the target's one-step predicted position is assigned to that target, completing the point-track-to-track association.
Further, as a preferred embodiment of the method, obtaining the angle measurement information of the target through different sensors specifically comprises obtaining azimuth angle measurement information, pitch angle measurement information and a measurement model of the target through two sensors; the measurement model of sensor n for target i at time k is expressed as follows:

A_n^i(k) = Ā_n^i(k) + ΔA;

E_n^i(k) = Ē_n^i(k) + ΔE;

where A_n^i(k) and E_n^i(k) are the observed values of the azimuth angle and pitch angle of target i by sensor n, Ā_n^i(k) and Ē_n^i(k) are the corresponding actual values, and ΔA and ΔE are the azimuth angle deviation and pitch angle deviation of the sensor at the current moment.
Specifically, the three-dimensional Cartesian space coordinates of the first and second sensors are set to:

SENSOR_1 = [0, 0, 0];

SENSOR_2 = [0, b, 0];
as a preferred embodiment of the method, the step of determining whether the angle measurement information obtained by different sensors is from the same target specifically includes:
preliminarily judging whether two pairs of angle measurement information from different sensors are from the same target or not according to the azimuth angle information;
Specifically, the two pairs of angle measurement information from different sensors at the current moment, (A_1^i, E_1^i) and (A_2^j, E_2^j), are combined, and the azimuth angle measurements are used to preliminarily judge whether the two pairs may originate from the same target. Here (A_1^i, E_1^i) and (A_2^j, E_2^j) are the measurement information of the first and second sensors about the azimuth angle A and pitch angle E of targets i and j, respectively; that is, A_1^i and A_2^j are used to preliminarily judge whether (A_1^i, E_1^i) and (A_2^j, E_2^j) may originate from the same target.
Then, it is judged again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target; that is, E_1^i and E_2^j are used to judge again whether (A_1^i, E_1^i) and (A_2^j, E_2^j) originate from the same target.
As a preferred embodiment of the method, the step of preliminarily judging, according to the azimuth angle information, whether two pairs of angle measurement information from different sensors come from the same target is specifically: the two pairs of azimuth information satisfy any one of five conditions (1)-(5), given as equation images in the original publication and not reproduced here. If the two measured azimuth angles satisfy one of the five conditions, it is preliminarily judged that the two pairs of angle measurement information may originate from the same target, where A_1^i is the azimuth information of target i received by the first sensor and A_2^j is the azimuth information of target j received by the second sensor.
Further, as a preferred embodiment of the method, the step of judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target comprises the following steps:
determining rays related to the target according to azimuth angle information and pitch angle information of the target by taking coordinates of the two sensors as starting points respectively to obtain a first ray and a second ray;
obtaining, from the azimuth angle information, the x-axis and y-axis coordinates of the intersection point of the projection lines of the first ray and the second ray in the x-y plane;
obtaining, from the pitch angle information, the z-axis coordinates of the target along the first ray and the second ray in the space coordinate systems of the first sensor and the second sensor;
and judging whether the angle measurement information of the first sensor and the second sensor is from the same target or not according to the gating technology and the z-axis coordinate information.
Specifically, to judge again whether the angle measurement information (A_1^i, E_1^i) from sensor 1 and (A_2^j, E_2^j) from sensor 2 come from the same target: a ray can be uniquely determined by taking the coordinate of sensor n as the starting point and using the azimuth and pitch angle information of target i; this ray is denoted ray n → i.
The azimuth angle measurements are used to obtain the x-axis and y-axis coordinates of the intersection point of the projection lines of ray 1 → i and ray 2 → j in the x-y plane. Taking the azimuth angle from the x-axis, the x-axis coordinate of the intersection point in the space coordinate systems of the first and second sensors is:

x = b / (tan A_1^i - tan A_2^j);

the y-axis coordinate of the intersection point in the space coordinate system of the first sensor is:

y_1 = x · tan A_1^i;

and the y-axis coordinate of the intersection point in the space coordinate system of the second sensor is:

y_2 = y_1 - b;
The pitch angle measurements are used to obtain the z-axis coordinates of targets i and j along ray 1 → i and ray 2 → j in the space coordinate systems of the first and second sensors, respectively. The z-axis coordinate of target i in the space coordinate system of the first sensor is:

z_1 = sqrt(x^2 + y_1^2) · tan E_1^i;

and the z-axis coordinate of target j in the space coordinate system of the second sensor is:

z_2 = sqrt(x^2 + y_2^2) · tan E_2^j;
the method for judging whether two pairs of angle measurement information from the first sensor and the second sensor come from the same target by adopting the gating technology comprises the following steps:
Δz=|z1-z2|;
Δz≤z0
wherein z is0Is a manually set thresholdAnd if the inequality is true, judging that the two pairs of angle measurement information come from the same target.
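A minimal sketch (not part of the patent text) of the triangulation and z-axis gating check. It assumes sensor 1 at (0, 0, 0), sensor 2 at (0, b, 0), and azimuth measured from the x-axis in the x-y plane; the patent's exact angle convention is not reproduced here, so this geometry is an assumption:

```python
import math

def z_gate_check(A1, E1, A2, E2, b, z0):
    """Gating check: do two angle pairs plausibly come from one target?

    Sensor 1 sits at (0, 0, 0) and sensor 2 at (0, b, 0); azimuth is
    taken from the x-axis in the x-y plane (an assumed convention).
    Returns (same_target, z1, z2).
    """
    # Intersection of the two azimuth projection lines in the x-y plane.
    x = b / (math.tan(A1) - math.tan(A2))
    y1 = x * math.tan(A1)   # y-coordinate in sensor 1's frame
    y2 = y1 - b             # y-coordinate in sensor 2's frame
    # z-coordinate of each ray above the intersection point.
    z1 = math.hypot(x, y1) * math.tan(E1)
    z2 = math.hypot(x, y2) * math.tan(E2)
    return abs(z1 - z2) <= z0, z1, z2
```

For a genuine common target the two z-values nearly coincide, so Δz ≤ z_0 accepts the pair; for mismatched targets the heights of the two rays above the intersection point generally differ and the pair is rejected.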
As a preferred embodiment of the method, the three-dimensional space coordinate information of the target is obtained from the angle measurement information by the following expression:

[x_R, y_R, z_R] = [x, y_1, 0.5(z_1 + z_2)];

where [x_R, y_R, z_R] is the three-dimensional space coordinate information of target R.
Further, as a preferred embodiment of the method, the method also comprises: when the angle measurement information from different sensors is judged not to originate from the same target, marking it as an impossible association pair and generating the three-dimensional space coordinates of a false target, expressed as:

[x_0, y_0, z_0] = [0, 0, 0];

where [x_0, y_0, z_0] is the three-dimensional space coordinate information of the false target o.
As a preferred embodiment of the method, the step of obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining this with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with that actual point-trace information specifically comprises:
generating position information of the current moment according to the state information of the target at the previous moment to obtain a predicted trace;
obtaining an actual point trace according to the three-dimensional space coordinate information of the target;
and calculating a weighted sum distance according to the predicted trace point and the actual trace point, and selecting the actual trace point with the minimum weighted sum distance to update the track of the target.
As a further preferred embodiment of the method, the step of generating the position information of the current moment from the state information of the target at the previous moment to obtain the predicted trace comprises:

x_i(k+1) = x_i(k) + T·vx_i(k);

y_i(k+1) = y_i(k) + T·vy_i(k);

z_i(k+1) = z_i(k) + T·vz_i(k);

where T is the sensor sampling time interval, and x_i(k+1), y_i(k+1) and z_i(k+1) represent the predicted trace of target i at time (k+1).

Specifically, assume the state information of target i at time k is the position and velocity vector

X_i(k) = [x_i(k), vx_i(k), y_i(k), vy_i(k), z_i(k), vz_i(k)]^T;

the position prediction information of target i at time (k+1) then follows the expressions above.
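A minimal sketch (not part of the patent text) of this one-step constant-velocity prediction; the (position, velocity) state layout is an assumption for illustration:

```python
def predict_trace(state, T):
    """One-step constant-velocity prediction of a target's position.

    state = (x, vx, y, vy, z, vz) at time k; T is the sensor sampling
    interval. Returns the predicted trace (x, y, z) at time k+1.
    """
    x, vx, y, vy, z, vz = state
    return (x + vx * T, y + vy * T, z + vz * T)
```

The predicted trace is then compared against the actual traces located by the angle-measurement association.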
Further, as a preferred embodiment of the method, the weighted sum distance calculated from the predicted trace and the actual trace is expressed as:

D_li = a·|x_l(k+1) - x_i(k+1)| + b·|y_l(k+1) - y_i(k+1)| + c·|z_l(k+1) - z_i(k+1)|;

where D_li is the weighted sum distance between the predicted trace of target i at time (k+1) and point trace l at time (k+1), a, b and c are manually adjustable weight factors, and x_l(k+1), y_l(k+1) and z_l(k+1) represent the three-dimensional space coordinate information of the actual point trace at time (k+1).
Specifically, the weighted sum distance D_li between every actual point trace and target i at time (k+1) is calculated, and the actual point trace with the minimum weighted sum distance is selected to update the track of target i. An actual point trace can update the track of at most one target, and a target can be matched with at most one actual point trace for track updating. The coordinates of point trace l obtained by angle-measurement association at time (k+1) are:

X_l(k+1) = [x_l(k+1), y_l(k+1), z_l(k+1)]^T;  l ∈ 1…L;

where L is the number of point traces obtained by data association of the angle measurement information at time (k+1).
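The one-to-one matching rule above can be sketched as a greedy assignment (not part of the patent text); taking candidate pairs in order of increasing weighted sum distance is one plausible way to enforce the constraint, not necessarily the patent's exact procedure:

```python
def associate(predicted, actual, a=0.35, b=0.35, c=0.30):
    """Greedy one-to-one point-trace-to-track association.

    Each actual trace updates at most one track and each track takes
    at most one actual trace; candidate pairs are consumed in order of
    increasing weighted sum distance D_li.
    Returns {track index: trace index}.
    """
    def dist(p, q):
        return (a * abs(p[0] - q[0]) + b * abs(p[1] - q[1])
                + c * abs(p[2] - q[2]))

    pairs = sorted(
        ((dist(p, q), i, l) for i, p in enumerate(predicted)
         for l, q in enumerate(actual)),
        key=lambda t: t[0])
    used_tracks, used_traces, assignment = set(), set(), {}
    for _, i, l in pairs:
        if i not in used_tracks and l not in used_traces:
            assignment[i] = l
            used_tracks.add(i)
            used_traces.add(l)
    return assignment
```

Unassigned tracks (for example, those whose only candidates are the false-target coordinates) are simply left without an update at this step.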
Referring to FIG. 2, which shows a simulation result of the invention: the straight thick segment of line 1 is the real track of target 1 and the curved segment of line 1 is its observed track; likewise, lines 2 and 3 show the real and observed tracks of targets 2 and 3. FIG. 2 is the tracking effect of the method using two sensors to track three targets flying in uniform straight lines. The simulation parameters are: number of targets 3; measurement sampling interval 2 s; threshold z_0 = 200 m; sensor S1 at (0, 0, 0); sensor S2 at (0, 10000, 0); weight factors a = 0.35, b = 0.35, c = 0.3; azimuth angle deviation ΔA = 0.05; pitch angle deviation ΔE = 0.05. As FIG. 2 shows, the algorithm can effectively track multiple flying targets in an environment with measurement noise and has a certain robustness.
The present invention provides another embodiment: a multi-target data association system based on photoelectric sensors comprises:
the acquisition module is used for acquiring angle measurement information of a target through different sensors;
the judging module is used for judging whether the angle measurement information acquired by different sensors is from the same target or not;
the actual point-trace module is used for judging that the angle measurement information from different sensors at the current moment comes from the same target and acquiring the three-dimensional space coordinate information of the target from the angle measurement information;
and the track module is used for obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining this with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with that actual point-trace information.
The present invention provides another embodiment: a multi-target data association device based on photoelectric sensors comprises:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the photoelectric-sensor-based multi-target data association method described above.
The contents in the above method embodiments are all applicable to the present apparatus embodiment, the functions specifically implemented by the present apparatus embodiment are the same as those in the above method embodiments, and the advantageous effects achieved by the present apparatus embodiment are also the same as those achieved by the above method embodiments.
In another embodiment, the present invention provides a storage medium storing processor-executable instructions which, when executed by a processor, implement the photoelectric-sensor-based multi-target data association method described above.
The contents in the above method embodiments are all applicable to the present storage medium embodiment, the functions specifically implemented by the present storage medium embodiment are the same as those in the above method embodiments, and the advantageous effects achieved by the present storage medium embodiment are also the same as those achieved by the above method embodiments.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

1. A multi-target data association method based on photoelectric sensors is characterized by comprising the following steps:
acquiring azimuth angle measurement information, pitch angle measurement information and a measurement model of a target through two sensors, the measurement model being expressed as follows:

A_n^i(k) = Ā_n^i(k) + ΔA;

E_n^i(k) = Ē_n^i(k) + ΔE;

where A_n^i(k) and E_n^i(k) are the observed values of the azimuth angle and pitch angle of target i by sensor n, Ā_n^i(k) and Ē_n^i(k) are the corresponding actual values, and ΔA and ΔE are the azimuth angle deviation and pitch angle deviation of the sensor at the current moment;
preliminarily judging whether two pairs of angle measurement information from different sensors are from the same target or not according to the azimuth angle information;
determining rays related to the target according to azimuth angle information and pitch angle information of the target by taking coordinates of the two sensors as starting points respectively to obtain a first ray and a second ray;
obtaining, according to the azimuth angle information, the x-axis and y-axis coordinate information $x_1$ and $y_1$ of the intersection point of the projections of the first ray and the second ray on the xOy plane;
obtaining, through the pitch angle information, the z-axis coordinate information $z_1$ and $z_2$ of the target from the first ray and the second ray under the space coordinate systems of the first sensor and the second sensor respectively;
judging whether the angle measurement information of the first sensor and the second sensor originates from the same target according to a gating technique and the z-axis coordinate information;
when the angle measurement information from the different sensors at the current moment is judged to originate from the same target, acquiring the three-dimensional space coordinate information of the target according to the angle measurement information;
and obtaining the predicted position information of the target at the current moment according to the state of the target at the previous moment, making a decision in combination with the actual point trace information obtained from the three-dimensional space coordinate information, and updating the target track with the actual point trace information.
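The additive measurement model at the start of claim 1 can be sketched as follows. This is a minimal illustration assuming a purely bias-corrupted, noise-free observation (the claim text lists only the deviations ΔA and ΔE; any noise term would be an addition):

```python
def observe(actual_az, actual_el, bias_az, bias_el):
    """Additive-bias angle measurement: observed = actual + sensor bias.
    bias_az and bias_el play the role of the claim's azimuth deviation
    (delta A) and pitch deviation (delta E)."""
    return actual_az + bias_az, actual_el + bias_el

def correct(observed_az, observed_el, bias_az, bias_el):
    """Recover the actual angles once the biases are known or estimated."""
    return observed_az - bias_az, observed_el - bias_el
```

Under this model, subtracting the estimated deviations restores the actual angles exactly, which is what makes the subsequent ray construction possible.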
2. The multi-target data association method based on photoelectric sensors as claimed in claim 1, wherein in the step of preliminarily judging, according to the azimuth angle information, whether two pairs of angle measurement information from different sensors are from the same target, the two pairs of angle measurement information are preliminarily judged to be from the same target when the azimuth angle information satisfies any one of the following conditions:

[five paired conditions on $A_1^i$ and $A_2^j$, given as formula images in the original]

wherein $A_1^i$ is the azimuth angle information of target i received by the first sensor, and $A_2^j$ is the azimuth angle information of target j received by the second sensor.
3. The multi-target data association method based on photoelectric sensors as claimed in claim 2, wherein the three-dimensional space coordinate information of the target is obtained from the angle measurement information according to the following expression:

$[x_R, y_R, z_R] = [x_1, y_1, 0.5(z_1 + z_2)]$

wherein $[x_R, y_R, z_R]$ is the three-dimensional space coordinate information of target R.
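The fused coordinates above can be illustrated with a short triangulation sketch. The conventions here are assumptions, not stated in the patent text: azimuth is measured from the x-axis in the xOy plane, the pitch angle from the horizontal, and both sensor positions are known:

```python
import math

def triangulate(p1, az1, el1, p2, az2, el2):
    """Intersect the horizontal projections of two bearing rays, then
    fuse the two pitch-derived heights as 0.5*(z1 + z2).
    p1, p2: sensor positions (x, y, z); az/el in radians."""
    # Direction of each ray's projection onto the xOy plane.
    d1 = (math.cos(az1), math.sin(az1))
    d2 = (math.cos(az2), math.sin(az2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for the horizontal intersection.
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:
        return None  # parallel bearings: no unique intersection
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * d2[1] - ry * d2[0]) / det
    x = p1[0] + t1 * d1[0]
    y = p1[1] + t1 * d1[1]
    # Horizontal range from each sensor, then height via the pitch angle.
    r1 = math.hypot(x - p1[0], y - p1[1])
    r2 = math.hypot(x - p2[0], y - p2[1])
    z1 = p1[2] + r1 * math.tan(el1)
    z2 = p2[2] + r2 * math.tan(el2)
    return [x, y, 0.5 * (z1 + z2)]
```

Averaging the two independently derived heights is what the claim's $0.5(z_1 + z_2)$ term expresses: when the two angle pairs truly come from one target, $z_1$ and $z_2$ agree up to measurement error.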
4. The multi-target data association method based on photoelectric sensors as claimed in claim 3, further comprising: when it is judged that the angle measurement information from the different sensors does not originate from the same target, determining that the angle measurement information cannot be associated as a pair, and generating the three-dimensional space coordinates of a false target according to the following expression:

$[x_0, y_0, z_0] = [0, 0, 0]$

wherein $[x_0, y_0, z_0]$ is the three-dimensional space coordinate information of the false target o.
5. The multi-target data association method based on photoelectric sensors as claimed in claim 4, wherein the step of obtaining the predicted position information of the target at the current moment according to the state of the target at the previous moment, making a decision in combination with the actual point trace information obtained from the three-dimensional space coordinate information, and updating the target track with the actual point trace information specifically comprises:
generating the position information of the current moment according to the state information of the target at the previous moment to obtain a predicted point trace;
obtaining actual point traces according to the three-dimensional space coordinate information of the targets;
and calculating the weighted sum distance between the predicted point trace and each actual point trace, and selecting the actual point trace with the minimum weighted sum distance to update the track of the target.
6. The multi-target data association method based on photoelectric sensors as claimed in claim 5, wherein the step of generating the position information of the current moment according to the state information of the target at the previous moment to obtain the predicted point trace is specifically as follows:

$x_i(k+1) = x_i(k) + \dot{x}_i(k)T$

$y_i(k+1) = y_i(k) + \dot{y}_i(k)T$

$z_i(k+1) = z_i(k) + \dot{z}_i(k)T$

wherein T is the sensor sampling time interval, and $x_i(k+1)$, $y_i(k+1)$ and $z_i(k+1)$ represent the predicted point trace of target i at time (k+1).
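One plausible sketch of this prediction step is a constant-velocity extrapolation, assuming the state at time k holds position and velocity estimates (the exact state model is not spelled out in the extracted claim, so this is an assumption):

```python
def predict(state, T):
    """One-step constant-velocity prediction of the point trace.
    state: (x, y, z, vx, vy, vz) estimated at time k.
    T: the sensor sampling time interval from the claim."""
    x, y, z, vx, vy, vz = state
    return (x + vx * T, y + vy * T, z + vz * T)
```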
7. The multi-target data association method based on photoelectric sensors as claimed in claim 6, wherein the weighted sum distance is calculated from the predicted point trace and the actual point trace according to the following expression:

$D_{li} = a|x_l(k+1) - x_i(k+1)| + b|y_l(k+1) - y_i(k+1)| + c|z_l(k+1) - z_i(k+1)|$

wherein $D_{li}$ is the weighted sum distance between the predicted point trace of target i at time (k+1) and point trace l at time (k+1), a, b and c are manually adjustable weighting factors, and $x_l(k+1)$, $y_l(k+1)$ and $z_l(k+1)$ represent the three-dimensional space coordinate information of the actual point trace l at time (k+1).
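Claims 5 and 7 together amount to a nearest-neighbor selection under this weighted L1 distance; a minimal sketch (point traces represented as (x, y, z) tuples):

```python
def weighted_distance(pred, plot, a=1.0, b=1.0, c=1.0):
    """Claim 7's weighted sum distance: D = a|dx| + b|dy| + c|dz|,
    with a, b, c the manually adjustable weighting factors."""
    return (a * abs(pred[0] - plot[0])
            + b * abs(pred[1] - plot[1])
            + c * abs(pred[2] - plot[2]))

def nearest_plot(pred, plots, a=1.0, b=1.0, c=1.0):
    """Claim 5's update rule: choose the actual point trace with the
    minimum weighted sum distance to the predicted point trace."""
    return min(plots, key=lambda p: weighted_distance(pred, p, a, b, c))
```

The weights let an operator emphasize the axes where the sensors are more accurate, e.g. weighting the fused height coordinate less than the triangulated horizontal coordinates.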
CN202010482930.0A 2020-06-01 2020-06-01 Multi-target data association method based on photoelectric sensor Active CN111693051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010482930.0A CN111693051B (en) 2020-06-01 2020-06-01 Multi-target data association method based on photoelectric sensor

Publications (2)

Publication Number Publication Date
CN111693051A CN111693051A (en) 2020-09-22
CN111693051B true CN111693051B (en) 2022-04-08

Family

ID=72479024


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113611112B (en) * 2021-07-29 2022-11-08 中国第一汽车股份有限公司 Target association method, device, equipment and storage medium
CN114234982B (en) * 2021-12-20 2024-04-16 中南大学 Three-dimensional track planning method, system, equipment and medium based on azimuth positioning

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007255982A (en) * 2006-03-22 2007-10-04 Mitsubishi Electric Corp Target track correlation device, and correlation determination method of target track
CN102798867A (en) * 2012-09-10 2012-11-28 电子科技大学 Correlation method for flight tracks of airborne radar and infrared sensor
CN103759732A (en) * 2014-01-14 2014-04-30 北京航空航天大学 Angle information assisted centralized multi-sensor multi-hypothesis tracking method
CN104977559A (en) * 2014-04-04 2015-10-14 上海机电工程研究所 Target positioning method in interference environment
CN110672103A (en) * 2019-10-21 2020-01-10 北京航空航天大学 Multi-sensor target tracking filtering method and system
CN110824467A (en) * 2019-11-15 2020-02-21 中山大学 Multi-target tracking data association method and system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Data Association Technology for Multi-Target Tracking; Yang Panpan; China Master's Theses Full-text Database; 2017-06-30; pp. 2, 6-7 *
Research on Direction-Finding Data Association Methods in Passive Direction-Finding Localization; Li Qinglan et al.; Computer Technology and Development; 2016-02; Vol. 26, No. 2; pp. 110-113 *


Similar Documents

Publication Publication Date Title
CN111208492B (en) Vehicle-mounted laser radar external parameter calibration method and device, computer equipment and storage medium
CN110081881B (en) Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN111693051B (en) Multi-target data association method based on photoelectric sensor
CN106896363B (en) A kind of submarine target active tracing track initiation method
CN110738275A (en) UT-PHD-based multi-sensor sequential fusion tracking method
CN109917373B (en) Dynamic planning track-before-detect method for motion compensation search of moving platform radar
CN109839633B (en) Multi-frame pre-detection tracking method of airborne early warning radar based on minimum coverage airspace
CN110597056A (en) Large closed-loop calibration control method for antiaircraft gun fire control system
CN111222225B (en) Method and device for determining pose of sensor in robot
CN111735443B (en) Dense target track correlation method based on assignment matrix
CN112986977A (en) Method for overcoming radar extended Kalman track filtering divergence
CN116027320A (en) Radar and AIS data fusion method based on multi-factor Euclidean distance correlation
CN115540854A (en) Active positioning method, equipment and medium based on UWB assistance
CN114359338A (en) Pose estimation method and device, terminal equipment and computer readable storage medium
CN115563574A (en) Multi-sensor air target point trace data fusion method based on comprehensive criterion
CN113866754A (en) Moving target track correlation method based on Gaussian distribution wave gate
CN113376626A (en) High maneuvering target tracking method based on IMMPDA algorithm
Zheng et al. Road map extraction using GMPHD filter and linear regression method for ground target tracking
Hem et al. Compensating radar rotation in target tracking
CN113566828A (en) Impact-resistant scanning matching method and system based on multi-sensor decision fusion
CN112083410A (en) Maneuvering target tracking method
Zhou et al. Radar Error Correlation Analysis under the Simultaneous Observation of Double Targets (December 2023)
CN115235475B (en) MCC-based EKF-SLAM back-end navigation path optimization method
Urru et al. Data Fusion algorithms to improve test range sensors accuracy and precision
CN115128597B (en) Maneuvering target tracking method under non-Gaussian noise based on IMM-STEKF

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant