CN111693051A - Multi-target data association method based on photoelectric sensor - Google Patents


Info

Publication number
CN111693051A
CN111693051A (application CN202010482930.0A)
Authority
CN
China
Prior art keywords
target
information
angle measurement
measurement information
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010482930.0A
Other languages
Chinese (zh)
Other versions
CN111693051B (en)
Inventor
张艳
陈金涛
杨雪榕
王爽
曲承志
张鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
National Sun Yat Sen University
Original Assignee
National Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Sun Yat Sen University filed Critical National Sun Yat Sen University
Priority to CN202010482930.0A priority Critical patent/CN111693051B/en
Publication of CN111693051A publication Critical patent/CN111693051A/en
Application granted granted Critical
Publication of CN111693051B publication Critical patent/CN111693051B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Abstract

The invention discloses a multi-target data association method based on photoelectric sensors, which comprises the following steps: judging, according to the angle measurement information acquired by different sensors, whether the angle measurement information comes from the same target; obtaining the three-dimensional space coordinate information of the target after it is judged that the angle measurement information comes from the same target; and updating the track of the target based on a weighted-sum-distance nearest-neighbor point-trace-to-track association algorithm. With the method, multi-target tracking can be realized under the condition that only measurement information of the target azimuth angle and pitch angle can be acquired. The multi-target data association method based on photoelectric sensors can be widely applied in the fields of multi-target tracking and multi-sensor information processing.

Description

Multi-target data association method based on photoelectric sensor
Technical Field
The invention relates to the field of multi-target tracking and multi-sensor information processing, in particular to a multi-target data association method based on a photoelectric sensor.
Background
In the field of multi-target tracking, data association is the key technique for realizing multi-target tracking, and how to use the measurement information to perform data association is the core problem. In complex modern environments, a single sensor cannot provide comprehensive and accurate air situation information; combining the measurement information of multiple sensors with information processing and fusion techniques gives a better grasp of the real-time battlefield situation. However, a single sensor currently cannot realize multi-target tracking under the condition that only measurement information of the target azimuth angle and pitch angle can be obtained.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a multi-target data association method based on a photoelectric sensor, which can achieve multi-target tracking under the condition that only measurement information of a target azimuth angle and a target pitch angle can be obtained.
The technical scheme adopted by the invention is as follows: a multi-target data association method based on photoelectric sensors comprises the following steps:
acquiring angle measurement information of a target through different sensors;
judging whether the angle measurement information obtained by different sensors is from the same target;
if it is judged that the angle measurement information from different sensors at the current moment comes from the same target, acquiring the three-dimensional space coordinate information of the target according to the angle measurement information;
and obtaining the predicted position information of the target at the current moment according to the state of the target at the previous moment, making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with that actual point-trace information.
Further, the obtaining of the angle measurement information of the target by the different sensors is specifically obtaining azimuth angle measurement information and pitch angle measurement information of the target by the two sensors according to the following measurement model:
A_n^i(k) = Â_n^i(k) + ΔA;
E_n^i(k) = Ê_n^i(k) + ΔE;
where A_n^i(k) and E_n^i(k) are the observed values of the azimuth angle and the pitch angle of target i by sensor n, Â_n^i(k) and Ê_n^i(k) are the corresponding actual values, and ΔA and ΔE are respectively the azimuth angle deviation and the pitch angle deviation of the sensor at the current moment.
Further, the step of determining whether the angle measurement information obtained by the different sensors is from the same target specifically includes:
preliminarily judging, according to the azimuth angle information, whether two pairs of angle measurement information from different sensors are from the same target;
and judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target.
Further, the step of preliminarily judging, according to the azimuth angle information, whether two pairs of angle measurement information from different sensors are from the same target is specifically: the two pairs of angle measurement information are preliminarily judged to be from the same target if the azimuth information satisfies any one of five conditions (1)–(5) relating A_1^i and A_2^j (the conditions are given only as equation images in the source and are not reproduced here), where A_1^i is the azimuth information of target i received by the first sensor and A_2^j is the azimuth information of target j received by the second sensor.
Further, the step of judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors that were preliminarily judged to originate from the same target actually do so includes the following steps:
determining a ray related to the target from the azimuth angle information and pitch angle information of the target, taking the coordinates of each of the two sensors as the starting point, to obtain a first ray and a second ray;
obtaining, from the azimuth angle information, the x-axis and y-axis coordinate information of the intersection point of the projection lines of the first ray and the second ray on the x–y plane;
obtaining, from the pitch angle information, the z-axis coordinate information of the first ray and the second ray for the target in the space coordinate systems of the first sensor and the second sensor;
and judging, by a gating technique applied to the z-axis coordinate information, whether the angle measurement information of the first sensor and the second sensor is from the same target.
Further, the three-dimensional space coordinate information of the target is obtained from the angle measurement information according to the following expression:
[x_R, y_R, z_R] = [x, y_1, 0.5(z_1 + z_2)];
where [x_R, y_R, z_R] is the three-dimensional space coordinate information of the target R.
Further, the method also comprises: if it is judged that the angle measurement information from different sensors does not originate from the same target, the angle measurement information is judged to be a pair that cannot be associated, and the three-dimensional space coordinates of a false target are generated, with the expression as follows:
[x_0, y_0, z_0] = [0, 0, 0];
where [x_0, y_0, z_0] is the three-dimensional space coordinate information of the false target o.
Further, the step of obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with the actual point-trace information specifically includes:
generating position information of the current moment according to the state information of the target at the previous moment to obtain a predicted trace;
obtaining an actual point trace according to the three-dimensional space coordinate information of the target;
and calculating a weighted sum distance according to the predicted trace point and the actual trace point, and selecting the actual trace point with the minimum weighted sum distance to update the track of the target.
Further, the step of generating the position information of the current moment from the state information of the target at the previous moment to obtain the predicted trace is performed according to the following expressions:
x_i(k+1) = x_i(k) + vx_i(k)*T;
y_i(k+1) = y_i(k) + vy_i(k)*T;
z_i(k+1) = z_i(k) + vz_i(k)*T;
where T is the sensor sampling time interval, vx_i(k), vy_i(k) and vz_i(k) are the velocity components of target i at time k, and x_i(k+1), y_i(k+1) and z_i(k+1) represent the predicted trace of target i at time (k+1).
Further, the weighted sum distance is calculated from the predicted trace and the actual trace as follows:
D_li = a*|x_l(k+1) - x_i(k+1)| + b*|y_l(k+1) - y_i(k+1)| + c*|z_l(k+1) - z_i(k+1)|;
where D_li is the weighted-sum distance between the predicted trace of target i at time (k+1) and point trace l at time (k+1), a, b and c are manually adjustable weighting factors, and x_l(k+1), y_l(k+1) and z_l(k+1) are the three-dimensional space coordinates of the actual point trace l at time (k+1).
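As an illustrative numerical example (the coordinate values are chosen for illustration only, not taken from the patent): with weighting factors a = 0.35, b = 0.35 and c = 0.3, a predicted trace at (1000, 2000, 500) and an actual point trace at (1010, 1990, 520) give D_li = 0.35*|1010-1000| + 0.35*|1990-2000| + 0.3*|520-500| = 3.5 + 3.5 + 6 = 13.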
The method has the beneficial effects that: the angle measurement information acquired by different sensors is judged to determine whether it comes from the same target; after it is judged to come from the same target, the three-dimensional space coordinate information of the target is obtained; and the track of the target is updated based on a weighted-sum-distance nearest-neighbor point-trace-to-track association algorithm, thereby realizing multi-target tracking.
Drawings
FIG. 1 is a flow chart of the steps of a multi-target data association method based on a photoelectric sensor according to the present invention;
FIG. 2 is a diagram illustrating the tracking effect of an embodiment of the present invention on three targets moving along a straight line at a constant speed.
Detailed Description
The invention is described in further detail below with reference to the figures and the specific embodiments. The step numbers in the following embodiments are provided only for convenience of illustration, the order between the steps is not limited at all, and the execution order of each step in the embodiments can be adapted according to the understanding of those skilled in the art.
In recent years, target tracking has mostly been realized with radar equipment; however, radar is not suitable for some platforms, and other measurement sensors alone cannot realize the positioning and tracking of targets.
As shown in FIG. 1, the invention provides a multi-target data association method based on photoelectric sensors, which comprises the following steps:
s101, angle measurement information of the target is obtained through different sensors.
Specifically, the sensor may be a photoelectric sensor, which can only acquire measurement information of the target azimuth angle A and the target pitch angle E; the angle measurement information of the target is affected by the noise environment and by measurement error.
S102, judging whether the angle measurement information obtained by different sensors is from the same target;
Specifically, the two pairs of angle measurement information from different sensors at the current moment, (A_1^i, E_1^i) from the first sensor and (A_2^j, E_2^j) from the second sensor, are combined: the azimuth angle measurement information is used to preliminarily judge whether the two pairs of angle measurement information possibly come from the same target, and the pitch angle measurement information is then used to judge again whether the two pairs of angle measurement information from different sensors are from the same target.
S103, judging that the angle measurement information from different sensors at the current moment comes from the same target, and acquiring three-dimensional space coordinate information of the target according to the angle measurement information;
specifically, if it is determined that two pairs of angle measurement information from different sensors at the current time are from the same target, the three-dimensional space coordinate information of the target is located by combining the two pairs of angle measurement information, and if it is determined that two pairs of angle measurement information from different sensors at the current time are unlikely to be from the same target, the two pairs of angle measurement information are determined as unlikely to be associated.
And S104, obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with the actual point-trace information.
Specifically, the position information of the target at the current moment is predicted from motion state attributes such as the position and velocity of the target at the previous moment. This prediction is combined with the actual measured position of the target located from the sensor angle measurements at the current moment, the distance between the predicted point and each actual measured point is calculated with the weighted-sum-distance method, and, following the nearest-neighbor idea, the actual measured point closest to the one-step predicted position estimate of the target is assigned to that target.
Further, as a preferred embodiment of the method, the obtaining of the angle measurement information of the target by the different sensors is specifically obtaining azimuth angle measurement information and pitch angle measurement information of the target by the two sensors; the measurement model of sensor n for target i at time k is:
A_n^i(k) = Â_n^i(k) + ΔA;
E_n^i(k) = Ê_n^i(k) + ΔE;
where A_n^i(k) and E_n^i(k) are the observed values of the azimuth angle and the pitch angle of target i by sensor n, Â_n^i(k) and Ê_n^i(k) are the corresponding actual values, and ΔA and ΔE are respectively the azimuth angle deviation and the pitch angle deviation of the sensor at the current moment.
Specifically, the three-dimensional Cartesian space coordinates of the first and second sensors are set to:
SENSOR1=[0,0,0];
SENSOR2=[0,b,0];
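For illustration, the following Python sketch simulates this two-sensor measurement setup, assuming the additive-deviation form of the measurement model reconstructed above and treating ΔA and ΔE as zero-mean Gaussian deviations; the angle convention (azimuth from the +x axis in the x–y plane, pitch from that plane), the baseline value and the function names are assumptions made for the sketch, not taken from the patent.

```python
import numpy as np

# Sensor positions in the shared Cartesian frame; the baseline value of
# 10000 m is taken from the simulation example later in the description.
SENSOR1 = np.array([0.0, 0.0, 0.0])
SENSOR2 = np.array([0.0, 10000.0, 0.0])

def true_angles(sensor_pos, target_pos):
    """Actual azimuth A and pitch E of a target as seen from a sensor.

    Assumed convention: azimuth measured in the x-y plane from the +x axis,
    pitch measured from the x-y plane toward +z (radians).
    """
    dx, dy, dz = target_pos - sensor_pos
    azimuth = np.arctan2(dy, dx)
    pitch = np.arctan2(dz, np.hypot(dx, dy))
    return azimuth, pitch

def measure(sensor_pos, target_pos, sigma_a=0.05, sigma_e=0.05, rng=np.random):
    """Noisy observation: observed angle = actual angle + deviation.

    Treating the deviations dA, dE as zero-mean Gaussian noise is an
    assumption; the patent only states their magnitudes (0.05).
    """
    a, e = true_angles(sensor_pos, target_pos)
    return a + rng.normal(0.0, sigma_a), e + rng.normal(0.0, sigma_e)

if __name__ == "__main__":
    target = np.array([8000.0, 3000.0, 2000.0])   # hypothetical target position
    print("sensor 1 observes (A, E):", measure(SENSOR1, target))
    print("sensor 2 observes (A, E):", measure(SENSOR2, target))
```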
As a preferred embodiment of the method, the step of determining whether the angle measurement information obtained by different sensors is from the same target specifically includes:
preliminarily judging whether two pairs of angle measurement information from different sensors are from the same target or not according to the azimuth angle information;
Specifically, the two pairs of angle measurement information from different sensors at the current moment are combined: (A_1^i, E_1^i) is the measurement information from the first sensor about the azimuth angle A and the pitch angle E of target i, and (A_2^j, E_2^j) is the corresponding measurement information from the second sensor about target j. The azimuth angle measurement information A_1^i and A_2^j is used to preliminarily judge whether the two pairs of angle measurement information possibly originate from the same target.
And judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target.
That is, the pitch angle measurement information E_1^i and E_2^j is used to judge again whether (A_1^i, E_1^i) and (A_2^j, E_2^j) are derived from the same target.
As a preferred embodiment of the method, the step of preliminarily judging, according to the azimuth angle information, whether two pairs of angle measurement information from different sensors are from the same target is specifically that the two measured azimuths satisfy any one of five conditions (1)–(5) relating A_1^i and A_2^j (the conditions are given only as equation images in the source and are not reproduced here). If the two measured azimuth angles satisfy one of the five conditions, it is preliminarily determined that the two pairs of angle measurement information are likely to originate from the same target. Here A_1^i is the azimuth information of target i received by the first sensor, and A_2^j is the azimuth information of target j received by the second sensor.
Further, as a preferred embodiment of the method, the step of judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors that were preliminarily judged to originate from the same target actually do so includes the following steps:
determining a ray related to the target from the azimuth angle information and pitch angle information of the target, taking the coordinates of each of the two sensors as the starting point, to obtain a first ray and a second ray;
obtaining, from the azimuth angle information, the x-axis and y-axis coordinate information of the intersection point of the projection lines of the first ray and the second ray on the x–y plane;
obtaining, from the pitch angle information, the z-axis coordinate information of the first ray and the second ray for the target in the space coordinate systems of the first sensor and the second sensor;
and judging, by a gating technique applied to the z-axis coordinate information, whether the angle measurement information of the first sensor and the second sensor is from the same target.
Specifically, to judge again whether the angle measurement information (A_1^i, E_1^i) from sensor 1 and the angle measurement information (A_2^j, E_2^j) from sensor 2 come from the same target: taking the coordinates of sensor n as the starting point, the azimuth angle and pitch angle information of target i uniquely determines a ray, called ray n → i. (The coordinate expressions below are given only as equation images in the source and are not reproduced here.)
The azimuth angle measurement information gives the x-axis and y-axis coordinates of the intersection point of the projection lines of ray 1 → i and ray 2 → j on the x–y plane:
the x-axis coordinate of the intersection point, denoted x, is the same in the space coordinate systems of the first and second sensors;
the y-axis coordinate of the intersection point in the space coordinate system of the first sensor is denoted y_1;
the y-axis coordinate of the intersection point in the space coordinate system of the second sensor is:
y_2 = y_1 - b;
The pitch angle measurement information gives the z-axis coordinates of ray 1 → i and ray 2 → j, relative to targets i and j, in the space coordinate systems of the first sensor and the second sensor respectively:
the z-axis coordinate of target i in the space coordinate system of the first sensor is denoted z_1;
the z-axis coordinate of target j in the space coordinate system of the second sensor is denoted z_2.
The gating technique is then used to judge whether the two pairs of angle measurement information from the first sensor and the second sensor come from the same target:
Δz = |z_1 - z_2|;
Δz ≤ z_0;
where z_0 is a manually set threshold; if the inequality holds, the two pairs of angle measurement information are judged to come from the same target.
As a preferred embodiment of the method, the three-dimensional space coordinate information of the target is obtained from the angle measurement information according to the following expression:
[x_R, y_R, z_R] = [x, y_1, 0.5(z_1 + z_2)];
where [x_R, y_R, z_R] is the three-dimensional space coordinate information of the target R.
Further, as a preferred embodiment of the method, the method also comprises: if it is judged that the angle measurement information from different sensors does not originate from the same target, the angle measurement information is judged to be a pair that cannot be associated, and the three-dimensional space coordinates of a false target are generated, with the expression as follows:
[x_0, y_0, z_0] = [0, 0, 0];
where [x_0, y_0, z_0] is the three-dimensional space coordinate information of the false target o.
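The association test of this step can be sketched as follows. The expressions for x, y_1, z_1 and z_2 exist only as equation images in the source, so the triangulation below is re-derived under the same assumed conventions as the earlier sketch (first sensor at (0, 0, 0), second sensor at (0, b, 0), azimuth from the +x axis in the x–y plane, pitch from that plane); it illustrates the gating idea and the fused-point / false-target outputs rather than reproducing the patent's exact formulas.

```python
import numpy as np

def associate(a1, e1, a2, e2, b=10000.0, z_gate=200.0):
    """Gate a pair of angle measurements from the two sensors.

    (a1, e1): azimuth/pitch of target i from sensor 1 at (0, 0, 0), radians.
    (a2, e2): azimuth/pitch of target j from sensor 2 at (0, b, 0), radians.
    Returns the fused 3-D point [x, y1, 0.5*(z1 + z2)] if the pair passes the
    z-gate, otherwise the false-target point [0, 0, 0].
    """
    # Intersection of the two ray projections in the x-y plane (assumed convention).
    denom = np.sin(a1 - a2)
    if abs(denom) < 1e-9:                 # projections (nearly) parallel: cannot associate
        return np.zeros(3)
    r1 = b * np.cos(a2) / denom           # projected range of the intersection from sensor 1
    if r1 <= 0:                           # intersection behind sensor 1: impossible pair
        return np.zeros(3)
    x, y1 = r1 * np.cos(a1), r1 * np.sin(a1)
    y2 = y1 - b                           # same point expressed in sensor 2's frame

    # Height of each ray above the intersection point, from the pitch angles.
    z1 = np.hypot(x, y1) * np.tan(e1)
    z2 = np.hypot(x, y2) * np.tan(e2)

    # Gating: the two heights must agree within the threshold z_0.
    if abs(z1 - z2) <= z_gate:
        return np.array([x, y1, 0.5 * (z1 + z2)])
    return np.zeros(3)                    # false target
```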
As a preferred embodiment of the method, the step of obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with the actual point-trace information specifically includes:
generating position information of the current moment according to the state information of the target at the previous moment to obtain a predicted trace;
obtaining an actual point trace according to the three-dimensional space coordinate information of the target;
and calculating a weighted sum distance according to the predicted trace point and the actual trace point, and selecting the actual trace point with the minimum weighted sum distance to update the track of the target.
As a further preferred embodiment of the method, the step of generating the position information of the current moment from the state information of the target at the previous moment to obtain the predicted trace is performed according to the following expressions:
x_i(k+1) = x_i(k) + vx_i(k)*T;
y_i(k+1) = y_i(k) + vy_i(k)*T;
z_i(k+1) = z_i(k) + vz_i(k)*T;
where T is the sensor sampling time interval, vx_i(k), vy_i(k) and vz_i(k) are the velocity components of target i at time k, and x_i(k+1), y_i(k+1) and z_i(k+1) represent the predicted trace of target i at time (k+1).
Specifically, the state information of target i at time k consists of its position components x_i(k), y_i(k), z_i(k) and velocity components vx_i(k), vy_i(k), vz_i(k) (the state vector itself is given only as an equation image in the source), and the position prediction information of target i at time (k+1) is as given above.
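A minimal sketch of this one-step prediction is given below; it assumes the constant-velocity form reconstructed above, and the field layout of the state is illustrative since the original state vector is only available as an equation image.

```python
from dataclasses import dataclass

@dataclass
class TargetState:
    # Position and velocity components of target i at time k
    # (layout assumed for illustration).
    x: float
    y: float
    z: float
    vx: float
    vy: float
    vz: float

def predict(state: TargetState, T: float = 2.0):
    """One-step predicted point trace of the target at time k+1 (sampling interval T)."""
    return (state.x + state.vx * T,
            state.y + state.vy * T,
            state.z + state.vz * T)
```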
Further, as a preferred embodiment of the method, the weighted sum distance is calculated from the predicted trace and the actual trace according to the following expression:
D_li = a*|x_l(k+1) - x_i(k+1)| + b*|y_l(k+1) - y_i(k+1)| + c*|z_l(k+1) - z_i(k+1)|;
where D_li is the weighted-sum distance between the predicted trace of target i at time (k+1) and point trace l at time (k+1), a, b and c are manually adjustable weighting factors, and x_l(k+1), y_l(k+1) and z_l(k+1) are the three-dimensional space coordinates of the actual point trace l at time (k+1).
Specifically, the weighted-sum distances D_li between all actual point traces and target i at time (k+1) are calculated, and the actual point trace with the minimum weighted-sum distance is then selected to update the track of target i. One actual point trace can be used to update the track of at most one target, and one target can be matched with at most one actual point trace for track updating. The coordinates of point trace l obtained by association of the angle measurement information at time (k+1) are:
X_l(k+1) = [x_l(k+1), y_l(k+1), z_l(k+1)]^T; l ∈ 1…L;
where L is the number of point traces obtained by data association of the angle measurement information at time (k+1).
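The weighted-sum-distance nearest-neighbor selection under this one-to-one constraint can be sketched as follows; the greedy pass over the smallest distances is one simple way to enforce the constraint and is an implementation choice for the sketch, not something specified in the patent.

```python
def weighted_sum_distance(pred, trace, a=0.35, b=0.35, c=0.3):
    """D_li = a*|x_l - x_i| + b*|y_l - y_i| + c*|z_l - z_i| at time k+1."""
    return (a * abs(trace[0] - pred[0])
            + b * abs(trace[1] - pred[1])
            + c * abs(trace[2] - pred[2]))

def assign_traces(predictions, traces, a=0.35, b=0.35, c=0.3):
    """Greedy nearest-neighbour matching under the one-to-one constraint:
    each actual point trace updates at most one target, and each target is
    updated by at most one actual point trace."""
    pairs = sorted(
        ((weighted_sum_distance(p, t, a, b, c), i, l)
         for i, p in enumerate(predictions)
         for l, t in enumerate(traces)),
        key=lambda item: item[0])
    assignment, used_targets, used_traces = {}, set(), set()
    for _dist, i, l in pairs:
        if i not in used_targets and l not in used_traces:
            assignment[i] = l          # point trace l updates the track of target i
            used_targets.add(i)
            used_traces.add(l)
    return assignment                  # maps target index -> assigned trace index
```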
Referring to FIG. 2, which shows a simulation experiment result of the invention: in line No. 1, the straight thick segment is the real track of target 1 and the curved segment is the observed track of target 1; in line No. 2, the straight thick segment is the real track of target 2 and the curved segment is the observed track of target 2; in line No. 3, the straight thick segment is the real track of target 3 and the curved segment is the observed track of target 3. FIG. 2 shows the tracking effect of the method of the invention when two sensors are used to track three targets flying in uniform straight-line motion. The simulation parameters are: number of targets 3; measurement sampling interval 2 s; threshold z_0 = 200 m; position of sensor S1 (0, 0, 0); position of sensor S2 (0, 10000, 0); weighting factors a = 0.35, b = 0.35, c = 0.3; azimuth angle deviation ΔA = 0.05; pitch angle deviation ΔE = 0.05. As can be seen from FIG. 2, the algorithm can effectively track multiple flying targets in an environment with measurement noise and has a certain robustness.
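For convenience, the simulation parameters quoted above can be gathered in one place; the values are those stated in the text, the dictionary keys are illustrative, and the units of the angle deviations are not given in the source.

```python
SIMULATION_PARAMS = {
    "num_targets": 3,
    "sampling_interval_s": 2,          # measurement sampling interval
    "z_gate_m": 200,                   # threshold z_0
    "sensor_s1_position": (0, 0, 0),
    "sensor_s2_position": (0, 10000, 0),
    "weights": {"a": 0.35, "b": 0.35, "c": 0.3},
    "azimuth_deviation": 0.05,         # dA (units not stated in the source)
    "pitch_deviation": 0.05,           # dE (units not stated in the source)
}
```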
The present invention provides another embodiment: a multi-target data association system based on photoelectric sensors comprises:
the acquisition module is used for acquiring angle measurement information of a target through different sensors;
the judging module is used for judging whether the angle measurement information acquired by different sensors is from the same target or not;
the actual point-trace module is used for judging that the angle measurement information from different sensors at the current moment comes from the same target and acquiring the three-dimensional space coordinate information of the target according to the angle measurement information;
and the track module is used for obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, and making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information so as to update the target track with the actual point-trace information.
The present invention provides another embodiment: a multi-target data association device based on photoelectric sensors comprises:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the multi-target data association method based on photoelectric sensors as described above.
The contents in the above method embodiments are all applicable to the present apparatus embodiment, the functions specifically implemented by the present apparatus embodiment are the same as those in the above method embodiments, and the advantageous effects achieved by the present apparatus embodiment are also the same as those achieved by the above method embodiments.
In another embodiment, the present invention provides a storage medium having stored thereon processor-executable instructions which, when executed by a processor, are used for implementing the multi-target data association method based on photoelectric sensors as described above.
The contents in the above method embodiments are all applicable to the present storage medium embodiment, the functions specifically implemented by the present storage medium embodiment are the same as those in the above method embodiments, and the advantageous effects achieved by the present storage medium embodiment are also the same as those achieved by the above method embodiments.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A multi-target data association method based on photoelectric sensors is characterized by comprising the following steps:
acquiring angle measurement information of a target through different sensors;
judging whether the angle measurement information obtained by different sensors is from the same target;
if it is judged that the angle measurement information from different sensors at the current moment comes from the same target, acquiring the three-dimensional space coordinate information of the target according to the angle measurement information;
and obtaining the predicted position information of the target at the current moment according to the state of the target at the previous moment, making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with that actual point-trace information.
2. The multi-target data association method based on photoelectric sensors as claimed in claim 1, wherein the obtaining of the angle measurement information of the target by different sensors is specifically obtaining azimuth angle measurement information and pitch angle measurement information of the target by two sensors according to the following measurement model:
A_n^i(k) = Â_n^i(k) + ΔA;
E_n^i(k) = Ê_n^i(k) + ΔE;
where A_n^i(k) and E_n^i(k) are the observed values of the azimuth angle and the pitch angle of target i by sensor n, Â_n^i(k) and Ê_n^i(k) are the corresponding actual values, and ΔA and ΔE are respectively the azimuth angle deviation and the pitch angle deviation of the sensor at the current moment.
3. The multi-target data association method based on photoelectric sensors as claimed in claim 2, wherein the step of determining whether the angle measurement information obtained from different sensors is from the same target includes:
preliminarily judging whether two pairs of angle measurement information from different sensors are from the same target or not according to the azimuth angle information;
and judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors originate from the same target.
4. The multi-target data association method based on photoelectric sensors as claimed in claim 3, wherein the step of preliminarily determining, according to the azimuth information, whether two pairs of angle measurement information from different sensors are from the same target is specifically: preliminarily determining that the two pairs of angle measurement information are from the same target when the azimuth information satisfies any one of five conditions (1)–(5) relating A_1^i and A_2^j (the conditions are given only as equation images in the source and are not reproduced here), where A_1^i is the azimuth information of target i received by the first sensor and A_2^j is the azimuth information of target j received by the second sensor.
5. The multi-target data association method based on photoelectric sensors as claimed in claim 4, wherein the step of judging again, according to the pitch angle information, whether the two pairs of angle measurement information from different sensors that were preliminarily judged to originate from the same target actually do so includes the following steps:
determining a ray related to the target from the azimuth angle information and pitch angle information of the target, taking the coordinates of each of the two sensors as the starting point, to obtain a first ray and a second ray;
obtaining, from the azimuth angle information, the x-axis and y-axis coordinate information of the intersection point of the projection lines of the first ray and the second ray on the x–y plane;
obtaining, from the pitch angle information, the z-axis coordinate information of the first ray and the second ray for the target in the space coordinate systems of the first sensor and the second sensor;
and judging, by a gating technique applied to the z-axis coordinate information, whether the angle measurement information of the first sensor and the second sensor is from the same target.
6. The multi-target data association method based on photoelectric sensors as claimed in claim 5, wherein the three-dimensional space coordinate information of the target is obtained from the angle measurement information according to the following expression:
[x_R, y_R, z_R] = [x, y_1, 0.5(z_1 + z_2)];
where [x_R, y_R, z_R] is the three-dimensional space coordinate information of the target R.
7. The multi-target data association method based on photoelectric sensors as claimed in claim 6, further comprising: if it is determined that the angle measurement information from different sensors does not originate from the same target, determining that the angle measurement information is a pair that cannot be associated and generating the three-dimensional space coordinates of a false target, with the expression as follows:
[x_0, y_0, z_0] = [0, 0, 0];
where [x_0, y_0, z_0] is the three-dimensional space coordinate information of the false target o.
8. The multi-target data association method based on photoelectric sensors as claimed in claim 7, wherein the step of obtaining the predicted position information of the target at the current moment from the state of the target at the previous moment, making a judgment by combining it with the actual point-trace information obtained from the three-dimensional space coordinate information, and updating the target track with the actual point-trace information specifically comprises:
generating position information of the current moment according to the state information of the target at the previous moment to obtain a predicted trace;
obtaining an actual point trace according to the three-dimensional space coordinate information of the target;
and calculating a weighted sum distance according to the predicted trace point and the actual trace point, and selecting the actual trace point with the minimum weighted sum distance to update the track of the target.
9. The multi-target data association method based on photoelectric sensors as claimed in claim 8, wherein the step of generating the position information of the current moment from the state information of the target at the previous moment to obtain the predicted trace is performed according to the following expressions:
x_i(k+1) = x_i(k) + vx_i(k)*T;
y_i(k+1) = y_i(k) + vy_i(k)*T;
z_i(k+1) = z_i(k) + vz_i(k)*T;
where T is the sensor sampling time interval, vx_i(k), vy_i(k) and vz_i(k) are the velocity components of target i at time k, and x_i(k+1), y_i(k+1) and z_i(k+1) represent the predicted trace of target i at time (k+1).
10. The multi-target data association method based on photoelectric sensors as claimed in claim 9, wherein the weighted sum distance is calculated from the predicted trace and the actual trace according to the following expression:
D_li = a*|x_l(k+1) - x_i(k+1)| + b*|y_l(k+1) - y_i(k+1)| + c*|z_l(k+1) - z_i(k+1)|;
where D_li is the weighted-sum distance between the predicted trace of target i at time (k+1) and point trace l at time (k+1), a, b and c are manually adjustable weighting factors, and x_l(k+1), y_l(k+1) and z_l(k+1) are the three-dimensional space coordinates of the actual point trace l at time (k+1).
CN202010482930.0A 2020-06-01 2020-06-01 Multi-target data association method based on photoelectric sensor Active CN111693051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010482930.0A CN111693051B (en) 2020-06-01 2020-06-01 Multi-target data association method based on photoelectric sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010482930.0A CN111693051B (en) 2020-06-01 2020-06-01 Multi-target data association method based on photoelectric sensor

Publications (2)

Publication Number Publication Date
CN111693051A true CN111693051A (en) 2020-09-22
CN111693051B CN111693051B (en) 2022-04-08

Family

ID=72479024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010482930.0A Active CN111693051B (en) 2020-06-01 2020-06-01 Multi-target data association method based on photoelectric sensor

Country Status (1)

Country Link
CN (1) CN111693051B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113611112A (en) * 2021-07-29 2021-11-05 中国第一汽车股份有限公司 Target association method, device, equipment and storage medium
CN114234982A (en) * 2021-12-20 2022-03-25 中南大学 Three-dimensional trajectory planning method, system, device and medium based on azimuth positioning

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007255982A (en) * 2006-03-22 2007-10-04 Mitsubishi Electric Corp Target track correlation device, and correlation determination method of target track
CN102798867A (en) * 2012-09-10 2012-11-28 电子科技大学 Correlation method for flight tracks of airborne radar and infrared sensor
CN103759732A (en) * 2014-01-14 2014-04-30 北京航空航天大学 Angle information assisted centralized multi-sensor multi-hypothesis tracking method
CN104977559A (en) * 2014-04-04 2015-10-14 上海机电工程研究所 Target positioning method in interference environment
CN110672103A (en) * 2019-10-21 2020-01-10 北京航空航天大学 Multi-sensor target tracking filtering method and system
CN110824467A (en) * 2019-11-15 2020-02-21 中山大学 Multi-target tracking data association method and system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007255982A (en) * 2006-03-22 2007-10-04 Mitsubishi Electric Corp Target track correlation device, and correlation determination method of target track
CN102798867A (en) * 2012-09-10 2012-11-28 电子科技大学 Correlation method for flight tracks of airborne radar and infrared sensor
CN103759732A (en) * 2014-01-14 2014-04-30 北京航空航天大学 Angle information assisted centralized multi-sensor multi-hypothesis tracking method
CN104977559A (en) * 2014-04-04 2015-10-14 上海机电工程研究所 Target positioning method in interference environment
CN110672103A (en) * 2019-10-21 2020-01-10 北京航空航天大学 Multi-sensor target tracking filtering method and system
CN110824467A (en) * 2019-11-15 2020-02-21 中山大学 Multi-target tracking data association method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李卿澜 et al.: "无源测向定位中测向数据关联方法研究" [Research on direction-finding data association methods in passive direction-finding localization], 《计算机技术与发展》 [Computer Technology and Development] *
杨盼盼: "多目标跟踪的数据关联技术研究" [Research on data association technology for multi-target tracking], 《中国优秀硕士学位论文全文数据库》 [China Masters' Theses Full-text Database] *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113611112A (en) * 2021-07-29 2021-11-05 中国第一汽车股份有限公司 Target association method, device, equipment and storage medium
CN113611112B (en) * 2021-07-29 2022-11-08 中国第一汽车股份有限公司 Target association method, device, equipment and storage medium
CN114234982A (en) * 2021-12-20 2022-03-25 中南大学 Three-dimensional trajectory planning method, system, device and medium based on azimuth positioning
CN114234982B (en) * 2021-12-20 2024-04-16 中南大学 Three-dimensional track planning method, system, equipment and medium based on azimuth positioning

Also Published As

Publication number Publication date
CN111693051B (en) 2022-04-08

Similar Documents

Publication Publication Date Title
CN111693051B (en) Multi-target data association method based on photoelectric sensor
CN106896363B (en) A kind of submarine target active tracing track initiation method
CN108645412B (en) Multi-sensor self-adaptive track starting method
CN110738275A (en) UT-PHD-based multi-sensor sequential fusion tracking method
CN114063056A (en) Ship track fusion method, system, medium and equipment
CN110597056A (en) Large closed-loop calibration control method for antiaircraft gun fire control system
CN109917373B (en) Dynamic planning track-before-detect method for motion compensation search of moving platform radar
Danilov et al. Indication of relative motion intensity of aerodynamic object and meters with different physical nature
CN111735443B (en) Dense target track correlation method based on assignment matrix
CN111222225B (en) Method and device for determining pose of sensor in robot
CN115540854A (en) Active positioning method, equipment and medium based on UWB assistance
CN114359338A (en) Pose estimation method and device, terminal equipment and computer readable storage medium
CN115508824A (en) Multi-target big data association fusion tracking method and system
CN115563574A (en) Multi-sensor air target point trace data fusion method based on comprehensive criterion
CN113376626A (en) High maneuvering target tracking method based on IMMPDA algorithm
CN113866754A (en) Moving target track correlation method based on Gaussian distribution wave gate
CN113743475A (en) Real-time multi-source data fusion method based on UKF
CN112986977A (en) Method for overcoming radar extended Kalman track filtering divergence
CN113566828A (en) Impact-resistant scanning matching method and system based on multi-sensor decision fusion
Zheng et al. Road map extraction using GMPHD filter and linear regression method for ground target tracking
Hem et al. Compensating radar rotation in target tracking
Zhou et al. Radar Error Correlation Analysis under the Simultaneous Observation of Double Targets (December 2023)
CN113281736B (en) Radar maneuvering intersection target tracking method based on multi-hypothesis singer model
CN115235475B (en) MCC-based EKF-SLAM back-end navigation path optimization method
CN115128597B (en) Maneuvering target tracking method under non-Gaussian noise based on IMM-STEKF

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant