CN116340736A - Heterogeneous sensor information fusion method and device

Heterogeneous sensor information fusion method and device

Info

Publication number
CN116340736A
Authority
CN
China
Prior art keywords
sensor
measurement
heterogeneous
fusion
information
Prior art date
Legal status
Pending
Application number
CN202211676310.6A
Other languages
Chinese (zh)
Inventor
罗鹏
韩乃军
何鑫
Current Assignee
HUNAN NOVASKY ELECTRONIC TECHNOLOGY CO LTD
Original Assignee
HUNAN NOVASKY ELECTRONIC TECHNOLOGY CO LTD
Priority date
Filing date
Publication date
Application filed by HUNAN NOVASKY ELECTRONIC TECHNOLOGY CO LTD
Priority to CN202211676310.6A
Publication of CN116340736A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00: Measuring or testing not otherwise provided for
    • G01D21/02: Measuring two or more variables by means not covered by a single other subclass
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a heterogeneous sensor information fusion method and device. The method comprises the following steps: S01, acquiring data detected by two heterogeneous sensors and performing time and space registration to obtain registered data; S02, according to the joint distribution state of the measurement information of the two heterogeneous sensors, acquiring the measurement information of the two sensors at the same moment from the registered data and performing association pairing to obtain associated observation pairs; S03, determining suspected measurement pairs from the associated observation pairs according to the joint distribution relation between the measurement information of the two heterogeneous sensors and the noise variances, and fusing the measurement information of each suspected measurement pair to obtain fused suspected measurement pairs; S04, performing decision-level fusion recognition on the fused suspected measurement pairs to obtain the final recognition result. The method realizes expression-level fusion of heterogeneous sensor information and has the advantages of a simple implementation, high fusion efficiency, good adaptability to complex scenes, and strong anti-interference performance.

Description

Heterogeneous sensor information fusion method and device
Technical Field
The invention relates to the technical field of sensor information fusion, in particular to a heterogeneous sensor information fusion method and device.
Background
Different sensors have different advantages: for example, millimeter wave radar works in all weather, is small, and measures speed and distance, while an infrared sensor is little affected by illumination conditions and measures angles accurately. However, a single sensor is limited by its own performance and function, and its working mode and the target information it acquires are one-dimensional: the point cloud of a millimeter wave radar is sparse, produces many false alarms, and is easily interfered with; a visible light camera is strongly affected by weather and illumination; an infrared sensor has a small monitoring range and is strongly affected by temperature. A single sensor is therefore hard pressed to meet increasingly complex use requirements. Fusing the information of different sensors on top of single-sensor detection improves the recognition effect and the adaptability to complex environments.
Heterogeneous sensors detect different types of information: a millimeter wave radar measures angle, distance, speed, and so on, whereas an infrared imaging sensor usually yields position and angle in image pixels. Because this information is detected by sensors of different dimensions, it cannot be fused directly. In the prior art, multi-sensor information fusion is therefore usually performed at the data association level; for example, a common way to fuse millimeter wave radar and infrared camera information is to convert the spatial coordinates of both, map the radar points onto the infrared image pixel coordinate system, and finally simply match the radar points with the infrared image targets. However, this approach struggles to fuse heterogeneous sensor information at the expression level and performs poorly in complex environments: millimeter wave radar suffers from many false alarms, and the texture and detail of infrared images are relatively simple, so radar false-alarm points, or real radar target points with no infrared target, may be matched.
Chinese patent application CN114994655A discloses an AdaBoost-based composite tracking method for radar point traces and infrared point traces, which uses hypothesized tracks as machine learning training samples to train an AdaBoost classifier to classify hypothesized tracks as true or false and to filter and update the composite tracks. However, the method relies on an AdaBoost classifier for true/false point and track classification and on offline sample labeling, so it needs a large amount of prior information, is complex to implement, and still adapts poorly to complex scenes.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: aiming at the technical problems in the prior art, the invention provides a heterogeneous sensor information fusion method and device that achieve expression-level fusion of heterogeneous sensor information, with a simple implementation, high fusion efficiency, good adaptability to complex scenes, and strong anti-interference performance.
In order to solve the technical problems, the technical scheme provided by the invention is as follows:
A heterogeneous sensor information fusion method comprises the following steps:
s01, acquiring data detected by two heterogeneous sensors, and performing time and space registration to obtain registered data;
s02, according to the joint distribution state of the measurement information of the two heterogeneous sensors, acquiring the measurement information of the two heterogeneous sensors at the same moment from the registered data to perform association pairing to obtain an association observation pair;
s03, determining a suspected measurement pair from the associated observation pair according to the joint distribution relation between the measurement information of the two heterogeneous sensors and the noise variance, and fusing the measurement information of the suspected measurement pair to obtain a fused suspected measurement pair;
and S04, performing decision-level fusion recognition according to the fused suspected measurement pairs to obtain the final recognition result.
Further, the measurement information includes an azimuth angle and a pitch angle. In step S02, a first statistical decision quantity T obeying a chi-square distribution with 2 degrees of freedom is constructed from the azimuth and pitch angles obtained by the two heterogeneous sensors; when the first statistical decision quantity T does not exceed a preset first threshold λ, the measurement values of the two heterogeneous sensors under discrimination are determined to be an associated observation pair from the same position, and otherwise they are determined not to be an associated observation pair from the same position, wherein the preset first threshold λ is set according to the chi-square distribution with 2 degrees of freedom.
Further, the calculation expression of the first statistical decision quantity T is:

T = \frac{(\hat{\theta}_1 - \hat{\theta}_2)^2}{\sigma_{\theta 1}^2 + \sigma_{\theta 2}^2} + \frac{(\hat{\varphi}_1 - \hat{\varphi}_2)^2}{\sigma_{\varphi 1}^2 + \sigma_{\varphi 2}^2}

wherein \hat{\theta}_1 and \hat{\theta}_2 are the azimuth measurement values obtained by the first sensor and the second sensor respectively, \hat{\varphi}_1 and \hat{\varphi}_2 are the pitch angle measurement values obtained by the first sensor and the second sensor respectively, \sigma_{\theta 1}^2 and \sigma_{\varphi 1}^2 are the variances of the azimuth and pitch angle measurement values obtained by the first sensor, \sigma_{\theta 2}^2 and \sigma_{\varphi 2}^2 are the variances of the azimuth and pitch angle measurement values obtained by the second sensor, and the first sensor and the second sensor are heterogeneous sensors.
Further, in step S03, determining the suspected measurement pairs from the associated observation pairs according to the joint distribution relationship between the measurement information of the heterogeneous sensors and the noise variances includes: constructing, from the azimuth and pitch angles obtained by the two heterogeneous sensors together with the azimuth measurement noise variances and pitch measurement noise variances of the two heterogeneous sensors, a second statistical decision quantity obeying a chi-square distribution with 2 degrees of freedom, where α is the probability of misjudging observations of two different targets as observations of the same target; when the second statistical decision quantity does not exceed a preset second threshold \chi_\alpha^2(2), the measurement values of the two heterogeneous sensors under discrimination are determined to be a suspected measurement pair, wherein the preset second threshold \chi_\alpha^2(2) is set according to the chi-square distribution with 2 degrees of freedom.
Further, the calculation expression of the second statistical decision quantity is:

T_2^{ij}(k) = \frac{\left(\hat{\theta}_1^i(k) - \hat{\theta}_2^j(k)\right)^2}{\sigma_{\theta 1}^2 + \sigma_{\theta 2}^2} + \frac{\left(\hat{\varphi}_1^i(k) - \hat{\varphi}_2^j(k)\right)^2}{\sigma_{\varphi 1}^2 + \sigma_{\varphi 2}^2}

wherein T_2^{ij}(k) represents the second statistical decision quantity between the i-th first-sensor measurement and the j-th second-sensor measurement at time k; \hat{\theta}_1^i(k), \hat{\theta}_2^j(k), \hat{\varphi}_1^i(k), \hat{\varphi}_2^j(k) are respectively the i-th azimuth observation of the first sensor, the j-th azimuth observation of the second sensor, the i-th pitch angle observation of the first sensor, and the j-th pitch angle observation of the second sensor at time k; \sigma_{\theta 1}^2, \sigma_{\varphi 1}^2, \sigma_{\theta 2}^2, \sigma_{\varphi 2}^2 are respectively the azimuth measurement noise variance of the first sensor, the pitch measurement noise variance of the first sensor, the azimuth measurement noise variance of the second sensor, and the pitch measurement noise variance of the second sensor; the first sensor and the second sensor are heterogeneous sensors.
Further, in step S03, the azimuth angle of the suspected measurement pair is weighted by using the azimuth angle measurement noise variance of the two heterogeneous sensors to obtain the azimuth angle after fusion, and the pitch angle of the suspected measurement pair is weighted by using the pitch angle measurement noise variance of the two heterogeneous sensors to obtain the pitch angle after fusion.
Further, the fused azimuth angle and the fused pitch angle are respectively calculated according to the following formulas:

\hat{\theta}_F(k) = \frac{\sigma_{\theta 2}^2 \hat{\theta}_1^i(k) + \sigma_{\theta 1}^2 \hat{\theta}_2^j(k)}{\sigma_{\theta 1}^2 + \sigma_{\theta 2}^2}

\hat{\varphi}_F(k) = \frac{\sigma_{\varphi 2}^2 \hat{\varphi}_1^i(k) + \sigma_{\varphi 1}^2 \hat{\varphi}_2^j(k)}{\sigma_{\varphi 1}^2 + \sigma_{\varphi 2}^2}

wherein \hat{\theta}_F(k) and \hat{\varphi}_F(k) are respectively the fused azimuth angle and the fused pitch angle at time k; \hat{\theta}_1^i(k), \hat{\theta}_2^j(k), \hat{\varphi}_1^i(k), \hat{\varphi}_2^j(k) are respectively the i-th azimuth observation of the first sensor, the j-th azimuth observation of the second sensor, the i-th pitch angle observation of the first sensor, and the j-th pitch angle observation of the second sensor at time k; \sigma_{\theta 1}^2, \sigma_{\varphi 1}^2, \sigma_{\theta 2}^2, \sigma_{\varphi 2}^2 are respectively the azimuth measurement noise variance of the first sensor, the pitch measurement noise variance of the first sensor, the azimuth measurement noise variance of the second sensor, and the pitch measurement noise variance of the second sensor; the first sensor and the second sensor are heterogeneous sensors.
Further, in step S04, decision-level recognition is performed with an evidence theory method: taking the class confidences of the targets in the recognition results obtained by the two sensors as evidence bodies, the mutual support degree and the conflict strength between the two heterogeneous sensors' information are calculated to measure the contribution of each sensor's information to the final fused information, and the two heterogeneous sensors' information is then weighted according to the mutual support degree and the conflict strength to obtain the fusion result.
Further, the step of performing decision-level recognition with the evidence theory method includes:
S401, parameter initialization: taking the class confidences of the targets in the recognition results obtained by the two heterogeneous sensors as evidence bodies, setting a basic probability assignment function for each evidence, and calculating the conflict strength value and the mutual support value between the evidence bodies;
S402, conflict detection: judging whether the mutual support value is larger than a preset threshold; if so, fusing the data to obtain the current fusion confidence and going to step S404, otherwise going to step S403;
S403, respectively calculating the whole distances between the two heterogeneous sensors' evidences and the fused evidence of the previous moment, and selecting the sensor evidence with the smaller distance to the previous fused evidence as the evidence of the current period;
S404, synthesizing the evidence obtained in the current period and the previous historical periods, calculating the total support of all evidence bodies for each evidence body;
S405, calculating weights according to the total support of all evidence bodies and forming the weighted corrected evidence m_WAE;
S406, fusing the weighted corrected evidence m_WAE multiple times to obtain the fused confidence of each class, and judging the class corresponding to the highest confidence as the final target class.
A heterogeneous sensor information fusion device, the device comprising:
the space-time registration module is used for acquiring data detected by the two heterogeneous sensors and carrying out time and space registration to obtain registered data;
the fusion detection module is used for acquiring measurement information of the two heterogeneous sensors at the same moment from the registered data according to the joint distribution state of the measurement information of the two heterogeneous sensors, and carrying out association pairing to obtain an association observation pair;
the fusion tracking module is used for determining a suspected measurement pair from the association observation pair according to the distribution relation between the measurement information of the two heterogeneous sensors and the noise variance, and fusing the measurement information of the suspected measurement pair to obtain a fused suspected measurement pair;
the fusion identification module is used for carrying out decision-level identification according to the suspected measurement pair after fusion to obtain a final identification result;
or the device comprises a memory for storing a computer program and a processor for executing the computer program to perform the method as described above.
Compared with the prior art, the invention has the following advantages. After time and space registration of the data detected by the two heterogeneous sensors, the fusion detection stage uses a data association method based on the joint distribution state of the target points of the two sensors, so that measurements produced by targets are separated from measurements produced by clutter, and expression-level fusion of the different-dimensional information of heterogeneous distributed sensors is achieved simply and efficiently. In the fusion tracking stage, the measurement information of the associated observation pairs is fused according to the joint distribution relation between the measurement information of the two heterogeneous sensors and the noise variances, which effectively improves matching accuracy. Finally, decision-level recognition yields an accurate recognition result, effectively improving recognition performance.
Drawings
Fig. 1 is a schematic diagram of an overall implementation flow of the information fusion method of the radar sensor and the infrared imaging sensor in this embodiment.
Fig. 2 is a schematic diagram of an implementation flow of the information fusion method of the radar sensor and the infrared imaging sensor in this embodiment.
Fig. 3 is a detailed implementation flowchart of implementing fusion tracking by adopting the adaptive variance weighted fusion method in this embodiment.
Fig. 4 is a schematic diagram of the implementation flow of decision-level fusion recognition using the evidence theory method in this embodiment.
Detailed Description
The invention is further described below in connection with the drawings and the specific preferred embodiments, but the scope of protection of the invention is not limited thereby.
Heterogeneous sensors are sensors of different structural types that use different detection mechanisms to achieve target detection, for example a radar sensor and an infrared imaging sensor. Fusing the information of two heterogeneous sensors makes effective use of the advantages of both, obtains target information of different dimensions, and allows the working mode to change with the environment and scene, improving the recognition effect in complex environments. For example, when a radar sensor and an infrared imaging sensor are fused, the information acquired by the two is highly complementary: the infrared imaging sensor can acquire accurate azimuth and pitch angle information and apparent information such as the geometric shape of an infrared target, while the radar can acquire the target's distance, speed, acceleration, radar cross section (RCS), azimuth and pitch angles, and so on; together this information forms multidimensional characteristic information of the target. In addition, the radar works in an active mode and is good at detecting moving targets but is strongly affected by electronic interference and reflected signals, whereas the infrared imaging sensor works in a passive mode and is immune to electronic interference; their complementarity effectively improves environmental adaptability and the recognition effect.
Although the detection mechanisms of heterogeneous sensors differ, the clutter in the information they detect is randomly distributed in space: the clutter points obtained by the two sensors at the same moment generally differ in distribution, while the target points are relatively stable, so the distribution state of the target points detected by the heterogeneous sensors differs markedly from that of the clutter. Exploiting this property, after time and space registration of the data detected by the two heterogeneous sensors, the fusion detection stage uses a data association method to pair suspicious target points according to the joint distribution state of the target points of the two sensors, separating the measurements produced by targets from the measurements produced by clutter; this realizes expression-level fusion of the different-dimensional information of heterogeneous distributed sensors simply and efficiently. In the fusion tracking stage, the measurement information of the associated observation pairs is fused according to the joint distribution relation between the measurement information of the two heterogeneous sensors and the noise variances, which effectively improves matching accuracy; finally, decision-level recognition yields an accurate recognition result, effectively improving recognition performance.
The present invention will be specifically described below with reference to the implementation of information fusion between a radar sensor and an infrared imaging sensor.
As shown in fig. 1 and 2, the detailed steps of information fusion between the radar sensor and the infrared imaging sensor in this embodiment include:
s01, space-time registration: and acquiring data detected by a radar sensor (such as millimeter wave radar) and an infrared imaging sensor, and performing time and space registration to obtain registered data.
The radar and infrared imaging sensors are sensors with different working modes and large characteristic differences; for example, their installation positions are usually offset and their sampling frequencies usually differ, so the detected target information inevitably has spatial deviations. When fusing information, the observation models and measurement data of the different sensors must therefore be registered and aligned in space and time, calibrating the target information obtained by the two sensors into the same spatial coordinates. Spatial registration is the process of unifying the observation data acquired by different sensors into one reference coordinate system.
Considering that the spatial offset between the installation positions of the millimeter wave radar and the infrared imaging sensor is small, the target information obtained by the two sensors can be treated as lying in the same reference coordinate system; in this embodiment, the spatial offset between the two sensors is obtained by a static test, and the information of the two sensors is then spatially calibrated by coordinate conversion and translation.
The millimeter wave radar and the infrared imaging sensor have different sampling frequencies, and the moments at which they begin detecting a target generally differ, so suitable reference time points must be selected before their information is fused, so that the observation data obtained by the millimeter wave radar and the infrared imaging sensor are processed synchronously. In this embodiment, the time registration step specifically includes: interpolating and extrapolating the target data acquired by each sensor over the same period, and estimating the data at each moment between two observation times, so as to synchronize the information of the two sensors in time.
When the millimeter wave radar and the infrared imaging sensor start sampling at the same initial moment, the sampling frequency of the infrared imaging sensor is generally higher than that of the millimeter wave radar; to avoid losing infrared sensor information, this embodiment synchronizes the millimeter wave radar data to the infrared data by interpolation and extrapolation. When the initial sampling moments of the millimeter wave radar and the infrared imaging sensor differ, the first infrared imaging sensor sample and the most recent millimeter wave radar sample are assumed to have been taken at the same moment; the slight error introduced by this assumption is within the acceptable range.
Taking the time registration in a specific application embodiment as an example, when linear interpolation is used for the interpolation and extrapolation, let x_{R1} be the radar sensor measurement at time t_1, x_{R2} the radar sensor estimate at time t_2, and x_{R3} the radar sensor measurement at time t_3. Since the interval between t_1 and t_3 is short, the motion can be treated as approximately linear, and linear interpolation between x_{R1} and x_{R3} yields the radar sensor estimate at time t_2:

x_{R2} = x_{R1} + \frac{t_2 - t_1}{t_3 - t_1}\left(x_{R3} - x_{R1}\right)    (1)

Thus, the radar sensor data with the lower sampling frequency can be aligned with the infrared imaging sensor data.
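As an illustration only (a sketch, not part of the patent disclosure), the interpolation of formula (1) can be written in Python as follows; the sampling rates and all variable names are hypothetical:

```python
import numpy as np

def time_register(radar_t, radar_x, ir_t):
    """Align the lower-rate radar samples to the infrared timestamps by
    piecewise-linear interpolation, following formula (1)."""
    # np.interp interpolates linearly between neighbouring radar samples;
    # timestamps outside the radar span are clamped to the end values.
    return np.interp(ir_t, radar_t, radar_x)

# Hypothetical example: radar sampled at 10 Hz, infrared at 25 Hz
radar_t = np.arange(0.0, 1.0, 0.1)       # radar sample times (s)
radar_az = 30.0 + 2.0 * radar_t          # radar azimuth readings (deg)
ir_t = np.arange(0.0, 1.0, 0.04)         # infrared sample times (s)
radar_az_on_ir_clock = time_register(radar_t, radar_az, ir_t)
```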
For space-time registration among other heterogeneous sensors, the same principle as the above can be adopted, and other space-time registration methods can be adopted according to actual requirements.
S02, fusion detection: according to the joint distribution state of the measurement information of the two heterogeneous sensors, acquiring the measurement information of the two sensors at the same moment from the registered data and performing association pairing to obtain associated observation pairs.
After space-time registration, in order to make full use of the information obtained by the two heterogeneous distributed sensors, this embodiment fuses and detects the measurement information of the two sensors with a data association method. The measurement information specifically comprises an azimuth angle and a pitch angle. A first statistical decision quantity T obeying a chi-square distribution with 2 degrees of freedom is constructed from the azimuth and pitch angles obtained by the radar sensor and the infrared imaging sensor; when the first statistical decision quantity T does not exceed a preset first threshold λ, the radar sensor measurement value and the infrared sensor measurement value under discrimination are judged to be an associated observation pair from the same position, and otherwise they are judged not to be, wherein the preset first threshold λ is set according to the chi-square distribution with 2 degrees of freedom. The detailed process of constructing the first statistical decision quantity T in this embodiment is as follows:
First, the measurement values of the radar at time k are obtained, comprising the azimuth angle, pitch angle, and distance of each suspicious point detected by the radar; each measurement value is the sum of the true value and measurement noise, and the measurement noises are assumed to be mutually independent and to obey zero-mean normal distributions. At the same time, the measurement values of the infrared imaging sensor are obtained, comprising the azimuth and pitch angles of each suspicious point it detects; these too are the sum of the true value and measurement noise, with the noises assumed mutually independent and zero-mean normal.
Assume that the suspicious points at time k consist of real targets and a number of clutter points. Because the radar and the infrared imaging sensor have different structures and detection mechanisms, the clutter distribution is spatially random, so the clutter points obtained by the two sensors at the same moment generally differ in distribution, while the target points are relatively stable. Statistics obeying the standard normal distribution can therefore be constructed:
e_\theta = \frac{\hat{\theta}_R - \hat{\theta}_I}{\sqrt{\sigma_{\theta R}^2 + \sigma_{\theta I}^2}} \sim N(0,1)    (2)

e_\varphi = \frac{\hat{\varphi}_R - \hat{\varphi}_I}{\sqrt{\sigma_{\varphi R}^2 + \sigma_{\varphi I}^2}} \sim N(0,1)    (3)

wherein \hat{\theta}_R and \hat{\theta}_I are the azimuth measurement values obtained by the millimeter wave radar and the infrared imaging sensor respectively, \hat{\varphi}_R and \hat{\varphi}_I are the pitch angle measurement values obtained by the millimeter wave radar and the infrared imaging sensor respectively, \sigma_{\theta R}^2, \sigma_{\varphi R}^2 and \sigma_{\theta I}^2, \sigma_{\varphi I}^2 are the variances of the azimuth and pitch angle measurement values of the millimeter wave radar and of the infrared imaging sensor respectively, and N(0,1) is the standard normal distribution.
As described above, the joint distribution between the measurement information of the radar and that of the infrared imaging sensor obeys the standard normal distribution, while clutter does not have this property, so real target points can be distinguished from clutter. Using formulas (2) and (3), a statistical decision quantity obeying a chi-square distribution with 2 degrees of freedom is constructed from the azimuth and pitch angles obtained by the radar sensor and the infrared imaging sensor to decide whether a point is a suspicious target point. This is the first statistical decision quantity T, whose calculation expression is:
T = e_\theta^2 + e_\varphi^2 = \frac{(\hat{\theta}_R - \hat{\theta}_I)^2}{\sigma_{\theta R}^2 + \sigma_{\theta I}^2} + \frac{(\hat{\varphi}_R - \hat{\varphi}_I)^2}{\sigma_{\varphi R}^2 + \sigma_{\varphi I}^2}    (4)

wherein \hat{\theta}_R and \hat{\theta}_I are the azimuth measurement values obtained by the radar sensor and the infrared sensor respectively, \hat{\varphi}_R and \hat{\varphi}_I are the pitch angle measurement values obtained by the radar sensor and the infrared sensor respectively, \sigma_{\theta R}^2 and \sigma_{\varphi R}^2 are the variances of the azimuth and pitch angle measurement values obtained by the radar sensor, and \sigma_{\theta I}^2 and \sigma_{\varphi I}^2 are the variances of the azimuth and pitch angle measurement values obtained by the infrared sensor.
According to the characteristic that the suspicious target points and the clutter of the two sensors come from different joint distributions, the measurements produced by targets are first separated from the measurements produced by clutter, and the suspicious points produced by the two sensors are then fused; this effectively improves matching accuracy and hence final recognition performance.
Since the measurement values produced by a target follow the chi-square distribution and those produced by clutter do not, this embodiment first sets a quantile and obtains the first threshold λ for 2 degrees of freedom from the properties of the chi-square distribution. The first statistical decision quantity T computed by formula (4) is then used to judge whether a radar sensor measurement value and an infrared sensor measurement value are suspicious target points from the same position, realizing association pairing and yielding the associated observation pairs: when T is not larger than the first threshold λ, the radar sensor measurement value and the infrared sensor measurement value under discrimination are judged to be an associated observation pair from the same position; otherwise they are judged not to be.
In a specific application embodiment, the detailed steps for fusion detection of the radar sensor measurement values and the infrared sensor measurement values are as follows:
S21, setting the first threshold λ: set the quantile and obtain the value of the first threshold λ for 2 degrees of freedom;
S22, calculating the first statistical decision quantities T of the different suspicious target points according to formula (4);
S23, comparing the first statistical decision quantity T with the first threshold λ: if T < λ, judge that the two current measurement values come from a suspicious target point at the same position; if T > λ, judge that they come from suspicious points at different positions;
S24, traversing all combinations of suspicious target points of the millimeter wave radar and the infrared sensor at the current moment, repeatedly executing step S23 until all suspicious target points have been traversed;
S25, if the measurement values of the two sensors match, i.e. they are suspicious target points from the same position, recording the positions of these suspicious target points and going to step S03; otherwise, continuing detection at the next moment.
In this embodiment, fusion detection exploits the different joint distributions of the target points and the clutter detected by the sensors, so real target points and clutter are effectively distinguished and the accuracy of point-track association matching is effectively improved.
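A minimal sketch of steps S21-S25 follows (illustrative only, not part of the patent text; the point lists, variances, and the 95% quantile are assumed values):

```python
from scipy.stats import chi2

def first_decision_T(az_r, el_r, az_i, el_i, var_az, var_el):
    """First statistical decision quantity of formula (4); under the
    same-position hypothesis it obeys a chi-square law with 2 dof.
    var_az = (radar azimuth variance, infrared azimuth variance);
    var_el is the analogous pair for pitch."""
    return ((az_r - az_i) ** 2 / (var_az[0] + var_az[1])
            + (el_r - el_i) ** 2 / (var_el[0] + var_el[1]))

def associate(radar_pts, ir_pts, var_az, var_el, quantile=0.95):
    """Traverse all radar/infrared suspicious-point combinations and keep
    the pairs whose decision quantity falls below the first threshold."""
    lam = chi2.ppf(quantile, df=2)       # first threshold, 2 dof
    pairs = []
    for i, (az_r, el_r) in enumerate(radar_pts):
        for j, (az_i, el_i) in enumerate(ir_pts):
            T = first_decision_T(az_r, el_r, az_i, el_i, var_az, var_el)
            if T < lam:                  # same-position suspicious pair
                pairs.append((i, j))
    return pairs
```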
S03, fusion tracking: determining suspected measurement pairs from the associated observation pairs according to the joint distribution relation between the measurement information of the radar sensor and the infrared imaging sensor and the noise variances, and fusing the measurement information of the suspected measurement pairs to obtain fused suspected measurement pairs.
The radar can provide a high-precision distance value of the target together with pitch and azimuth angle measurements of relatively poor precision, while the infrared imaging sensor can provide pitch and azimuth angle measurements of relatively high precision.
According to the matching result of the fusion detection, this embodiment specifically adopts a variance-adaptive fusion tracking method with a weighted "trace-trace" association logic: weighting is performed according to the measurement noise variances, the weighted objects being the original target state measurement values (traces) output by the sensors, which are weighted, fused, and then filtered to obtain a more accurate description of the target state, as shown in Fig. 3.
In this embodiment, determining the suspected measurement pairs from the associated observation pairs according to the joint distribution relation between the measurement information of the radar sensor and the infrared imaging sensor and the noise variances specifically comprises: constructing, from the azimuth and pitch angles obtained by the radar sensor and the infrared imaging sensor together with their azimuth measurement noise variances and pitch measurement noise variances, a second statistical decision quantity obeying a chi-square distribution with 2 degrees of freedom, where α is the probability of misjudging observations of two different targets as observations of the same target; when the second statistical decision quantity does not exceed a preset second threshold \chi_\alpha^2(2), the two measurement values under discrimination are determined to be a suspected measurement pair, the preset second threshold \chi_\alpha^2(2) being set according to the chi-square distribution with 2 degrees of freedom.
The second statistical decision quantity is constructed as follows.
Let \hat{\theta}_R^i(k), \hat{\theta}_I^j(k), \hat{\varphi}_R^i(k), \hat{\varphi}_I^j(k) be respectively the i-th azimuth observation of the radar, the j-th azimuth observation of the infrared imaging sensor, the i-th pitch angle observation of the radar, and the j-th pitch angle observation of the infrared imaging sensor at time k, and let \sigma_{\theta R}^2, \sigma_{\varphi R}^2, \sigma_{\theta I}^2, \sigma_{\varphi I}^2 be respectively the radar azimuth measurement noise variance, the radar pitch measurement noise variance, the infrared sensor azimuth measurement noise variance, and the infrared sensor pitch measurement noise variance. The observation noises of both the millimeter wave radar and the infrared imaging sensor are assumed to be white, so that:

\hat{\theta}_R^i(k) = \theta_R^i(k) + w_{\theta R}(k), \quad w_{\theta R}(k) \sim N(0, \sigma_{\theta R}^2)

\hat{\theta}_I^j(k) = \theta_I^j(k) + w_{\theta I}(k), \quad w_{\theta I}(k) \sim N(0, \sigma_{\theta I}^2)

\hat{\varphi}_R^i(k) = \varphi_R^i(k) + w_{\varphi R}(k), \quad w_{\varphi R}(k) \sim N(0, \sigma_{\varphi R}^2)

\hat{\varphi}_I^j(k) = \varphi_I^j(k) + w_{\varphi I}(k), \quad w_{\varphi I}(k) \sim N(0, \sigma_{\varphi I}^2)    (5)

wherein the unhatted quantities are the true angles and the w terms are the measurement noises.
The hypothesis is then established:

H_0: \theta_R^i(k) = \theta_I^j(k), \;\; \varphi_R^i(k) = \varphi_I^j(k)    (6)

i.e. H_0 holds when the two observations originate from the same target position; under H_0 the differences \hat{\theta}_R^i(k) - \hat{\theta}_I^j(k) and \hat{\varphi}_R^i(k) - \hat{\varphi}_I^j(k) are zero-mean normal with variances \sigma_{\theta R}^2 + \sigma_{\theta I}^2 and \sigma_{\varphi R}^2 + \sigma_{\varphi I}^2 respectively. From equation (6), the joint distribution between the measurement information of the radar sensor and the infrared sensor and the noise variances obeys a normal distribution, so a second statistical decision quantity T_2^{ij}(k) can be constructed on the same principle as the first statistical decision quantity T, as shown in equation (7); the second statistical decision quantity T_2^{ij}(k) obeys a chi-square distribution with 2 degrees of freedom.
T_2^{ij}(k) = \frac{\left(\hat{\theta}_R^i(k) - \hat{\theta}_I^j(k)\right)^2}{\sigma_{\theta R}^2 + \sigma_{\theta I}^2} + \frac{\left(\hat{\varphi}_R^i(k) - \hat{\varphi}_I^j(k)\right)^2}{\sigma_{\varphi R}^2 + \sigma_{\varphi I}^2}    (7)

wherein α is the probability of misjudging observations of two different targets as observations of the same target, and T_2^{ij}(k) represents the second statistical decision quantity between the i-th radar sensor measurement and the j-th infrared sensor measurement at time k, with the remaining symbols defined as above.
Thus the condition for H_0 to hold is:

T_2^{ij}(k) \leq \chi_\alpha^2(2)    (8)
The radar sensor and infrared imaging sensor observation pairs satisfying formula (8) are taken as suspected measurement pairs; that is, the targets of interest are found from the fusion-detected suspicious target point pairs and further weighted and fused using the joint distribution relation between the sensors' measurement information and the noise variances.
In this embodiment, specifically, the azimuth angles of each obtained suspected measurement pair are weighted using the azimuth measurement noise variances of the radar sensor and the infrared sensor to obtain the fused azimuth angle, and the pitch angles of each obtained suspected measurement pair are weighted using the pitch measurement noise variances of the two heterogeneous sensors to obtain the fused pitch angle.
The fused azimuth angle and the fused pitch angle are specifically obtained by the following formulas:

\hat{\theta}_F(k) = \frac{\sigma_{\theta I}^2 \hat{\theta}_R^i(k) + \sigma_{\theta R}^2 \hat{\theta}_I^j(k)}{\sigma_{\theta R}^2 + \sigma_{\theta I}^2}    (9)

\hat{\varphi}_F(k) = \frac{\sigma_{\varphi I}^2 \hat{\varphi}_R^i(k) + \sigma_{\varphi R}^2 \hat{\varphi}_I^j(k)}{\sigma_{\varphi R}^2 + \sigma_{\varphi I}^2}    (10)

wherein \hat{\theta}_F(k) and \hat{\varphi}_F(k) are respectively the fused azimuth angle and the fused pitch angle at time k.
In this embodiment, the suspected pseudo-observation (\hat{\theta}_F(k), \hat{\varphi}_F(k)) is obtained by variance-weighting each suspected measurement pair according to formulas (9) and (10). Because the sensor information with higher reliability (smaller variance) contributes more to the fusion, weighting by the variances completes the adaptive fusion of the observation pair. As shown in Fig. 3, after the fused angle values are obtained, the track is updated with an extended Kalman filter, which yields a more accurate description of the target state, effectively improves the matching precision, and further improves the fusion effect.
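For illustration (a sketch under the stated assumptions, not the patent's reference implementation), the gate of formula (8) and the weighted fusion of formulas (9) and (10) can be combined as follows; α = 0.05 and the variable names are assumed:

```python
from scipy.stats import chi2

def fuse_suspected_pair(az_r, el_r, az_i, el_i,
                        var_az_r, var_el_r, var_az_i, var_el_i,
                        alpha=0.05):
    """Gate an associated observation pair with the second statistical
    decision quantity (formula (7)) and, if accepted, return the
    inverse-variance weighted angles of formulas (9) and (10)."""
    t2 = ((az_r - az_i) ** 2 / (var_az_r + var_az_i)
          + (el_r - el_i) ** 2 / (var_el_r + var_el_i))
    if t2 > chi2.ppf(1.0 - alpha, df=2):  # formula (8) fails: reject pair
        return None
    az_f = (var_az_i * az_r + var_az_r * az_i) / (var_az_r + var_az_i)
    el_f = (var_el_i * el_r + var_el_r * el_i) / (var_el_r + var_el_i)
    return az_f, el_f                     # fused pseudo-observation
```

The fused pseudo-observation would then feed the extended Kalman filter track update shown in Fig. 3; note that each angle is weighted by the other sensor's variance over the variance sum, which is exactly inverse-variance weighting.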
S04, fusion recognition: performing decision-level fusion recognition according to the fused suspected measurement pairs to obtain the final recognition result.
Decision-level recognition fuses the preliminary recognition results of the radar sensor and the infrared imaging sensor to determine the final result. This embodiment specifically uses an evidence theory method: the class confidences of the targets in the recognition results obtained by the two sensors are taken as evidence bodies; the confidences of the target information detected by the different sensors are converted into support degrees; the mutual support degree and the conflict strength between the radar sensor information and the infrared imaging sensor information are calculated to measure the contribution of each sensor's information to the final fused information; and the two heterogeneous sensors' information is weighted according to the mutual support degree and the conflict strength to obtain the fusion result. This removes uncertainty in the sensor information, distinguishes the unknown from the uncertain, handles conflicting information effectively, and fuses the heterogeneous sensor information into an optimal information fusion scheme; it requires little prior knowledge, its application conditions are loose and flexible, and it further strengthens the environmental adaptability of the recognition system.
In the fusion recognition of targets, through steps S01-S03 the two sensors each detect the target: the radar detects the target's position, distance, speed, and other information, and the infrared imaging sensor detects the target's azimuth, pixel position, apparent contour, and other information in the image; matching recognition based on the information obtained by the sensors yields a preliminary recognition result. On this basis, step S04 further performs decision-level fusion processing of the recognition results with the evidence theory method. The implementation principle of the evidence theory method is as follows:
First, an identification frame Θ is set. Let m: 2^\Theta \to [0,1] be a basic probability assignment (BPA) function satisfying:

m(\varnothing) = 0, \quad \sum_{A \subseteq \Theta} m(A) = 1

wherein A represents an event and \varnothing represents the empty set. The belief function (Bel) corresponding to m is defined as:

Bel(A) = \sum_{B \subseteq A} m(B)

The belief function represents the total degree of trust in A. The plausibility function (Pl) of A is defined as:

Pl(A) = \sum_{B \cap A \neq \varnothing} m(B)

representing the degree to which A is not negated.
Let the trust intervals of two evidences for an element O_i of the identification frame be [Bel(O_i)_1, Pl(O_i)_1] and [Bel(O_i)_2, Pl(O_i)_2] respectively; the distance between them can be expressed as:

d(O_i) = \sqrt{\frac{1}{2}\left[\left(Bel(O_i)_1 - Bel(O_i)_2\right)^2 + \left(Pl(O_i)_1 - Pl(O_i)_2\right)^2\right]}

The distance between trust intervals reflects both the difference in the certainty of the two evidences and the difference in their uncertainty, and is a weighted representation of certainty and uncertainty.
Based on the distances between the two evidences' trust intervals for the elements of the identification frame, the whole distance between the two evidences is:

K = \frac{1}{C} \sum_{i=1}^{n} d(O_i)

wherein n represents the number of elements in the identification frame and C is a constant normalization factor.
The whole distance K between evidences reflects the conflict between them: when two evidences are identical the conflict is 0, and the larger the difference between the evidences, the greater the conflict. When the conflict between evidences exceeds a certain degree, the reliability of the evidence is judged with the sensors' characteristic measurement information, and the evidence confidences are corrected to eliminate the conflict problem.
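The conflict computation can be sketched as follows (illustrative only; the BPA is assumed to put mass on singleton classes plus the whole frame Θ, the equal-weight interval distance above is assumed, and the normalization factor C is taken as the number of classes):

```python
def trust_interval(m, c):
    """Trust interval [Bel({c}), Pl({c})] for a BPA m given as a dict
    mapping class names (and "Theta" for the whole frame) to masses."""
    bel = m.get(c, 0.0)
    return bel, bel + m.get("Theta", 0.0)

def overall_distance(m1, m2, classes):
    """Whole distance K between two evidence bodies: root-mean of the
    per-class trust-interval distances."""
    d = 0.0
    for c in classes:
        b1, p1 = trust_interval(m1, c)
        b2, p2 = trust_interval(m2, c)
        # certainty (Bel) and uncertainty (Pl) differences, equally weighted
        d += 0.5 * ((b1 - b2) ** 2 + (p1 - p2) ** 2)
    return (d / len(classes)) ** 0.5
```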
As shown in fig. 4, the detailed steps of decision-level fusion recognition using the evidence theory method in this embodiment include:
S401, parameter initialization: taking the class confidences of the targets in the recognition results obtained by the two sensors as evidence bodies, setting a basic probability assignment function for each evidence, and calculating the conflict strength value K(m_R, m_I) and the mutual support value C(m_R, m_I) = 1 - K(m_R, m_I);
S402, conflict detection: comparing the conflict strength value with a threshold τ. If K(m_R, m_I) ≤ τ, the mutual support between the sensor evidences is high; the evidences are fused to obtain the current fusion confidence m(t), and step S404 is executed. If K(m_R, m_I) > τ, the evidences conflict strongly, and step S403 is executed;
S403, respectively calculating the whole distances between the two sensors' evidences and the fused evidence of the previous moment, and selecting the sensor evidence with the smaller distance to the previous fused evidence as the evidence of the current period, i.e. selecting as the current-period confidence the sensor that conflicts less with the previous moment:

m(t) = \begin{cases} m_R(t), & K\left(m_R(t), m(t-1)\right) \leq K\left(m_I(t), m(t-1)\right) \\ m_I(t), & \text{otherwise} \end{cases}
S404, synthesizing the evidence obtained in the current period and the previous r-1 historical periods, calculating the total support of all evidence bodies for each evidence body.
First, the evidence obtained in the current period and the previous r-1 historical periods is synthesized, and the mutual support matrix is calculated:

C = \begin{pmatrix} C_{11} & \cdots & C_{1r} \\ \vdots & \ddots & \vdots \\ C_{r1} & \cdots & C_{rr} \end{pmatrix}

wherein C_{ij} = 1 - K(m_i, m_j) \in [0,1], i, j \in \{1, 2, \ldots, r\}, is the mutual support between evidences m_i and m_j, with C_{ij} = C_{ji}.
In the mutual support matrix, the elements of a row represent the support of all evidence bodies for one evidence body, so the sum of the elements of a row represents the total support of all evidence bodies for that evidence body; for evidence body m_i, the total support of all evidence bodies is:

Sup(m_i) = \sum_{j=1}^{r} C_{ij}

The greater the total support of an evidence body, the more it is supported by the other evidences and hence the more credible it is, so it should be given a greater weight when the multiple evidences participate in the fusion.
S405, calculating weights according to the total support of all evidence bodies for each evidence body and forming the weighted corrected evidence m_{WAE}.
The fusion weight of each evidence body is specifically defined as:

w_i = \frac{Sup(m_i)}{\sum_{j=1}^{r} Sup(m_j)}

and the weighted combined corrected evidence is calculated from the weights:

m_{WAE} = \sum_{i=1}^{r} w_i m_i
S406, fusing the weighted corrected evidence m_{WAE} multiple times to obtain the fused confidence of each class, and judging the class corresponding to the highest confidence as the final target class.
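A sketch of steps S404-S406 under the same assumptions as above (singleton-plus-Θ BPAs, the overall_distance function from the previous sketch; Dempster's orthogonal sum is assumed for the repeated fusion, and all names are hypothetical):

```python
def dempster_combine(m1, m2, classes):
    """Dempster's orthogonal sum for BPAs over singletons plus Theta
    (assumes the total conflict is strictly less than 1)."""
    focal = classes + ["Theta"]
    out = {a: 0.0 for a in focal}
    conflict = 0.0
    for a in focal:
        for b in focal:
            mass = m1.get(a, 0.0) * m2.get(b, 0.0)
            if a == b:
                out[a] += mass
            elif a == "Theta":            # Theta ∩ {b} = {b}
                out[b] += mass
            elif b == "Theta":            # {a} ∩ Theta = {a}
                out[a] += mass
            else:
                conflict += mass          # disjoint singletons
    return {a: v / (1.0 - conflict) for a, v in out.items()}

def fuse_evidence(evidences, classes):
    """Support-weighted corrected evidence m_WAE, fused r-1 times with
    itself; returns the winning class and the fused BPA."""
    r = len(evidences)
    sup = [sum(1.0 - overall_distance(mi, mj, classes) for mj in evidences)
           for mi in evidences]
    weights = [s / sum(sup) for s in sup]
    m_wae = {a: sum(w * m.get(a, 0.0) for w, m in zip(weights, evidences))
             for a in classes + ["Theta"]}
    fused = m_wae
    for _ in range(r - 1):                # r-1 Dempster combinations
        fused = dempster_combine(fused, m_wae, classes)
    return max(classes, key=lambda c: fused.get(c, 0.0)), fused

# Hypothetical usage: class confidences gathered over 3 periods
# evidences = [{"person": 0.6, "vehicle": 0.2, "Theta": 0.2}, ...]
# best_class, fused_bpa = fuse_evidence(evidences, ["person", "vehicle"])
```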
In this embodiment, heterogeneous sensor information fusion is completed by executing space-time registration, fusion detection, fusion tracking, and fusion recognition in sequence. First, the observation models and measurement data of the different sensors are registered and aligned in space and time and associated with the target information in the field of view. After space-time registration, fusion detection comprehensively considers the possible clutter of each sensor and applies a logical judgment to the detection results of the different sensors based on the first statistical decision quantity T, realizing association pairing. Based on the fusion detection result, fusion tracking associates the measurement data of the different sensors in the tracking state through the second statistical decision quantity T_2^{ij}(k), finds the targets of interest, and performs adaptive variance-weighted fusion on the target state information to eliminate the influence of measurement errors, clutter, false alarms, and interference. Finally, fusion recognition extracts the associated target feature information and realizes decision-level fusion recognition through the evidence theory method, reducing the influence of clutter and interference in the scene on target recognition and improving target recognition accuracy.
Besides information fusion between a radar sensor and an infrared imaging sensor, the method is also applicable to information fusion between other heterogeneous sensors.
The heterogeneous sensor information fusion device of the present embodiment includes:
the space-time registration module is used for acquiring data detected by the two heterogeneous sensors and carrying out time and space registration to obtain registered data;
the fusion detection module is used for acquiring measurement information of the two heterogeneous sensors at the same moment from the registered data according to the joint distribution state of the measurement information of the two heterogeneous sensors, and performing association pairing to obtain associated observation pairs;
the fusion tracking module is used for determining a suspected measurement pair from the association observation pair according to the distribution relation between the measurement information of the two heterogeneous sensors and the noise variance, and fusing the measurement information of the suspected measurement pair to obtain a fused suspected measurement pair;
and the fusion recognition module is used for carrying out decision-level fusion recognition according to the suspected measurement pair after fusion to obtain a final recognition result.
The heterogeneous sensor information fusion device in this embodiment corresponds to the heterogeneous sensor information fusion method in this embodiment in a one-to-one manner, and will not be described in detail here.
In another embodiment, the heterogeneous sensor information fusion device of the present invention may further include a processor and a memory, the memory for storing a computer program, the processor for executing the computer program to perform the method as described above.
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit the present invention in any way. Therefore, any simple modification, equivalent variation, or modification of the above embodiments according to the technical substance of the present invention shall fall within the scope of the technical solution of the present invention.

Claims (10)

1. A heterogeneous sensor information fusion method, characterized by comprising the following steps:
s01, acquiring data detected by two heterogeneous sensors, and performing time and space registration to obtain registered data;
s02, according to the joint distribution state of the measurement information of the two heterogeneous sensors, acquiring the measurement information of the two heterogeneous sensors at the same moment from the registered data to perform association pairing to obtain an association observation pair;
s03, determining a suspected measurement pair from the associated observation pair according to the joint distribution relation between the measurement information of the two heterogeneous sensors and the noise variance, and fusing the measurement information of the suspected measurement pair to obtain a fused suspected measurement pair;
and S04, performing decision-level fusion recognition according to the fused suspected measurement pairs to obtain the final recognition result.
2. The heterogeneous sensor information fusion method according to claim 1, wherein the measurement information includes an azimuth angle and a pitch angle; in step S02, a first statistical decision quantity T obeying a chi-square distribution with 2 degrees of freedom is constructed from the azimuth and pitch angles obtained by the two heterogeneous sensors; when the first statistical decision quantity T does not exceed a preset first threshold λ, the measurement values of the two heterogeneous sensors under discrimination are determined to be an associated observation pair from the same position, and otherwise they are determined not to be an associated observation pair from the same position, wherein the preset first threshold λ is set according to the chi-square distribution with 2 degrees of freedom.
3. The heterogeneous sensor information fusion method according to claim 2, wherein the calculation expression of the first statistical decision quantity T is:

T = \frac{(\hat{\theta}_1 - \hat{\theta}_2)^2}{\sigma_{\theta 1}^2 + \sigma_{\theta 2}^2} + \frac{(\hat{\varphi}_1 - \hat{\varphi}_2)^2}{\sigma_{\varphi 1}^2 + \sigma_{\varphi 2}^2}

wherein \hat{\theta}_1 and \hat{\theta}_2 are the azimuth measurement values obtained by the first sensor and the second sensor respectively, \hat{\varphi}_1 and \hat{\varphi}_2 are the pitch angle measurement values obtained by the first sensor and the second sensor respectively, \sigma_{\theta 1}^2 and \sigma_{\varphi 1}^2 are the variances of the azimuth and pitch angle measurement values obtained by the first sensor, \sigma_{\theta 2}^2 and \sigma_{\varphi 2}^2 are the variances of the azimuth and pitch angle measurement values obtained by the second sensor, and the first sensor and the second sensor are heterogeneous sensors.
4. The heterogeneous sensor information fusion method according to claim 1, wherein in step S03, determining the suspected measurement pairs from the associated observation pairs according to the joint distribution relationship between the measurement information of the heterogeneous sensors and the noise variances includes: constructing, from the azimuth and pitch angles obtained by the two heterogeneous sensors together with the azimuth measurement noise variances and pitch measurement noise variances of the two heterogeneous sensors, a second statistical decision quantity obeying a chi-square distribution with 2 degrees of freedom, where α is the probability of misjudging observations of two different targets as observations of the same target; when the second statistical decision quantity does not exceed a preset second threshold \chi_\alpha^2(2), the measurement values of the two heterogeneous sensors under discrimination are determined to be a suspected measurement pair, wherein the preset second threshold \chi_\alpha^2(2) is set according to the chi-square distribution with 2 degrees of freedom.
5. The heterogeneous sensor information fusion method according to claim 4, wherein the calculation expression of the second statistical decision quantity is:

$$T_{ij}(k)=\frac{\left(\hat{\theta}_{1}^{i}(k)-\hat{\theta}_{2}^{j}(k)\right)^{2}}{\sigma_{\theta 1}^{2}+\sigma_{\theta 2}^{2}}+\frac{\left(\hat{\varphi}_{1}^{i}(k)-\hat{\varphi}_{2}^{j}(k)\right)^{2}}{\sigma_{\varphi 1}^{2}+\sigma_{\varphi 2}^{2}}$$

wherein $T_{ij}(k)$ represents the second statistical decision quantity between the $i$-th measurement information of the first sensor and the $j$-th measurement information of the second sensor at time $k$; $\hat{\theta}_{1}^{i}(k)$, $\hat{\theta}_{2}^{j}(k)$, $\hat{\varphi}_{1}^{i}(k)$ and $\hat{\varphi}_{2}^{j}(k)$ are respectively the $i$-th azimuth angle observation of the first sensor, the $j$-th azimuth angle observation of the second sensor, the $i$-th pitch angle observation of the first sensor and the $j$-th pitch angle observation of the second sensor at time $k$; $\sigma_{\theta 1}^{2}$, $\sigma_{\varphi 1}^{2}$, $\sigma_{\theta 2}^{2}$ and $\sigma_{\varphi 2}^{2}$ are respectively the azimuth angle measurement noise variance of the first sensor, the pitch angle measurement noise variance of the first sensor, the azimuth angle measurement noise variance of the second sensor and the pitch angle measurement noise variance of the second sensor; and the first sensor and the second sensor are heterogeneous sensors.
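A vectorized Python sketch of the claim 4-5 gating over all (i, j) measurement combinations at one time instant is given below. The names and the default α are illustrative; the threshold is taken from the χ²(2) quantile as claim 4 describes, with the conventional accept-inside-the-gate rule.

```python
import numpy as np
from scipy.stats import chi2

def suspected_pairs(theta1, phi1, theta2, phi2,
                    var_t1, var_p1, var_t2, var_p2, alpha=0.01):
    """Evaluate T_ij(k) of claim 5 for every (i, j) pair and gate it
    against the second threshold lambda_alpha of claim 4."""
    theta1 = np.asarray(theta1)[:, None]  # column: first sensor's I azimuths
    phi1 = np.asarray(phi1)[:, None]
    theta2 = np.asarray(theta2)[None, :]  # row: second sensor's J azimuths
    phi2 = np.asarray(phi2)[None, :]
    T = ((theta1 - theta2) ** 2 / (var_t1 + var_t2)
         + (phi1 - phi2) ** 2 / (var_p1 + var_p2))  # shape (I, J)
    lam_alpha = chi2.ppf(1.0 - alpha, df=2)  # quantile of chi2(2)
    return np.argwhere(T <= lam_alpha)       # index pairs kept as suspected

# Two azimuth/pitch lists per sensor (radians, illustrative).
pairs = suspected_pairs([0.30, 1.10], [0.10, 0.20],
                        [0.301, 0.85], [0.099, 0.25],
                        1e-4, 1e-4, 4e-4, 4e-4)
print(pairs)  # expect [[0 0]]: only the first measurements gate together
```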
6. The heterogeneous sensor information fusion method according to any one of claims 1 to 5, wherein in the step S03, the azimuth angles of the suspected measurement pair are weighted by using the azimuth angle measurement noise variances of the two heterogeneous sensors to obtain a fused azimuth angle, and the pitch angles of the suspected measurement pair are weighted by using the pitch angle measurement noise variances of the two heterogeneous sensors to obtain a fused pitch angle.
7. The heterogeneous sensor information fusion method according to claim 6, wherein the post-fusion azimuth angle and the post-fusion pitch angle are respectively calculated according to the following formulas:

$$\hat{\theta}_{F}(k)=\frac{\sigma_{\theta 2}^{2}\,\hat{\theta}_{1}^{i}(k)+\sigma_{\theta 1}^{2}\,\hat{\theta}_{2}^{j}(k)}{\sigma_{\theta 1}^{2}+\sigma_{\theta 2}^{2}}$$

$$\hat{\varphi}_{F}(k)=\frac{\sigma_{\varphi 2}^{2}\,\hat{\varphi}_{1}^{i}(k)+\sigma_{\varphi 1}^{2}\,\hat{\varphi}_{2}^{j}(k)}{\sigma_{\varphi 1}^{2}+\sigma_{\varphi 2}^{2}}$$

wherein $\hat{\theta}_{F}(k)$ and $\hat{\varphi}_{F}(k)$ are respectively the post-fusion azimuth angle and the post-fusion pitch angle at time $k$; $\hat{\theta}_{1}^{i}(k)$, $\hat{\theta}_{2}^{j}(k)$, $\hat{\varphi}_{1}^{i}(k)$ and $\hat{\varphi}_{2}^{j}(k)$ are respectively the $i$-th azimuth angle observation of the first sensor, the $j$-th azimuth angle observation of the second sensor, the $i$-th pitch angle observation of the first sensor and the $j$-th pitch angle observation of the second sensor at time $k$; $\sigma_{\theta 1}^{2}$, $\sigma_{\varphi 1}^{2}$, $\sigma_{\theta 2}^{2}$ and $\sigma_{\varphi 2}^{2}$ are respectively the azimuth angle measurement noise variance of the first sensor, the pitch angle measurement noise variance of the first sensor, the azimuth angle measurement noise variance of the second sensor and the pitch angle measurement noise variance of the second sensor; and the first sensor and the second sensor are heterogeneous sensors.
8. The heterogeneous sensor information fusion method according to any one of claims 1 to 5, wherein in the step S04, decision-level recognition is performed by using an evidence theory method: the confidence degrees of the target classes in the recognition results obtained by the two sensors are taken as evidence bodies, the mutual support degree and the conflict strength between the two heterogeneous sensors' information are calculated to measure the contribution of each sensor's information to the final fused information, and the two heterogeneous sensors' information is then weighted according to the mutual support degree and the conflict strength to obtain the fusion result.
9. The heterogeneous sensor information fusion method according to claim 8, wherein the step of decision-level fusion recognition using the evidence theory method comprises:
S401, parameter initialization: taking the confidence degrees of the target classes in the recognition results obtained by the two heterogeneous sensors as evidence bodies, setting a basic probability assignment function for each piece of evidence, and calculating the conflict strength value and the mutual support degree value between the evidence bodies;
S402, conflict detection: judging whether the mutual support degree value is greater than a preset threshold; if so, fusing the data to obtain the current fused confidence degree and proceeding to step S404; otherwise, proceeding to step S403;
S403, respectively calculating the overall distance between each of the two heterogeneous sensors' evidence and the fused evidence of the previous time, and selecting the sensor whose evidence is closer to the fused evidence of the previous time as the evidence of the current period;
S404, calculating the total support degree of all evidence bodies for each evidence body by integrating the evidence obtained in the current period and the previous historical periods;
S405, calculating weight values according to the total support degrees of all evidence bodies, and weighting and correcting the evidence to obtain the weighted corrected evidence $m_{WAE}$;
S406, fusing the weighted corrected evidence $m_{WAE}$ multiple times to obtain the fused confidence degree of each class, and judging the class corresponding to the highest confidence degree to be the final target class.
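The claims do not publish the exact support metric, distance, or weighting formulas of S401-S406, so the Python sketch below is only a rough stand-in: it assumes basic probability assignments with singleton focal elements, Dempster's combination rule for the fusion step, and a Euclidean-distance-based mutual support measure; all three choices are assumptions, not the patent's formulas.

```python
import numpy as np

def conflict(m1, m2):
    """Conflict strength K: total mass the two evidence bodies
    assign to mutually exclusive classes."""
    return sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)

def support(m1, m2):
    """Illustrative mutual support: 1 minus the normalized Euclidean
    distance between the two mass vectors (a stand-in metric)."""
    labels = sorted(set(m1) | set(m2))
    v1 = np.array([m1.get(c, 0.0) for c in labels])
    v2 = np.array([m2.get(c, 0.0) for c in labels])
    return 1.0 - np.linalg.norm(v1 - v2) / np.sqrt(2.0)

def dempster(m1, m2):
    """Dempster's rule for singleton-only masses: keep the agreeing
    mass and renormalize by 1 - K."""
    K = conflict(m1, m2)
    return {c: m1.get(c, 0.0) * m2.get(c, 0.0) / (1.0 - K)
            for c in set(m1) | set(m2)}

# Class confidences from the two heterogeneous sensors as evidence bodies.
m_radar = {"drone": 0.7, "bird": 0.2, "clutter": 0.1}
m_eo = {"drone": 0.6, "bird": 0.3, "clutter": 0.1}
print(support(m_radar, m_eo))            # high support -> fuse directly (S402)
fused = dempster(m_radar, m_eo)
print(max(fused, key=fused.get), fused)  # highest fused confidence wins (S406)
```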
10. A heterogeneous sensor information fusion device, the device comprising:
the space-time registration module is used for acquiring data detected by the two heterogeneous sensors and carrying out time and space registration to obtain registered data;
the fusion detection module is used for acquiring measurement information of the two heterogeneous sensors at the same time from the registered data according to the joint distribution state of the measurement information of the two heterogeneous sensors, and carrying out association pairing to obtain associated observation pairs;
the fusion tracking module is used for determining a suspected measurement pair from the associated observation pairs according to the joint distribution relationship between the measurement information of the two heterogeneous sensors and the noise variances, and fusing the measurement information of the suspected measurement pair to obtain a fused suspected measurement pair;
the fusion recognition module is used for performing decision-level fusion recognition according to the fused suspected measurement pair to obtain a final recognition result;
or the device comprises a processor and a memory for storing a computer program, the processor being configured to execute the computer program to perform the method according to any one of claims 1 to 9.
CN202211676310.6A 2022-12-26 2022-12-26 Heterogeneous sensor information fusion method and device Pending CN116340736A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211676310.6A CN116340736A (en) 2022-12-26 2022-12-26 Heterogeneous sensor information fusion method and device

Publications (1)

Publication Number Publication Date
CN116340736A (en) 2023-06-27

Family

ID=86878068


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117034201A (en) * 2023-10-08 2023-11-10 东营航空产业技术研究院 Multi-source real-time data fusion method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination