CN112633703A - Air condition fusion precision evaluation system and method - Google Patents


Info

Publication number
CN112633703A
CN112633703A (application CN202011558976.2A)
Authority
CN
China
Prior art keywords
precision
data
sensor
track
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011558976.2A
Other languages
Chinese (zh)
Inventor
吴晓朝
王雷钢
王建路
戴幻尧
周波
孔德培
石川
王琼
吴正雄
徐娜娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
UNIT 63892 OF PLA
Original Assignee
UNIT 63892 OF PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by UNIT 63892 OF PLA filed Critical UNIT 63892 OF PLA
Priority to CN202011558976.2A
Publication of CN112633703A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Educational Administration (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention belongs to the technical field of multi-source information fusion and evaluation, and discloses an air condition fusion precision evaluation system and method. The method determines the correspondence between spatial coordinate data and timestamps from data stored according to a read protocol, converts the data to longitude, latitude and elevation using the configured sensor reference coordinates, calculates the detection precision of each sensor, obtains the sensor priority ranking, derives the final reference precision according to the selected evaluation method, and uses that reference precision as the evaluation benchmark to compute the evaluation result of the air condition fusion and give the final evaluation information and result. By means of an evaluation algorithm based on the proportion of each sensor's detection data participating in fusion within each time-space domain, together with the final precision benchmark and the final fusion precision, the invention can evaluate and appraise the fusion precision of an air condition fusion system operating in a simulation environment, a real-equipment environment or a mixed environment, with good adaptability and reliability.

Description

Air condition fusion precision evaluation system and method
Technical Field
The invention belongs to the technical field of multi-source information fusion and evaluation, and particularly relates to an air condition fusion precision evaluation system and method suitable for multiple sensors and multiple targets in a complex environment.
Background
The air condition fusion system is one of the core components of an air defense command and control system and an important information source for battlefield situation judgment and combat decision-making. It plays an important role in the command and control system's battlefield perception, command decisions and fire strikes, and its performance and effectiveness directly affect the survivability of the air defense combat system. The precision index is one of the most important indexes of the air condition fusion system: it measures how accurately the fused track reflects the sensor detections after the fusion system extracts, tracks and comprehensively processes the acquired target information. It reflects the system's tracking and information-guidance capability and is an important basis for measuring the performance of the fusion algorithm. According to information fusion theory, an excellent fusion algorithm can improve the accuracy of track information. However, the battlefield environment is complex; the sensors (mainly radar) face many uncertain factors in the target detection process, such as electromagnetic interference, airflow disturbance, terrain masking and the sensors' own detection characteristics. These factors cause uncertainty and discontinuity in the sensor detection data, degrade the fusion precision of the air condition fusion system, and increase the difficulty of precision evaluation. How to determine the precision evaluation method and benchmark is therefore a problem that urgently needs to be solved.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides an air condition fusion precision evaluation system and method aimed at the precision evaluation and appraisal of an air condition fusion system in a complex environment.
In order to achieve the purpose, the invention adopts the following technical scheme:
an air condition fusion precision evaluation system comprises a data input module, a data processing and precision evaluation module and an information display module, wherein the data input module is electrically connected with the data processing and precision evaluation module through a BLH data converter, and the output end of the data processing and precision evaluation module is electrically connected with the information display module through a cable; the data input module is electrically connected with the coordinate conversion module; the data end of the data processing and precision evaluation module is connected with the comprehensive evaluation module through a precision calculation module and a data analysis module to form a loop;
the data input module is a unified information input format module and configures the precision priority of the sensor; the data processing and precision evaluating module consists of a highest precision module, a lowest precision module, an average precision module and a self-adaptive weight module, and the information display module is used for outputting and displaying an evaluation result.
An air condition fusion precision evaluation method determines the correspondence between spatial coordinate data and timestamps from data stored according to a read protocol, converts the data to longitude, latitude and elevation using the configured sensor reference coordinates, calculates the detection precision of each sensor, obtains the sensor priority ranking, derives the final reference precision according to the selected evaluation method, uses that reference precision as the evaluation benchmark to compute the evaluation result of the air condition fusion, and gives the final evaluation information and result; the method comprises the following steps:
firstly, reading the collected sensor information data and the track truth data; if the data are simulation data, the track truth data are the original data generated by the simulation; if the data come from real-equipment sensors, the track truth data are high-precision measurement data;
secondly, determining the correspondence between the spatial coordinate data and the timestamps according to the protocol format, calculating the detection precision of each sensor, and obtaining the sensor priority ranking;
protocol format: the method is used for realizing the data compatibility and the unification of the processing method, and defining the standard input protocol format; the data format comprises sensor data, flight path truth value data and fusion data;
the sensor data input format is: track batch number ID + reference-frame detection data (xData, yData, zData) + timestamp Time, where the track batch number is defined as the ID number of one and the same target, the reference-frame detection data are the three-axis detections acquired relative to the sensor reference position, and the timestamp is the time at which a discrete data point was acquired;
the track truth data cover two cases: truth data obtained from simulation, and target data acquired by high-precision detection equipment. The truth record format for the first case is: track batch number ID + original data (B, L, H) + timestamp Time, where the original data are the target simulation data in longitude B, latitude L and elevation H; in the second case, the track truth data format is the same as the sensor data format;
the recording format of the fused data is track batch number ID + original data (B, L, H) + timestamp Time;
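The three record layouts above can be sketched as Python dataclasses; this is a minimal illustration, not part of the patent — the class names are assumptions, while the field names (ID, xData/yData/zData, B/L/H, Time) follow the text:

```python
from dataclasses import dataclass

@dataclass
class SensorRecord:
    track_id: int    # track batch number ID (same target -> same ID)
    xData: float     # three-axis detection relative to the sensor reference position
    yData: float
    zData: float
    time: float      # timestamp of this discrete detection

@dataclass
class TruthRecord:
    track_id: int    # track batch number ID
    B: float         # longitude
    L: float         # latitude
    H: float         # elevation
    time: float      # timestamp

# Per the text, fused data use the same (ID, B, L, H, Time) layout as truth data.
FusedRecord = TruthRecord
```

Keeping the fused and truth layouts identical mirrors the protocol's intent that fused output is directly comparable with the truth track.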
thirdly, determining the reference precision: to determine the reference precision, the actual detection precision of each sensor must be calculated. Let N sensors participate in the fusion; the detection precision σ_i of the i-th sensor is calculated from its detection data as

σ_i = sqrt( (1/n) · Σ_{j=1}^{n} (x_ij − x̂_ij)² )

where x_ij is the j-th detection point of the i-th sensor, x̂_ij is the corresponding truth track point, and n is the number of track data points of the i-th sensor. If the highest-precision evaluation method is selected, the precision of the most precise sensor is taken as the reference precision; if the lowest-precision evaluation method is selected, the precision of the least precise sensor is taken; if the average-precision evaluation method is selected, the average sensor precision is taken, i.e. σ_w = (Σ σ_i)/N; if the adaptive weight rule is selected, the calculation proceeds as follows:
1) according to the calculated sensor detection precisions σ_i, sort the sensors from high to low precision to construct the sensor precision sequence R_i ∈ (R_1, R_2, …, R_N);
2) match the sensor tracks to the truth tracks by association. Track association uses spatial distance, taking the minimum spatial distance between a sensor track and a truth track as the matching principle. Suppose the computed spatial distances between the j-th track of sensor R_i and each truth track are:

D_j = {d_1, d_2, …, d_k}  (1)

where k is the number of truth tracks. The truth track corresponding to the minimum value in D_j is associated with the j-th sensor track; that is, if d_k is the minimum, the k-th truth track is associated with the j-th sensor track. In this way, the association match between every track of sensor R_i and the truth tracks is found in turn, and likewise the tracks of all other sensors determine their associated truth tracks. One truth track may be associated with several tracks of the same sensor. Suppose truth track j is associated with n tracks of sensor R_i; then the set of time-domain intervals of sensor R_i associated with truth track j is:

T_j^{R_i} = {[t_11^{R_i}, t_12^{R_i}], [t_21^{R_i}, t_22^{R_i}], …, [t_n1^{R_i}, t_n2^{R_i}]}  (2)

where [t_n1^{R_i}, t_n2^{R_i}] denotes the time-domain interval of the n-th detection track of sensor R_i, t_n1^{R_i} is the start time of the n-th detection track and t_n2^{R_i} is its end time. Setting the number of truth tracks to K, the set of time-domain intervals associated with sensor R_i is:

T^{R_i} = {T_1^{R_i}, T_2^{R_i}, …, T_K^{R_i}}  (3)
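The minimum-distance association of step 2) can be sketched as follows; a toy illustration with 1-D "tracks" and a caller-supplied distance function (a real system would compare 3-D point sequences), with all names assumptions of this sketch:

```python
def associate_tracks(sensor_tracks, truth_tracks, distance):
    """For every sensor track j, compute D_j = {d_1, ..., d_k} against all
    k truth tracks (equation (1)) and associate the truth track with the
    minimum spatial distance."""
    assoc = {}
    for j, s_trk in sensor_tracks.items():
        d = {k: distance(s_trk, t_trk) for k, t_trk in truth_tracks.items()}
        assoc[j] = min(d, key=d.get)   # truth track at minimum distance
    return assoc

# Toy usage: scalar positions stand in for whole tracks.
dist = lambda a, b: abs(a - b)
pairs = associate_tracks({"s1": 10.0, "s2": 52.0}, {"t1": 11.0, "t2": 50.0}, dist)
```

Note that several sensor tracks may map to the same truth track, exactly as the text allows.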
3) the fusion tracks are matched to the truth tracks statistically by the same method as in step 2); the set of time-domain intervals of truth track j associated with its m fusion tracks is calculated as:
Fsj={[t11,t12],[t21,t22],…,[tm1,tm2]} (4)
then the total set of fusion track time domain intervals is:
Fs={Fs1,Fs2,…,FsK} (5)
Let the total number of fused tracks be M; the time cumulant of the fused tracks is:

T_s = Σ_{j=1}^{K} T_sj  (6)

where T_sj is the fusion duration of the j-th truth track, and

T_sj = Σ_{l=1}^{L_j} (t_l2 − t_l1)

with L_j the number of fused tracks associated with truth track j, i.e. the number of time-domain intervals in F_sj; T_s is thus the sum of the durations of all the time-domain intervals in F_s;
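The fused-track time cumulant of equations (4)–(6) is just a nested interval-duration sum; a minimal sketch:

```python
def time_cumulant(F_s):
    """T_s = sum over truth tracks j of T_sj, where T_sj is the summed
    duration of the intervals [t_l1, t_l2] in F_sj (equations (4)-(6)).
    F_s is a list of per-truth-track interval lists."""
    return sum(t2 - t1 for F_sj in F_s for (t1, t2) in F_sj)
```

For example, two truth tracks fused over intervals {[0, 5], [10, 12]} and {[20, 21]} give T_s = 5 + 2 + 1 = 8.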
4) calculate the precision time cumulant T_i of each sensor;
5) obtain the precision weight of sensor R_i:

w_i = T_i / T_s  (7)

If Σw_i ≠ 1 because of calculation error, the w_i must be renormalized;
6) obtain the reference precision σ_w according to the formula σ_w = Σ w_i σ_i.
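Steps 5) and 6) of the adaptive weight rule, including the renormalization guard, can be sketched as:

```python
def adaptive_reference_precision(T, T_s, sigmas):
    """w_i = T_i / T_s (equation (7)); renormalize if calculation error
    keeps the weights from summing to 1, then sigma_w = sum_i w_i*sigma_i."""
    w = [T_i / T_s for T_i in T]
    s = sum(w)
    if abs(s - 1.0) > 1e-12:           # renormalize on calculation error
        w = [w_i / s for w_i in w]
    return sum(w_i * s_i for w_i, s_i in zip(w, sigmas))
```

With T = [2, 2], T_s = 4 and σ = [10, 20], the weights are 0.5 each and σ_w = 15; when the T_i do not exactly cover T_s, the renormalization keeps the weighted sum well defined.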
The calculation of a sensor's precision time cumulant is a sub-process of obtaining the reference precision in the adaptive weight method. The remaining fusion time domain is calculated by removing the already-processed sensor time-domain intervals from the fusion time-domain intervals, where I is the complete fusion time-domain set; from this remaining period, the time-domain intervals of sensor R_i that actually acted in the fusion are extracted, and these intervals are then used to calculate the precision time cumulant of sensor R_i:

T_i = Σ_{j=1}^{m_i} T̃_j^{R_i}  (8)

where m_i is the number of tracks of sensor R_i and T̃_j^{R_i} is the duration of the j-th extracted track interval of sensor R_i, calculated, in the same way as in formula (6), as the difference between the end time and the start time of the track time interval;
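The core of this sub-process is interval-set bookkeeping: measuring how much of sensor R_i's detection time domain actually lies inside the (remaining) fusion time domain. A minimal sketch of that overlap accumulation — the prior subtraction of already-processed sensors' intervals is left to the caller, and the function name is an assumption:

```python
def overlap_duration(sensor_intervals, remaining_fusion_intervals):
    """Extract the portions of sensor R_i's intervals that fall inside the
    remaining fusion time domain and accumulate their durations -- the
    precision time cumulant T_i of equation (8). Each argument is a list
    of (start, end) pairs."""
    total = 0.0
    for a1, a2 in sensor_intervals:
        for b1, b2 in remaining_fusion_intervals:
            lo, hi = max(a1, b1), min(a2, b2)   # intersection of two intervals
            if hi > lo:
                total += hi - lo
    return total
```

For instance, a sensor track over [0, 10] against a remaining fusion interval [5, 20] contributes 5 time units to T_i.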
fourthly, the evaluation method: the information of a track is determined by three coordinates, i.e. the three groups of data of longitude, latitude and elevation that represent one track datum, so the track precision correspondingly yields three values; the final precision, covering both the fusion precision and the reference precision, is therefore a trade-off of these three values: σ_z = (Σ w·σ_w)/3, where the weights w are determined by an expert scoring method according to the importance of the three values; if the three values are scored a_1, a_2, a_3, the three weights are a_1/Σa_i, a_2/Σa_i and a_3/Σa_i. After the final precision is determined, the final evaluation result is determined by comparing the final fusion precision with the final reference precision. The evaluation methods comprise a comparison method, a percentage scoring method and a satisfaction evaluation method:
1) the comparison method comprises the following steps: if the fusion precision is greater than the reference precision, judging that the fusion precision is unqualified, otherwise, judging that the fusion precision is qualified;
2) the percentage scoring method: taking the final reference precision as the standard, determine the percentage by which the final fusion precision is greater or less than the final reference precision, and assign scores to different percentage bands; e.g. a score of 90 when the deviation equals 0, 80 when it lies between 0 and 10%, and 95 when it lies between 0 and −10%;
3) the satisfaction evaluation method: compute the final result using the formula f = exp(−0.1054·σ_l/σ_w), where σ_l is the final fusion precision and σ_w is the final reference precision.
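The three evaluation methods can be sketched directly; a minimal illustration following the text (the scoring bands outside ±10% are not defined by the text, so the sketch returns None there):

```python
import math

def compare(sigma_l, sigma_w):
    """Comparison method: fusion precision worse (larger sigma) than the
    reference -> unqualified, otherwise qualified."""
    return "unqualified" if sigma_l > sigma_w else "qualified"

def percentage_score(sigma_l, sigma_w):
    """Percentage scoring: score the relative deviation of the final fusion
    precision from the final reference precision (bands from the text)."""
    dev = (sigma_l - sigma_w) / sigma_w
    if dev == 0:
        return 90
    if 0 < dev <= 0.10:
        return 80
    if -0.10 <= dev < 0:
        return 95
    return None   # the text defines no bands beyond +/-10%

def satisfaction(sigma_l, sigma_w):
    """f = exp(-0.1054 * sigma_l / sigma_w); roughly 0.9 when the fusion
    precision equals the reference precision, since 0.1054 ~ -ln(0.9)."""
    return math.exp(-0.1054 * sigma_l / sigma_w)
```

The constant 0.1054 appears chosen so that matching the reference exactly yields a satisfaction near 0.9, with f rising toward 1 as σ_l shrinks below σ_w.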
Due to the adoption of the technical scheme, the invention has the following advantages:
an air condition fusion precision evaluation system and method can evaluate and identify the fusion precision of the air condition fusion system working in a simulation environment, a real-installation environment or a mixed environment. The system is suitable for precision evaluation of a multi-sensor multi-target air condition fusion system in a complex environment, namely, the precision evaluation of the air condition fusion system in a simulation environment, a real-mounted environment or a mixed environment. Has good adaptability and reliability.
On the basis of analyzing the precision ranking of each sensor's actual detection data, the system calculates the reference precision from the proportion of the sensors' detection data participating in fusion in each time-space domain, and uses it as the basis for evaluating the fusion precision. This evaluation algorithm of final precision benchmark against final fusion precision yields the evaluation result of the air condition fusion precision.
Drawings
Fig. 1 is a block diagram of an air condition fusion precision evaluation system.
Detailed Description
As shown in fig. 1, an air condition fusion precision evaluation system includes a data input module, a data processing and precision evaluation module, and an information display module, wherein the data input module is connected to the data processing and precision evaluation module through BLH data conversion, and the output end of the data processing and precision evaluation module is connected to the information display module; the data input module is connected to the coordinate conversion module; the data end of the data processing and precision evaluation module is connected to the comprehensive evaluation module through the precision calculation module and the data analysis module to form a loop;
the data input module is a unified information input format module and configures the precision priority of the sensor; the data processing and precision evaluating module consists of a highest precision module, a lowest precision module, an average precision module and a self-adaptive weight module, and the information display module is used for outputting and displaying an evaluation result.
An air condition fusion precision evaluation method reads the collected sensor information data and track truth data; if the data are simulation data, the track truth data are the original data generated by the simulation; if the data come from real-equipment sensors, the track truth data are high-precision measurement data. The correspondence between spatial coordinate data and timestamps is determined according to the designed protocol format, the detection precision of each sensor is calculated, and the sensor priority ranking is obtained. The reference precision is determined by one of four reference precision calculation methods, namely the highest-precision, lowest-precision, average-precision and adaptive weight methods; the precision index is then evaluated with the selected evaluation method, including the comparison method, expert scoring method and satisfaction evaluation method, and the final evaluation information and result are given.
To address the inaccurate evaluation of fused track precision caused by the temporal and spatial uncertainty of multi-sensor track detection information, the method quantitatively designs the precision weight of each sensor by reasonably weighing factors such as detection time and detection data measurement precision, fuses the air situation data objectively and scientifically on the basis of a reference precision for evaluating the fusion precision, and determines the performance of the fusion system and the traceability of fusion problems through the calculation and analysis of the system's various evaluation methods.
Protocol format: in order to realize data compatibility and uniform processing method, a standard input protocol format is defined. The data formats are mainly three, namely sensor data, track truth value data and fusion data.
The sensor data input format is: track batch number (ID) + reference coordinate system detection data (xData, yData, zData) + timestamp (Time), wherein the track batch number is defined as the ID number of the same target, the reference coordinate system detection data is the data of three axes of sensor detection acquired aiming at the reference position of the sensor, and the timestamp is the current Time for acquiring a certain discrete data.
The flight path truth value data has two conditions, one is truth value data acquired by simulation, the other is target data acquired by high-precision detection equipment, and the recording format of the flight path truth value of the first condition is as follows: track batch number (ID) + original data (B, L, H) + timestamp (Time), the original data being data of the target simulation data in longitude (B), latitude (L) and elevation (H); the second case has the same track truth data format as the sensor data format.
The format of the fused data record is track batch number (ID) + original data (B, L, H) + timestamp (Time).
Determining the reference precision: to determine the reference precision, the actual detection precision of each sensor needs to be calculated. If N sensors participate in the fusion, the detection precision σ_i of the i-th sensor is calculated from its detection data as

σ_i = sqrt( (1/n) · Σ_{j=1}^{n} (x_ij − x̂_ij)² )

where x_ij is the j-th detection point of the i-th sensor, x̂_ij is the corresponding truth track point, and n is the number of track data points of the i-th sensor. If the highest-precision evaluation method is selected, the precision of the most precise sensor is taken as the reference precision; if the lowest-precision evaluation method is selected, the precision of the least precise sensor is taken; if the average-precision evaluation method is selected, the average sensor precision is taken, i.e. σ_w = (Σ σ_i)/N; if the adaptive weight rule is selected, the calculation proceeds as follows:
1) according to the calculated sensor detection precisions σ_i, sort the sensors from high to low precision to construct the sensor precision sequence R_i ∈ (R_1, R_2, …, R_N);
2) match the sensor tracks to the truth tracks by association. Track association uses spatial distance, taking the minimum spatial distance between a sensor track and a truth track as the matching principle. Suppose the computed spatial distances between the j-th track of sensor R_i and each truth track are:

D_j = {d_1, d_2, …, d_k}  (9)

where k is the number of truth tracks. The truth track corresponding to the minimum value in D_j is associated with the j-th sensor track; that is, if d_k is the minimum, the k-th truth track is associated with the j-th sensor track. In this way, the association match between every track of sensor R_i and the truth tracks can be found in turn, and likewise the tracks of all other sensors determine their associated truth tracks. One truth track may be associated with several tracks of the same sensor. Suppose truth track j is associated with n tracks of sensor R_i; then the set of time-domain intervals of sensor R_i associated with truth track j is:

T_j^{R_i} = {[t_11^{R_i}, t_12^{R_i}], [t_21^{R_i}, t_22^{R_i}], …, [t_n1^{R_i}, t_n2^{R_i}]}  (10)

where [t_n1^{R_i}, t_n2^{R_i}] denotes the time-domain interval of the n-th detection track of sensor R_i, t_n1^{R_i} is the start time of the n-th detection track and t_n2^{R_i} is its end time. Setting the number of truth tracks to K, the set of time-domain intervals associated with sensor R_i is:

T^{R_i} = {T_1^{R_i}, T_2^{R_i}, …, T_K^{R_i}}  (11)
3) the fusion tracks are matched to the truth tracks statistically by the same method as in 2); the set of time-domain intervals of truth track j associated with its m fusion tracks can be calculated as:
Fsj={[t11,t12],[t21,t22],…,[tm1,tm2]} (12)
then the total set of fusion track time domain intervals is:
Fs={Fs1,Fs2,…,FsK} (13)
Let the total number of fused tracks be M; the time cumulant of the fused tracks is:

T_s = Σ_{j=1}^{K} T_sj  (14)

where T_sj is the fusion duration of the j-th truth track, and

T_sj = Σ_{l=1}^{L_j} (t_l2 − t_l1)

with L_j the number of fused tracks associated with truth track j, i.e. the number of time-domain intervals in F_sj. It can be seen that T_s is the sum of the durations of all the time-domain intervals in F_s;
4) calculate the precision time cumulant T_i of each sensor;
5) the precision weight of sensor R_i can then be obtained:

w_i = T_i / T_s  (15)

If Σw_i ≠ 1 because of calculation error, the w_i must be renormalized.
6) obtain the reference precision σ_w according to the formula σ_w = Σ w_i σ_i.
The calculation of a sensor's precision time cumulant is a sub-process of obtaining the reference precision in the adaptive weight method. The remaining fusion time domain is calculated by removing the already-processed sensor time-domain intervals from the fusion time-domain intervals, where I is the complete fusion time-domain set; from this remaining period, the time-domain intervals of sensor R_i that actually acted in the fusion are extracted, and these intervals are then used to calculate the precision time cumulant of sensor R_i:

T_i = Σ_{j=1}^{m_i} T̃_j^{R_i}  (16)

where m_i is the number of tracks of sensor R_i and T̃_j^{R_i} is the duration of the j-th extracted track interval of sensor R_i, calculated, in the same way as in formula (6), as the difference between the end time and the start time of the track time interval.
The evaluation method: the information of a track is determined by three coordinates, i.e. the three groups of data of longitude, latitude and elevation that represent one track datum, so the track precision correspondingly yields three values; the final precision, covering both the fusion precision and the reference precision, is therefore a trade-off of these three values: σ_z = (Σ w·σ_w)/3, where w can be determined by an expert scoring method according to the importance of the three values, e.g. if the three values are scored a_1, a_2, a_3, the three weights are a_1/Σa_i, a_2/Σa_i and a_3/Σa_i. After the final precision is determined, the final evaluation result can be determined by comparing the final fusion precision with the final reference precision. The evaluation methods comprise the comparison method, the expert scoring method and the satisfaction evaluation method. The comparison method: if the fusion precision is greater than the reference precision, the fusion precision is judged unqualified, otherwise qualified. The expert scoring method: taking the final reference precision as the standard, determine the percentage by which the final fusion precision is greater or less than the final reference precision and assign scores to different percentage bands, e.g. 90 at 0, 80 between 0 and 10%, and 95 between 0 and −10%. The satisfaction evaluation method: compute the final result using the formula f = exp(−0.1054·σ_l/σ_w), where σ_l is the final fusion precision and σ_w is the final reference precision.
The working process is as follows: the working process mainly comprises 3 parts, as shown in the figure, software reads data stored according to a protocol, converts the data into longitude and latitude heights according to set sensor reference coordinates, respectively calculates the precision of each sensor, obtains the final reference precision according to a set evaluation method, takes the reference precision as an evaluation reference, calculates the evaluation result of empty fusion, and outputs information for interface display.

Claims (2)

1. An air condition fusion precision evaluation system, characterized in that: the system comprises a data input module, a data processing and precision evaluation module and an information display module, wherein the data input module is electrically connected to the data processing and precision evaluation module through a BLH data converter, and the output end of the data processing and precision evaluation module is electrically connected to the information display module through a cable; the data input module is electrically connected to the coordinate conversion module; the data end of the data processing and precision evaluation module is connected to the comprehensive evaluation module through the precision calculation module and the data analysis module to form a loop;
the data input module is a unified information input format module and configures the precision priority of the sensor; the data processing and precision evaluating module consists of a highest precision module, a lowest precision module, an average precision module and a self-adaptive weight module, and the information display module is used for outputting and displaying an evaluation result.
2. The evaluation method of the air condition fusion precision evaluation system according to claim 1, characterized in that: the correspondence between spatial coordinate data and timestamps is determined from data stored according to a read protocol; the data are converted to longitude, latitude and elevation using the configured sensor reference coordinates; the detection precision of each sensor is calculated and the sensor priority ranking obtained; the final reference precision is obtained according to the selected evaluation method and taken as the evaluation benchmark; the evaluation result of the air condition fusion is calculated; and the final evaluation information and result are given; the method comprises the following steps:
firstly, reading the collected sensor information data and the track truth data: for simulated data, the track truth data are the original data generated by the simulation; for actual sensor data, the track truth data are high-precision measurement data;
secondly, determining the correspondence between the spatial coordinate data and the timestamps according to the protocol format, calculating the detection precision of each sensor, and obtaining the sensor priority ranking;
protocol format: a standard input protocol format is defined to achieve data compatibility and a unified processing method; the data formats comprise sensor data, track truth data and fusion data;
the sensor data input format is: track batch number ID + reference-frame detection data (xData, yData, zData) + timestamp Time, where the track batch number is the ID number of a given target, the reference-frame detection data are the three-axis detection values acquired relative to the sensor reference position, and the timestamp is the time at which a given discrete data point was acquired;
the track truth data arise in two cases: truth data obtained by simulation, or target data obtained by high-precision detection equipment. In the first case the track truth record format is: track batch number ID + original data (B, L, H) + timestamp Time, where the original data are the target simulation data in longitude B, latitude L and elevation H; in the second case the track truth data format is the same as the sensor data format;
the fusion data record format is: track batch number ID + original data (B, L, H) + timestamp Time;
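The three record formats can be sketched as simple typed records (the field names are illustrative; the patent only fixes the field order ID + data + timestamp):

```python
from dataclasses import dataclass

@dataclass
class SensorRecord:
    """Track batch number + reference-frame detection data + timestamp."""
    track_id: int        # track batch number ID (same target -> same ID)
    x: float             # xData: detection along the sensor reference x-axis
    y: float             # yData
    z: float             # zData
    t: float             # Time: acquisition timestamp of this discrete point

@dataclass
class TruthRecord:
    """Track batch number + (B, L, H) + timestamp; fusion data share this format."""
    track_id: int
    lon_b: float         # longitude B
    lat_l: float         # latitude L
    alt_h: float         # elevation H
    t: float
```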
thirdly, determining the reference precision: let N sensors participate in the fusion, and calculate the detection precision σ_i of each sensor from its detection data by the formula

σ_i = √( (1/n) Σ_{j=1..n} (x_ij − x̂_ij)² )

where x_ij is the j-th detection point of the i-th sensor, x̂_ij is the corresponding track truth point, and n is the number of track data points of the i-th sensor. If the highest-precision evaluation method is selected, the sensor with the highest precision provides the reference precision; if the lowest-precision evaluation method is selected, the sensor with the lowest precision provides the reference precision; if the average-precision evaluation method is selected, the average precision of the sensors is the reference precision, i.e. σ_w = (Σ σ_i)/N; if the adaptive-weight rule is selected, the calculation proceeds as follows:
1) Sort the sensors from high to low by the calculated detection precision σ_i to construct the sensor precision sequence R_i ∈ (R_1, R_2, …, R_N);
2) Associate and match the sensor tracks with the true tracks. Track association uses spatial distance, with the minimum spatial distance between a sensor track and a true track as the matching criterion. For sensor R_i, let the spatial distances between its j-th track and each of the true tracks be

D_j = {d_1, d_2, …, d_k}    (1)

where k is the number of true tracks. The true track corresponding to the minimum value in D_j is associated with the j-th sensor track; that is, if d_k is the minimum, the k-th true track is associated with the j-th sensor track. In this way an associated true track is found in turn for every track of sensor R_i, and likewise for the tracks of all other sensors. One true track may be associated with multiple tracks of a sensor; if true track j is associated with n tracks of sensor R_i, the set of associated time-domain intervals is

Fr_i^j = {[t_1^s, t_1^e], [t_2^s, t_2^e], …, [t_n^s, t_n^e]}    (2)

where [t_n^s, t_n^e] denotes the time-domain interval of the n-th detection track of sensor R_i, t_n^s being the start time of that track and t_n^e its end time. Setting the number of true tracks to K, the full set of time-domain intervals associated with sensor R_i is

Fr_i = {Fr_i^1, Fr_i^2, …, Fr_i^K}    (3)
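Step 2) can be sketched as follows, pairing each sensor track with the nearest true track and collecting the associated time-domain intervals of formulas (2) and (3). This is a minimal sketch under assumptions not in the patent: `distance` stands in for whatever spatial-distance metric is used, and each track point carries its timestamp as the last tuple element:

```python
def associate(sensor_tracks, truth_tracks, distance):
    """Associate each sensor track with the true track at minimum spatial distance.

    sensor_tracks, truth_tracks: {track_id: [point, ...]} with the timestamp
    as the last element of each point; distance(a, b) is the track metric.
    Returns {truth_id: [(t_start, t_end), ...]} -- the interval sets.
    """
    intervals = {tid: [] for tid in truth_tracks}
    for s_id, s_trk in sensor_tracks.items():
        # D_j = {d_1, ..., d_k}: distances to every true track (formula (1))
        best = min(truth_tracks, key=lambda tid: distance(s_trk, truth_tracks[tid]))
        times = [p[-1] for p in s_trk]            # timestamps of this track
        intervals[best].append((min(times), max(times)))
    return intervals
```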
3) Match the fusion tracks with the true tracks by the same method as step 2), and compute the set of time-domain intervals of the m fusion tracks associated with true track j:

Fs_j = {[t_11, t_12], [t_21, t_22], …, [t_m1, t_m2]}    (4)

The total set of fusion-track time-domain intervals is then

Fs = {Fs_1, Fs_2, …, Fs_K}    (5)

If the total number of fused tracks is M, the time cumulant of the fused tracks is

T_s = Σ_{j=1..K} Ts_j    (6)

where Ts_j is the fusion duration of the j-th true track,

Ts_j = Σ_{l=1..L_j} (t_l2 − t_l1)

and L_j is the number of fused tracks associated with true track j, i.e. the number of time-domain intervals in Fs_j; T_s is thus the sum of the durations of all time-domain intervals in Fs;
4) Calculate the precision time cumulant T_i of each sensor;
5) Obtain the precision weight of sensor R_i:

w_i = T_i / T_s    (7)

If Σ w_i ≠ 1 because of calculation error, the w_i must be renormalized;
6) Obtain the reference precision from σ_w = Σ w_i σ_i.
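Steps 4)–6) of the adaptive-weight rule can be sketched as follows (a minimal sketch assuming the per-sensor precisions σ_i and time cumulants T_i have already been computed; the function name is illustrative):

```python
def adaptive_reference(sigmas, time_cumulants, total_time):
    """sigma_w = sum(w_i * sigma_i) with w_i = T_i / T_s (formula (7)).

    sigmas: per-sensor detection precisions sigma_i;
    time_cumulants: per-sensor precision time cumulants T_i;
    total_time: fusion-track time cumulant T_s.
    """
    weights = [t / total_time for t in time_cumulants]
    s = sum(weights)
    if abs(s - 1.0) > 1e-12:           # guard against calculation error:
        weights = [w / s for w in weights]   # renormalize so sum(w_i) == 1
    return sum(w * sig for w, sig in zip(weights, sigmas))
```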
The calculation of the sensor precision time cumulant is a sub-process of obtaining the reference precision by the adaptive-weight method. The remaining fusion time domain I_r is obtained by subtracting the already-processed sensor time-domain intervals from the fusion time-domain intervals, where I is the complete fusion time-domain set; the time-domain intervals of sensor R_i that actually acted in the fusion during the remaining period are then extracted as the overlap of Fr_i with I_r, and the precision time cumulant of sensor R_i is calculated as

T_i = Σ_{j=1..m_i} ΔT_ij    (8)

where m_i is the number of tracks of sensor R_i and ΔT_ij is the duration of the j-th such track interval, calculated as the difference between the end time and the start time of the track time interval, in the same way as formula (6);
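The precision-time-cumulant sub-process can be sketched with simple interval arithmetic (a simplified sketch assuming intervals are (start, end) pairs; the helper names are illustrative and mirror formula (8)):

```python
def subtract_intervals(interval, used):
    """Remove the already-processed intervals `used` from one fusion interval."""
    segments = [interval]
    for u0, u1 in used:
        next_segments = []
        for s0, s1 in segments:
            if u1 <= s0 or u0 >= s1:       # no overlap: keep the segment whole
                next_segments.append((s0, s1))
            else:                          # clip the overlapping part out
                if s0 < u0:
                    next_segments.append((s0, u0))
                if u1 < s1:
                    next_segments.append((u1, s1))
        segments = next_segments
    return segments

def precision_time_cumulant(sensor_intervals, fusion_intervals, used):
    """T_i: total duration of sensor R_i's track intervals that fall inside
    the remaining fusion time domain (formula (8)); durations as in (6)."""
    remaining = []
    for iv in fusion_intervals:
        remaining.extend(subtract_intervals(iv, used))
    total = 0.0
    for a0, a1 in sensor_intervals:
        for r0, r1 in remaining:
            lo, hi = max(a0, r0), min(a1, r1)   # overlap of track and remainder
            if hi > lo:
                total += hi - lo
    return total
```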
fourthly, the evaluation method: the information of one track consists of three groups of data, the longitude, latitude and elevation determined for each coordinate; the track precision accordingly yields 3 values, so both the fusion precision and the reference precision are triples. The final precision is a trade-off of the 3 values: σ_z = (Σ w·σ_w)/3, where w is determined by an expert scoring method according to the importance of the three values; if the 3 values are scored a_1, a_2, a_3, the three values of w are a_1/Σa_i, a_2/Σa_i, a_3/Σa_i. After the final fusion precision and the final reference precision are determined, the final evaluation result is obtained by comparing them; the evaluation methods comprise a comparison method, a percentage scoring method and a satisfaction evaluation method;
1) The comparison method: if the fusion precision is greater than the reference precision, the fusion is judged unqualified; otherwise it is judged qualified;
2) The percentage scoring method: taking the final reference precision as the standard, the percentage by which the final fusion precision exceeds or falls below it is determined, and different percentage regions are scored: 90 when the percentage equals 0, 80 when between 0 and 10%, and 95 when between 0 and −10%;
3) The satisfaction evaluation method: the final result is computed with the formula f = exp(−0.1054·σ_l/σ_w), where σ_l is the final fusion precision and σ_w is the final reference precision; since 0.1054 ≈ −ln 0.9, f = 0.9 when the fusion precision equals the reference precision.
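The three evaluation methods can be sketched together. This is a sketch under one stated assumption: the satisfaction formula is read as f = exp(−0.1054·σ_l/σ_w), a reconstruction chosen because it yields f = 0.9 when the two precisions are equal, consistent with the score of 90 at zero deviation in the percentage method:

```python
import math

def compare(fusion_prec, ref_prec):
    """Comparison method: larger error than the reference -> unqualified."""
    return "qualified" if fusion_prec <= ref_prec else "unqualified"

def percentage_score(fusion_prec, ref_prec):
    """Percentage scoring: deviation from the reference mapped to a score."""
    dev = (fusion_prec - ref_prec) / ref_prec
    if dev == 0:
        return 90
    if 0 < dev <= 0.10:
        return 80
    if -0.10 <= dev < 0:
        return 95
    return None   # the patent only specifies these regions

def satisfaction(fusion_prec, ref_prec):
    """f = exp(-0.1054 * sigma_l / sigma_w); f = 0.9 at equal precisions."""
    return math.exp(-0.1054 * fusion_prec / ref_prec)
```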
CN202011558976.2A 2020-12-25 2020-12-25 Air condition fusion precision evaluation system and method Pending CN112633703A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011558976.2A CN112633703A (en) 2020-12-25 2020-12-25 Air condition fusion precision evaluation system and method


Publications (1)

Publication Number Publication Date
CN112633703A true CN112633703A (en) 2021-04-09

Family

ID=75324928


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010281782A (en) * 2009-06-08 2010-12-16 Mitsubishi Electric Corp Target tracking apparatus
CN105354356A (en) * 2015-09-29 2016-02-24 中国人民解放军63892部队 Radar intelligence simulation based air intelligence fusion performance evaluation system and method
CN106570311A (en) * 2016-10-12 2017-04-19 武汉数字工程研究所(中国船舶重工集团公司第七0九研究所) Flight path continuity assessment method and system under complex conditions


Non-Patent Citations (2)

Title
WU, XIAOCHAO et al.: "An Air Situation Fusion Accuracy Evaluation Algorithm Based on Spatio-Temporal Matching Weights", Aero Weaponry (《航空兵器》) *
WU, XIAOCHAO et al.: "Research on Performance Evaluation Methods for Air Defense Air Situation Fusion", Journal of System Simulation (《系统仿真学报》) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination