CN115374652A - Evidence reasoning-based unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method - Google Patents
- Publication number: CN115374652A (application CN202211290927.4A)
- Authority
- CN
- China
- Prior art keywords
- index
- data
- unmanned aerial vehicle
- layer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F30/20 — Design optimisation, verification or simulation
- G01M99/005 — Testing of complete machines, e.g. washing-machines or mobile phones
- G01M99/008 — Testing by doing functionality tests
- G06N5/04 — Inference or reasoning models
- G08G5/0069 — Navigation or guidance aids specially adapted for an unmanned aircraft
- G08G5/0073 — Surveillance aids
- H04L43/0882 — Utilisation of link capacity
- H04L43/0888 — Throughput
- H04L43/0894 — Packet rate
- H04W24/08 — Testing, supervising or monitoring using real traffic
Abstract
The invention provides an evidence-reasoning-based method for testing and evaluating the cooperative obstacle avoidance capability of an unmanned aerial vehicle (UAV) cluster, comprising the following steps: constructing an index system for UAV cluster cooperative obstacle avoidance, the index system comprising a plurality of bottom-layer evaluation indexes and middle-layer evaluation indexes; acquiring data for each bottom-layer evaluation index in the index system; setting reference values for each bottom-layer evaluation index and converting the data of each bottom-layer evaluation index into a belief structure according to the reference values; constructing basic probability masses and obtaining the joint belief degree of the bottom-layer evaluation indexes from the basic probability masses and the converted belief structure of each bottom-layer evaluation index; and taking the obtained joint belief degrees of the bottom-layer evaluation indexes as the input of the middle-layer evaluation indexes of the layer above to continue the index aggregation, finally obtaining the top-layer evaluation index, i.e. the belief degree of the UAV cluster cooperative obstacle avoidance capability, which serves as the final test evaluation result.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle cluster obstacle avoidance, in particular to an unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning.
Background
At present, research on cooperative obstacle avoidance behaviour strategies for UAV clusters mainly covers the Leader-Follower method, the Behavior-Based method, the Virtual Structure method and the Artificial Potential Field method, but there is little research on evaluating how well a specific strategy actually avoids obstacles. Existing research on test evaluation offers no solution for the cooperative obstacle avoidance behaviour of UAV clusters.
Therefore, the industry urgently needs a method for testing and evaluating the cooperative obstacle avoidance capability of UAV clusters.
Disclosure of Invention
In order to solve the problem of how to test and evaluate the cooperative obstacle avoidance capability of a UAV cluster, the invention provides an evidence-reasoning-based test evaluation method for the cooperative obstacle avoidance capability of UAV clusters that also handles the uncertainty problem. The uncertainty problem specifically means that, for part of the data, no accurate value can be obtained during the acquisition of the test evaluation data, which introduces uncertainty into the test evaluation.
In order to achieve this purpose, the invention provides a complete evidence-reasoning-based test evaluation flow for UAV cluster cooperative obstacle avoidance, comprising a complete and effective index system, and designs an evidence reasoning method for cooperative obstacle avoidance, so that the test evaluation problem of UAV cooperative obstacle avoidance can be solved well.
The technical scheme of the invention is as follows: an unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning comprises the following steps:
constructing an index system of unmanned aerial vehicle cluster cooperative obstacle avoidance; the index system comprises a plurality of bottom layer evaluation indexes and middle evaluation indexes;
acquiring data of each bottom layer evaluation index in the index system;
setting reference values for each bottom-layer evaluation index, and converting the data of each bottom-layer evaluation index into the belief structure required by the evidence reasoning algorithm according to the reference values;
constructing basic probability masses, and obtaining the joint belief degree of the bottom-layer evaluation indexes from the basic probability masses and the converted belief structure of each bottom-layer evaluation index;
and taking the obtained joint belief degrees of the bottom-layer evaluation indexes as the input of the middle-layer evaluation indexes of the layer above to continue the index aggregation, finally obtaining the top-layer evaluation index, i.e. the belief degree of the UAV cluster cooperative obstacle avoidance capability, which serves as the final test evaluation result.
Further, the index system comprises individual capability, interaction capability, cooperation capability and system capability;
the bottom-layer evaluation indexes of individual capability comprise maximum endurance time, image processing time, track error, obstacle visibility, image resolution and shooting frequency;
the upper-layer middle evaluation index of image resolution and shooting frequency is detector reliability;
the upper-layer middle evaluation index of image processing time and obstacle visibility is image processing;
the upper-layer middle evaluation index of detector reliability and image processing is environmental information acquisition capability;
the upper-layer middle evaluation index of maximum endurance time and track error is track keeping capability;
the upper-layer evaluation index of environmental information acquisition capability and track keeping capability is individual capability;
the bottom-layer evaluation indexes of interaction capability comprise data transmission rate, frequency band utilisation rate and data throughput; their upper-layer middle evaluation index is communication equipment capability;
the bottom-layer evaluation indexes of cooperation capability comprise end position error, end attitude error, formation planning time and formation transformation time; their upper-layer middle evaluation index is obstacle avoidance path transformation capability;
the bottom-layer evaluation indexes of system capability comprise obstacle avoidance time and mission planning time.
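The four-branch hierarchy described above lends itself to a nested data structure. A minimal Python sketch (the English indicator names are illustrative renderings of the patent's indexes, and the `leaf_indicators` helper is ours, not part of the patent):

```python
# Hypothetical encoding of the patent's indicator hierarchy: middle-layer
# indexes are dict keys, bottom-layer indexes are list entries.
INDEX_SYSTEM = {
    "individual capability": {
        "environmental information acquisition": {
            "detector reliability": ["image resolution", "shooting frequency"],
            "image processing": ["image processing time", "obstacle visibility"],
        },
        "track keeping": ["maximum endurance time", "track error"],
    },
    "interaction capability": {
        "communication equipment capability": [
            "data transmission rate", "frequency band utilisation rate",
            "data throughput",
        ],
    },
    "cooperation capability": {
        "obstacle avoidance path transformation": [
            "end position error", "end attitude error",
            "formation planning time", "formation transformation time",
        ],
    },
    "system capability": ["obstacle avoidance time", "mission planning time"],
}

def leaf_indicators(node):
    """Recursively collect the bottom-layer indicator names."""
    if isinstance(node, list):
        return list(node)
    leaves = []
    for child in node.values():
        leaves.extend(leaf_indicators(child))
    return leaves
```

Aggregation then proceeds bottom-up: the leaves feed the ER combination, whose output becomes the input of the parent node.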
Further, acquiring the data of each bottom-layer evaluation index in the index system comprises:
if a bottom-layer evaluation index is obtained by calculation from data, the reliability of the index is set to 1;
if a bottom-layer evaluation index cannot be obtained by calculation and instead comes from statistical or assumed data, it carries uncertainty, i.e. its index reliability is not 1;
during processing, all index data need to be converted into belief structures for further calculation.
Furthermore, the belief structure after data conversion consists of the belief degrees of the index data on the five grades excellent, good, medium, pass and fail.
Further, setting reference values for each bottom-layer evaluation index and converting the data of each bottom-layer evaluation index into the belief structure required by the evidence reasoning algorithm according to the reference values comprises:
suppose the input data are represented as $(x_i, \varepsilon_i)$, where $x_i$ denotes the input value of the $i$-th bottom-layer evaluation index and $\varepsilon_i$ denotes the reliability corresponding to the input data of the $i$-th bottom-layer evaluation index, reflecting the uncertainty of the input value;
the input data are converted using the following formula:
$$S(x_i) = \{(h_{i,j}, \alpha_{i,j}),\ j = 1, \dots, J_i\},$$
where $\alpha_{i,j}$, with $\sum_{j=1}^{J_i}\alpha_{i,j} = 1$, represents the matching degree of the input datum $x_i$ to $h_{i,j}$, and $h_{i,j}$ denotes the $j$-th reference value of the antecedent index in the belief rule;
a matching function for calculating the matching degree is selected according to the nature of the evaluation index; if the reference values are discrete, the following formulas are adopted
for the reference values $h_{i,1} < h_{i,2} < \dots < h_{i,J_i}$ of index $i$ (for convenience of description, assumed monotonic):
$$\alpha_{i,j} = \frac{h_{i,j+1} - x_i}{h_{i,j+1} - h_{i,j}}, \qquad \alpha_{i,j+1} = 1 - \alpha_{i,j}, \qquad \alpha_{i,k} = 0 \ \ (k \ne j,\, j+1).$$
The formulas mean that when the input index datum lies between two reference values, i.e. satisfies $h_{i,j} \le x_i \le h_{i,j+1}$, its matching degree with respect to $h_{i,j}$ is calculated with the first equation and its matching degree with respect to $h_{i,j+1}$ with the second; in practice, $\alpha_{i,j} + \alpha_{i,j+1} = 1$.
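The piecewise-linear matching rule above can be sketched in Python. This is a hedged illustration, not the patent's implementation: the function name is ours, the reference values are assumed sorted in increasing order, and out-of-range inputs are clamped to the end grades:

```python
def to_belief_structure(x, ref_values, reliability=1.0):
    """Convert one bottom-layer indicator value x into a belief structure
    over the reference grades, using the piecewise-linear matching rule:
    when ref_values[j] <= x <= ref_values[j+1], the matching degree to the
    lower reference is (hi - x)/(hi - lo) and the remainder goes to the
    upper reference. Beliefs are scaled by the data reliability, so the
    returned degrees sum to `reliability` (the shortfall stays unassigned).
    Assumes ref_values is sorted in increasing order of the indicator.
    """
    J = len(ref_values)
    beta = [0.0] * J
    if x <= ref_values[0]:
        beta[0] = 1.0              # clamp below the lowest reference value
    elif x >= ref_values[-1]:
        beta[-1] = 1.0             # clamp above the highest reference value
    else:
        for j in range(J - 1):
            lo, hi = ref_values[j], ref_values[j + 1]
            if lo <= x <= hi:
                beta[j] = (hi - x) / (hi - lo)
                beta[j + 1] = 1.0 - beta[j]
                break
    return [reliability * b for b in beta]
```

For example, a value lying three fifths of the way between the "medium" and "good" reference values splits its belief 0.6/0.4 between those two grades.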
further, constructing the basic probability masses and obtaining the joint belief degree of the bottom-layer evaluation indexes from the basic probability masses and the converted belief structure of each bottom-layer evaluation index comprises the following steps:
the basic probability masses are different types of trust degrees:
$$m_{i,j} = w_i\,\beta_{i,j}, \qquad \bar{m}_i = 1 - w_i, \qquad \tilde{m}_i = w_i\Bigl(1 - \sum_{j=1}^{N}\beta_{i,j}\Bigr),$$
where $m_{i,j}$ is the basic probability mass indicating the probability mass assigned to the $j$-th element of the result set; $\bar{m}_i$ represents the uncertainty associated with the activation weight; $\tilde{m}_i$ represents the uncertainty arising from incomplete input information; $\beta_{i,j}$ is the belief structure; $w_i$ is the weight of index $i$; $i$ is the label of the input index and $j$ is the label of the reference value.
The calculated basic probability masses are stored in a matrix, which facilitates the calculation during evidence combination: each row of the matrix holds the results for one input index $i$, i.e. there are $L$ rows in total, and each column corresponds to the label $j$ of one activated rule, with $\Theta$ representing the entire output result set.
Further, the joint belief degree of the bottom-layer evaluation indexes is obtained by combining the basic probability masses:
$$m_j = k\Bigl[\prod_{i=1}^{L}\bigl(m_{i,j} + \bar{m}_i + \tilde{m}_i\bigr) - \prod_{i=1}^{L}\bigl(\bar{m}_i + \tilde{m}_i\bigr)\Bigr], \qquad \tilde{m} = k\Bigl[\prod_{i=1}^{L}\bigl(\bar{m}_i + \tilde{m}_i\bigr) - \prod_{i=1}^{L}\bar{m}_i\Bigr], \qquad \bar{m} = k\prod_{i=1}^{L}\bar{m}_i,$$
$$k = \Bigl[\sum_{j=1}^{N}\prod_{i=1}^{L}\bigl(m_{i,j} + \bar{m}_i + \tilde{m}_i\bigr) - (N-1)\prod_{i=1}^{L}\bigl(\bar{m}_i + \tilde{m}_i\bigr)\Bigr]^{-1}, \qquad \beta_j = \frac{m_j}{1 - \bar{m}}, \qquad \beta_{\Theta} = \frac{\tilde{m}}{1 - \bar{m}},$$
where $L$ indicates the number of input index data; $j$ labels all lower-layer grades corresponding to the input indexes; $m_{i,j}$ indicates the probability mass assigned to the result set; $\bar{m}_i$ represents the uncertainty associated with the activation weight; $\tilde{m}_i$ represents the uncertainty arising from incomplete input information; $\Theta$ represents the entire output result set; $m_j$, $\tilde{m}$, $\bar{m}$ and $k$ are intermediate process quantities; and $\beta_j$ is the final joint belief degree, i.e. the belief value of the combat effectiveness.
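The combination step can be sketched with the analytical form of the evidential reasoning algorithm, whose $m$, $\bar m$, $\tilde m$ notation the text follows. A minimal Python sketch under that assumption (function and variable names are ours):

```python
def er_combine(betas, weights):
    """Analytical evidential-reasoning combination.

    betas:   one belief structure per bottom-layer indicator, each a list of
             N belief degrees (sum <= 1; any shortfall is treated as
             incomplete input information).
    weights: normalised indicator weights (sum to 1).
    Returns (joint belief degrees over the N grades, residual belief on the
    whole grade set, i.e. the global uncertainty).
    """
    L, N = len(betas), len(betas[0])
    # basic probability masses for each indicator
    m = [[w * b for b in beta] for beta, w in zip(betas, weights)]
    m_bar = [1.0 - w for w in weights]                    # unassigned: weight
    m_tilde = [w * (1.0 - sum(beta))                      # unassigned: incompleteness
               for beta, w in zip(betas, weights)]

    prod_all = [1.0] * N
    for i in range(L):
        for j in range(N):
            prod_all[j] *= m[i][j] + m_bar[i] + m_tilde[i]
    prod_rest, prod_bar = 1.0, 1.0
    for i in range(L):
        prod_rest *= m_bar[i] + m_tilde[i]
        prod_bar *= m_bar[i]

    # normalisation constant k, then combined masses and final beliefs
    k = 1.0 / (sum(prod_all) - (N - 1) * prod_rest)
    m_joint = [k * (prod_all[j] - prod_rest) for j in range(N)]
    m_tilde_joint = k * (prod_rest - prod_bar)
    m_bar_joint = k * prod_bar

    beta_joint = [mj / (1.0 - m_bar_joint) for mj in m_joint]
    beta_theta = m_tilde_joint / (1.0 - m_bar_joint)
    return beta_joint, beta_theta
```

The joint beliefs of one layer then serve as the input belief structure of the middle-layer index above it, exactly as the aggregation step describes.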
The invention has the following beneficial effects:
1. By means of the belief structure concept, the evidence reasoning method can handle "uncertainty problems" in the test data. For example, image resolution is bottom-layer index data that cannot be provided directly; its belief structure and belief values are given subjectively by manually observing how blurred the picture is.
2. Current evidence reasoning methods rely on a belief rule base; the present method dispenses with the belief rule base and obtains the belief structure directly by calculation from the index data, which is better suited to processing test data directly. Through this step, the method retains both the data information and the degree of uncertainty of the original data, more of the original information can be used in the subsequent test evaluation, and the resulting evaluation is more accurate. Furthermore, uncertainty information is handled better using a belief structure.
3. According to the theory of evidence reasoning, when evidence reasoning is applied to the UAV cluster, the evaluation grades and reference values of each evaluation index are determined first, so expert experience or historical data can be incorporated. Second, when the cooperative obstacle avoidance of the UAV cluster is tested and evaluated, the method calculates quantitatively and can convert qualitative data into quantitative data, making the final evaluation result more scientific. At the same time, the final evaluation result is generally expressed in an uncertain form and retains the original data information to the greatest extent, which facilitates deeper analysis and decision making. Considering the strong uncertainty a UAV cluster faces while executing its tasks, an evaluation method based on evidence reasoning rules can process such uncertain data through the belief structure concept, and is effective both for expressing uncertainty and for studying the cluster's adaptability to its environment.
In addition to the above-described objects, features and advantages, the present invention has other objects, features and advantages. The present invention will be described in further detail below with reference to the drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. In the drawings:
fig. 1 is a diagram of the process by which the present invention maps the actual cooperative obstacle avoidance behaviour of the UAV cluster onto the cooperative obstacle avoidance index system.
Fig. 2 is a schematic diagram of an indicator system for collaborative obstacle avoidance of an unmanned aerial vehicle cluster, which is constructed by the present invention, and shows a complete indicator system design for collaborative obstacle avoidance of the unmanned aerial vehicle cluster;
FIG. 3 is a plot of the distance between drones in an embodiment of the invention;
fig. 4 is a plot of the distance from the drone to the door frame in an embodiment of the invention.
Detailed Description
Embodiments of the invention will be described in detail below with reference to the drawings, but the invention can be implemented in many different ways, which are defined and covered by the claims.
The invention provides an unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning, which comprises the following steps:
101, constructing an index system of unmanned aerial vehicle cluster cooperative obstacle avoidance; the index system comprises a plurality of bottom layer evaluation indexes and middle evaluation indexes.
Specifically, as shown in fig. 1 and fig. 2, based on the research into cooperative obstacle avoidance behaviour and viewed from the architecture and the cooperative obstacle avoidance process, the behaviour of the UAV cluster is divided into individual behaviour, interactive behaviour, cooperative behaviour and system behaviour according to the cluster operation level and the number of participating UAVs, and the cluster indexes are correspondingly divided into individual capability, interaction capability, cooperation capability and system capability. The details are as follows:
Individual capability analysis:
At the bottom physical level of the UAV execution layer, the individual capability of each UAV in the cluster is considered first. Individual capability involves no interaction with anything outside the single UAV and generally expresses the UAV's accuracy in executing instructions, its self-control, obstacle detection and motion performance; it is determined mainly by the hardware parameters of the UAV itself. For the cooperative obstacle avoidance behaviour of the UAV cluster, individual capability is divided into environmental information acquisition capability and track keeping capability. Combining the cooperative obstacle avoidance process with the environment information acquisition, situation perception and sharing in the architecture shows that environmental information acquisition is a relatively important capability; combining the key technologies, track keeping belongs to the capability of the individual UAV and plays a definite role in cooperative obstacle avoidance. Environmental information acquisition capability is further divided into detector reliability and image processing.
Table 1 shows the underlying indicators selected by the present invention to measure the ability of an individual.
TABLE 1 Individual capability bottom-layer indexes
Bottom-layer index name | Introduction |
Maximum endurance time | The maximum flight time of a UAV after a single charge of energy, reflected mainly by battery capacity and fuel load. |
Image processing time | The time the cluster's leader UAV needs to process the collected information, depending mainly on the image processing algorithm. |
Track error | The deviation between a UAV's planned path and its actual path. |
Obstacle visibility | The farthest UAV-to-obstacle distance at which the obstacle can still be identified in the picture. |
Image resolution | The resolution of the target pictures taken by an individual UAV. |
Shooting frequency | The frequency at which an individual UAV takes pictures of the target. |
Interaction capability analysis:
The architecture and situation sharing are considered here. The invention evaluates the interaction capability, i.e. the capability of the UAVs in the cluster to exchange and feed back information among themselves. Smooth, complete and accurate interaction is an important basis for the cooperative obstacle avoidance of the UAV cluster; it is a group capability arising from many factors and capabilities, and is evaluated through the functions of the communication equipment.
Table 2 shows the underlying indicators selected by the present invention to measure interaction capability.
TABLE 2 Interaction capability bottom-layer indexes
Bottom-layer index name | Introduction |
Data transmission rate | The amount of data the UAV cluster command-and-control system can transmit over the wireless data link within a given time; the instantaneous maximum value can be reached. |
Frequency band utilisation rate | The information rate achievable per unit of frequency band, in bits per second per hertz (b/s·Hz); the effectiveness of a band-limited transmission system is usually measured by its band utilisation rate. |
Data throughput | The amount of data successfully transmitted per unit time between the communication devices carried by the UAVs (measured in bits, bytes, packets, etc.). |
Cooperation capability analysis:
Cooperation capability builds on interaction capability and is a capability at the subsystem level. The invention evaluates the cooperation capability, which mainly refers to the capability exhibited under the combined influence of the independent and interactive behaviours between individuals, between individuals and the cluster, and between groups. The invention embodies it through the path transformation and formation transformation capability, combining formation flying with autonomous flight and flight path planning.
Table 3 shows the underlying indicators selected by the present invention to measure synergy.
TABLE 3 Cooperation capability bottom-layer indexes
Bottom-layer index name | Introduction |
End position error | The mean error between each UAV's position after the cluster completes a formation transformation and the position planned in advance by the command-and-control node. |
End attitude error | The mean error between each UAV's attitude and the attitude planned in advance by the command-and-control node. |
Formation planning time | The time the command-and-control node in the cluster needs to complete the obstacle avoidance formation planning after detecting the position of an obstacle. |
Formation transformation time | The total time the whole cluster takes from beginning the transformation to completing it. |
System capability analysis:
System capability evaluates the process by which the UAV cluster completes its task. In the index system it denotes the comprehensive performance of the whole process from decision making to task completion, i.e. the capability of the commanded UAV cluster to complete the combat task, and is measured here mainly by elapsed time.
Table 4 shows the underlying indicators selected by the present invention to measure system performance.
TABLE 4 System capability bottom-layer indexes
Bottom-layer index name | Introduction |
Obstacle avoidance time | The time from the cluster completing the obstacle avoidance mission planning to successfully avoiding the obstacle, including the time for formation changes. |
Mission planning time | The total time from detecting the obstacle to completing the planning and issuing the instructions. |
102, acquiring data of each bottom-layer evaluation index in the index system.
An evidence reasoning test evaluation method aiming at unmanned aerial vehicle cluster cooperative obstacle avoidance is realized on the basis of the existing evidence reasoning theory. Firstly, a proper method is selected to assign values to each bottom-layer evaluation index of each feasible scheme.
The data of the test evaluation index of the invention is obtained by analyzing the original simulation test data, and the simulation test is completed by the 20 th group of electronic science and technology of China. The evaluation index of the model is the index system.
Because part of the data of the bottom-layer evaluation indexes are deterministic data, and part of the data of the bottom-layer evaluation indexes are non-deterministic data. This is illustrated here by way of example. For deterministic data, for example, graphics processing power is manifested by barrier visibility, image processing time. So here the two underlying indicators of obstacle visibility, image processing time are the direct inputs to the ER algorithm. The original data needed in the process of calculating the visibility of the obstacle is the distance between the long-range unmanned aerial vehicle and the obstacle and whether the target recognition algorithm of the long-range unmanned aerial vehicle detects the obstacle. The image processing time is determined by the algorithm and the data volume of the long machine, and the image processing time is artificially set to be a fixed value in each simulation test of the invention. Two underlying indexes of barrier visibility and image processing time are calculated through data or are assumed to be constant, so the invention sets the reliability of the indexes to be 1, namely the reliability in the formula。
For uncertain data: some indexes cannot be obtained by calculation and belong to statistical or assumed data, so they carry uncertainty, i.e., the reliability is not 1. Such data must be converted into a confidence structure for calculation during processing. For example, image resolution is bottom-layer index data; if the index data cannot be provided directly, the confidence structure and confidence values must be given subjectively by manually observing the degree of blur of the picture.
103, setting a reference value of each bottom-layer evaluation index, and converting the data of each bottom-layer evaluation index into a reliability structure required by an evidence reasoning algorithm (ER algorithm for short) according to the reference value.
The conversion of the input data refers to converting the data of the underlying indexes into the reliability structure required by the ER algorithm. Converting the input data into a confidence structure is the key to dealing with "uncertainty".
The confidence structure after data conversion gives the confidence of the index data at each of the five levels excellent, good, medium, passing, and failing. For example, assuming the confidence structure after image resolution conversion is (0.4, 0.5, 0.0), each number represents the closeness of the input data to the corresponding reference value.
The reference values of the indexes must be determined in the process of converting the data into the confidence structure, i.e., the set of reference values {h_{n,i}} in the formula. There are various methods for determining reference values. In the invention, the reference values are given subjectively by combining the data set with the existing literature, a method that draws on existing experience together with expert experience.
The invention sets the reference values by combining each group of indexes with the simulation results and practical application, in order to evaluate the quality of each index input. To exploit the advantages of the evidence-theory algorithm, the method uses a 5-grade evaluation (excellent, good, medium, passing, failing), sets 5 reference values for each deterministic index, and converts uncertainty indexes directly into confidence distributions for index aggregation.
The input data is represented as (x_i, r_i), where x_i is the input value of the i-th bottom-layer evaluation index and r_i is the reliability corresponding to the input data of the i-th bottom-layer evaluation index, reflecting the uncertainty of the input value.
The input data is converted using the following formula:

S(x_i) = {(h_{n,i}, α_{n,i}), n = 1, …, N},

wherein α_{n,i} represents the degree to which the input data x_i matches the n-th reference value h_{n,i}, and h_{n,i} represents the n-th reference value of the i-th index in the confidence rule.

The matching function used to calculate the matching degree is selected according to the nature of the evaluation index. If the reference values are given as discrete values, the following formula is adopted.

For the reference values h_{1,i}, …, h_{N,i} of an index (assumed, for convenience of description, to form a monotonic sequence), the matching function is:

α_{n,i} = (h_{n+1,i} − x_i) / (h_{n+1,i} − h_{n,i}),  α_{n+1,i} = 1 − α_{n,i},  α_{k,i} = 0 for k ≠ n, n+1.

The formula means that when the input data x_i lies between two reference values, i.e., h_{n,i} ≤ x_i ≤ h_{n+1,i}, the matching degree with respect to h_{n,i} is calculated using the first equation and the matching degree with respect to h_{n+1,i} using the second. In practice, the matching degrees of each input sum to 1.
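As a concrete illustration, the piecewise-linear matching rule above can be sketched in Python as follows; the 5-grade reference values used here are hypothetical placeholders, not the reference values of Table 13.

```python
def belief_distribution(x, refs):
    """Convert an input value x into matching degrees over the reference
    values refs (assumed sorted ascending), using the piecewise-linear
    matching rule: mass is split between the two adjacent reference values."""
    n = len(refs)
    alpha = [0.0] * n
    if x <= refs[0]:           # clamp below the smallest reference value
        alpha[0] = 1.0
        return alpha
    if x >= refs[-1]:          # clamp above the largest reference value
        alpha[-1] = 1.0
        return alpha
    for k in range(n - 1):
        lo, hi = refs[k], refs[k + 1]
        if lo <= x <= hi:
            alpha[k] = (hi - x) / (hi - lo)   # matching degree w.r.t. lower reference
            alpha[k + 1] = 1.0 - alpha[k]     # matching degree w.r.t. upper reference
            return alpha
    return alpha

# Hypothetical 5-grade reference values (failing .. excellent) for a benefit index
refs = [0.0, 5.0, 10.0, 15.0, 20.0]
print(belief_distribution(7.0, refs))  # mass split 0.6 / 0.4 between grades 5.0 and 10.0
```

Each output vector sums to 1, so a deterministic input always yields a complete confidence structure.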
104, constructing a basic credibility number, and obtaining the joint confidence of the bottom-layer evaluation indexes according to the basic credibility number and the converted confidence structure of each bottom-layer evaluation index.
The basic credibility numbers are different types of trust degrees and enter the subsequent joint-confidence calculation. For the i-th index with weight w_i, the basic credibility numbers are m_{n,i} = w_i · β_{n,i}; the remaining probability mass m_{H,i} = 1 − w_i Σ_n β_{n,i} is assigned to the whole result set H rather than to any individual result, and is divided into two parts: m̄_{H,i} = 1 − w_i, representing the uncertainty associated with the activation weight, and m̃_{H,i} = w_i (1 − Σ_n β_{n,i}), representing the uncertainty that results from incomplete input information. Here i is the subscript of the index value and n is the subscript of the reference values.
The results of the calculations in this step are stored in a matrix to facilitate the calculations in the evidence combination: each row of the matrix represents one index i, i.e., there are L rows in total; each column corresponds to the label n of one activated rule; H represents the entire output result set.
The joint confidence is then obtained by the recursive ER combination:

m_n^{I(i+1)} = K_{I(i+1)} [m_n^{I(i)} m_{n,i+1} + m_n^{I(i)} m_{H,i+1} + m_H^{I(i)} m_{n,i+1}],
m̃_H^{I(i+1)} = K_{I(i+1)} [m̃_H^{I(i)} m̃_{H,i+1} + m̃_H^{I(i)} m̄_{H,i+1} + m̄_H^{I(i)} m̃_{H,i+1}],
m̄_H^{I(i+1)} = K_{I(i+1)} m̄_H^{I(i)} m̄_{H,i+1},
K_{I(i+1)} = [1 − Σ_{t=1}^{N} Σ_{j=1, j≠t}^{N} m_t^{I(i)} m_{j,i+1}]^{−1},
β_n = m_n^{I(L)} / (1 − m̄_H^{I(L)}),

wherein L indicates the number of input index data, i.e., the number of lower-layer indexes corresponding to the input index (for example, the lower-layer indexes of detector reliability comprise image resolution and shooting frequency, so L = 2); m_H indicates the probability mass assigned to the result set; m̄_H represents the uncertainty associated with the activation weight; m̃_H represents the uncertainty arising from incomplete input information; H represents the whole output result set; m_n^{I(i)}, m̄_H^{I(i)}, m̃_H^{I(i)} and K_{I(i)} are intermediate process quantities; and β_n is the final joint confidence, i.e., the confidence value of the combat effectiveness.
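The joint-confidence step can be sketched with the standard recursive evidential-reasoning combination; this is a generic implementation of the published ER recursion, not the invention's matlab code, and the example weights and beliefs below are illustrative.

```python
def er_combine(beliefs, weights):
    """Recursively combine L pieces of evidence with the ER rule.
    beliefs: L x N matrix, beliefs[i][n] = belief of index i in grade n.
    weights: L index weights (each in (0, 1)).
    Returns (beta, beta_H): combined belief per grade and the residual
    belief assigned to the whole grade set H."""
    N = len(beliefs[0])
    # basic probability masses of the first index
    m = [weights[0] * b for b in beliefs[0]]
    mbar = 1.0 - weights[0]                        # mass from the unused weight
    mtil = weights[0] * (1.0 - sum(beliefs[0]))    # mass from incomplete input
    for i in range(1, len(beliefs)):
        mi = [weights[i] * b for b in beliefs[i]]
        mbar_i = 1.0 - weights[i]
        mtil_i = weights[i] * (1.0 - sum(beliefs[i]))
        mH, mH_i = mbar + mtil, mbar_i + mtil_i
        # normalisation factor that redistributes conflicting mass
        conflict = sum(m[t] * mi[j] for t in range(N) for j in range(N) if j != t)
        K = 1.0 / (1.0 - conflict)
        m = [K * (m[n] * mi[n] + m[n] * mH_i + mH * mi[n]) for n in range(N)]
        mtil = K * (mtil * mtil_i + mtil * mbar_i + mbar * mtil_i)
        mbar = K * (mbar * mbar_i)
    beta = [mn / (1.0 - mbar) for mn in m]   # final joint confidence per grade
    beta_H = mtil / (1.0 - mbar)             # unassigned (residual) confidence
    return beta, beta_H

# Two fully agreeing sources concentrate all assigned belief on grade 0
beta, beta_H = er_combine([[1.0, 0, 0, 0, 0], [1.0, 0, 0, 0, 0]], [0.5, 0.5])
print(beta[0])  # → 1.0
```

With complete inputs (each belief row summing to 1), the combined β_n plus β_H always sum to 1, which is why the raw aggregation results later need normalization only when belief is left unassigned.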
105, taking the obtained joint confidence of the bottom-layer evaluation indexes as the input of the upper-layer middle evaluation indexes, index aggregation continues until the top-layer evaluation index, i.e., the confidence of the unmanned aerial vehicle cluster cooperative obstacle avoidance capability, is obtained as the final test evaluation result.
Specifically, the joint confidence β_n obtained in the previous step, i.e., the confidence values over the grades (excellent, good, medium, passing, failing), is used as the input of the next-layer index to continue the aggregation. For example: in the index system, the upper-layer index of image resolution and shooting frequency is detector reliability; the confidence of image resolution and shooting frequency obtained from the joint-confidence calculation is used as the matching degree with respect to the reference values of detector reliability, i.e., the matching degree α in the conversion formula, and is then combined with the weights of image resolution and shooting frequency relative to detector reliability through the same combination formula.
And finally, obtaining the confidence coefficient of the uppermost layer index of the unmanned aerial vehicle cluster collaborative obstacle avoidance capability as a final test evaluation conclusion.
The present invention is explained below with reference to specific examples.
First, each bottom-layer index parameter in the index system is processed, specifically as follows:
(1) Image resolution
This parameter reflects the picture quality of the scene information shot by the camera and is determined by the camera's performance. The index cannot be obtained in the simulation tests of the invention, so it is uncertainty data. In the subsequent index aggregation, the confidence structure is given by directly observing the degree of blur of the picture manually, and the image resolution is converted into the confidence structure (0.5, 0.0).
(2) Frequency of shooting
This parameter reflects the time interval at which the camera shoots the scene information of the field; it is determined by the camera's performance and is unified to 0.1 s in the tests and the subsequent index fusion.
(3) Time of picture processing
This index is determined by the algorithm and data volume of the leader UAV in the actual scene, and is artificially set to a fixed value in each simulation test.
(4) Visibility of obstacles
Obstacle visibility is the farthest distance at which the UAV can detect the door frame, i.e., the distance at which a photo first allows the door frame to be recognized. The following table records the distance from the leader UAV to the door frame and is part of the original data table. Specifically, the rows from 68.08 to 68.086 cover the period during which the detection value changes from 0 to 1, meaning that the UAV first finds the door frame at time 68.084; the position columns record the UAV position (recorded from the moment the door frame is first found). See Table 5 below:
TABLE 5 Distance of the leader UAV from the door frame
Time | Detection | Position-X | Position-Y | Position-Z | Distance from door frame |
68.08 | 0 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
68.081 | 0 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
68.084 | 1 | 15.8747 | -2.6100 | 11.3457 | 19.6861 |
68.086 | 1 | 15.8091 | -2.5686 | 11.2988 | 19.6007 |
As shown in the table, the leader detected the door frame at 68.084 s, and the distance from the door frame at that moment is taken as the obstacle visibility, which is 19.686 m in this test.
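A sketch of this readout in Python, using the Table 5 rows; it assumes the recorded positions are relative to the door frame, which is consistent with the table's distance column.

```python
import math

# (time, detection, x, y, z) rows from Table 5
rows = [
    (68.080, 0,  0.0000,  0.0000,  0.0000),
    (68.081, 0,  0.0000,  0.0000,  0.0000),
    (68.084, 1, 15.8747, -2.6100, 11.3457),
    (68.086, 1, 15.8091, -2.5686, 11.2988),
]

def obstacle_visibility(rows):
    """Return (time, distance) at the first moment the detection flag flips to 1.
    Assumes positions are expressed relative to the door frame."""
    for t, det, x, y, z in rows:
        if det == 1:
            return t, math.sqrt(x * x + y * y + z * z)
    return None

t, vis = obstacle_visibility(rows)
print(t, round(vis, 4))  # ≈ 68.084 and 19.6861, matching the table's distance column
```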
(5) Track error
The track error is obtained by calculating, at each time point, the error between the real position of each UAV and the planned position. Besides position, velocity errors, angle errors, and the like can also be calculated. The control information issued by the leader to each UAV can be seen in the original data and comprises: position references (position, velocity, acceleration, jerk) and angle references (yaw angle, yaw angular velocity, yaw angular acceleration). When calculating the track error, the deviation between the commanded position and the actual position is mainly considered. Table 6 is part of the original data table and shows the control-command serial number, the control mode, and, of main concern, the planned position of each control command. The flight-path error of the test is obtained by computing these data against the actual track data of the UAVs.
TABLE 6 Planning instructions of the leader
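Since Table 6 itself is not reproduced here, the track-error computation described above can only be sketched on made-up positions; pairing commanded and actual samples by index (i.e., by common time points) is an assumption.

```python
import math

def track_error(planned, actual):
    """Mean Euclidean deviation between planned and actual 3-D positions,
    matched index-by-index (assumed sampled at the same time points)."""
    assert len(planned) == len(actual)
    total = sum(math.dist(p, a) for p, a in zip(planned, actual))
    return total / len(planned)

# Illustrative (made-up) planned vs. actual positions for one UAV
planned = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (2.0, 0.0, 1.0)]
actual  = [(0.0, 0.1, 1.0), (1.0, 0.0, 1.2), (2.1, 0.0, 1.0)]
print(track_error(planned, actual))  # mean of deviations 0.1, 0.2, 0.1
```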
(6) Bandwidth utilization, data throughput, data transmission rate
These indexes measure the transmission capacity between the communication devices on the UAVs. They belong to the intrinsic performance of the communication equipment carried by the UAVs and are preset conditions in the simulation; they cannot be obtained from the simulation result data, so the preset values are used directly in the subsequent index calculation.
(7) Error of end position
The end position error is mainly the error between the position actually reached by a UAV after a leader command and the planned position. The index averages this error over the commands; the simulation data used are the same as those for calculating the flight-path error.
(8) End attitude error
This index mainly depends on the difference between the flight attitude (mainly the flight angle of the UAV) and the planned attitude when each UAV reaches the planned path position in the simulation test. The end attitude of each UAV is recorded in the end-attitude table, and the control commands are recorded in the control-command table. The average of each component in the two tables is computed, the differences are taken, and the differences are combined to obtain the final attitude error. Tables 7-10 show the attitude-error calculation process. "Unmanned aerial vehicle" is subsequently abbreviated uav. X, Y, Z and W in the following tables are quaternion components and reflect the attitude of the UAV in space.
TABLE 7 uav1 attitude error calculation
X | Y | Z | W | |
Planning value | -0.00027825 | 0.080744455 | 0.000764816 | 0.996734504 |
Actual value | 0.000131748 | 0.080429203 | 0.001091044 | 0.996759718 |
uav1 unmanned aerial vehicle attitude error: 0.000612.
TABLE 8 uav2 attitude error calculation
X | Y | Z | W | |
Planning value | 0.000683229 | 0.079880162 | 0.000995949 | 0.996803743 |
Actual value | -0.00031544 | -0.08063731 | -0.00077343 | -0.99674315 |
uav2 unmanned aerial vehicle attitude error: 1.99999981.
TABLE 9 uav3 attitude error calculation
X | Y | Z | W | |
Planning value | 0.004101784 | -0.10504985 | -0.00043318 | -0.99445843 |
Actual value | 0.007859793 | -0.07968863 | -0.00052909 | -0.99678867 |
uav3 unmanned aerial vehicle attitude error: 0.025743993.
TABLE 10 uav8 (leader) attitude error calculation
X | Y | Z | W | |
Planning value | 0.000504 | 0.080393 | 0.000642 | 0.996763 |
Actual value | 0.006486 | -0.08285 | -0.00064 | -0.99654 |
uav8 (leader) unmanned aerial vehicle attitude error: 1.999986271.
The final attitude error is obtained by summing these values and averaging.
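The per-UAV values in Tables 7-10 are reproduced by the Euclidean norm of the component-wise quaternion difference; this formula is inferred from the tabulated numbers rather than stated in the text. (Where the actual quaternion is sign-flipped, as for uav2 and uav8, the norm comes out near 2 even though q and −q describe the same physical rotation.)

```python
import math

def attitude_error(planned, actual):
    """Euclidean norm of the component-wise difference of two quaternions
    (X, Y, Z, W); reproduces the per-UAV errors of Tables 7-10."""
    return math.sqrt(sum((a - p) ** 2 for p, a in zip(planned, actual)))

# Table 7: uav1 planned and actual end-attitude quaternions
uav1_plan = (-0.00027825, 0.080744455, 0.000764816, 0.996734504)
uav1_act  = (0.000131748, 0.080429203, 0.001091044, 0.996759718)
print(round(attitude_error(uav1_plan, uav1_act), 6))  # ≈ 0.000612, as in Table 7

# Table 8: uav2, where the actual quaternion is sign-flipped
uav2_plan = (0.000683229, 0.079880162, 0.000995949, 0.996803743)
uav2_act  = (-0.00031544, -0.08063731, -0.00077343, -0.99674315)
print(attitude_error(uav2_plan, uav2_act))  # ≈ 1.99999981, as in Table 8
```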
(9) Formation planning time
The formation planning time is the time required for the leader to finish sending the planning instructions after the target is found. Timing starts at the moment the door frame is first detected (detection information table, line 607: 68.084 s) and ends when all planning instructions have been sent (control instruction table 2, line 2: 68.084 s is the time instruction sending starts; control instruction table 2, line 200919: 69.073 s is the time instruction sending ends). The difference between these times is the formation planning time; it belongs to statistics read from the tables.
(10) Formation change time
This index is measured by the time difference between the beginning and the end of the formation conversion. The end of the formation change is marked by the relative distances between the UAVs no longer changing significantly. In the following, 1-8 denotes the distance between the first UAV and the eighth UAV, 2-8 the distance between the second UAV and the eighth UAV, and likewise 3-8, 1-2, 1-3 and 2-3 each denote the distance between the corresponding pair of UAVs. Table 11 is a portion of the raw data table E2 and illustrates a segment in which no significant change in relative position between the UAVs occurs.
TABLE 11 Relative distances between UAVs
Time | 1-8 | 2-8 | 3-8 | 1-2 | 1-3 | 2-3 |
53.548 | 4.0000 | 14.3305 | 2.0000 | 13.1551 | 2.0000 | 13.6092 |
53.692 | 13.5456 | 2.0000 | 2.0000 | 14.2696 | 13.0899 | 4.0000 |
56.384 | 2.0000 | 2.0000 | 13.4819 | 4.0000 | 13.0248 | 14.2083 |
56.384 | 2.0000 | 2.0000 | 13.4285 | 4.0000 | 12.9704 | 14.1570 |
Fig. 3 is obtained by plotting the inter-UAV distances in matlab; it can be intuitively seen that the inter-UAV distance is substantially constant between data groups 320 and 380.
Meanwhile, according to the description of the simulation process, the UAVs slowly approach the door frame while adjusting the formation. The formation change time over the whole cooperative obstacle-avoidance process can therefore also be obtained by plotting the distance between the UAV formation and the door frame and reading off the inflection points where the distance fluctuates markedly. Fig. 4 shows the distance of the UAVs from the door frame: the segment from group 600 to group 1100 is nonlinear in the door-frame distance, and after group 1100 the formation is complete and the cluster accelerates straight toward the door frame; this segment can likewise be regarded as the formation change time.
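The "relative distances stop changing" criterion used above can be sketched as follows; the stability threshold, the window length and the synthetic distance series are assumptions for illustration only.

```python
def formation_settled_index(dist_series, eps=0.05, window=3):
    """Return the first sample index from which every pairwise distance
    changes by less than eps over the next `window` samples.
    dist_series: list of samples, each a list of pairwise distances
    (e.g. the 1-8, 2-8, 3-8, 1-2, 1-3, 2-3 columns of Table 11)."""
    n = len(dist_series)
    for i in range(n - window):
        stable = all(
            abs(dist_series[j + 1][k] - dist_series[j][k]) < eps
            for j in range(i, i + window)
            for k in range(len(dist_series[i]))
        )
        if stable:
            return i
    return None

# Synthetic series: two pairwise distances still moving, then settling near (2.0, 4.0)
series = [[5.0, 9.0], [3.5, 6.0], [2.4, 4.6], [2.01, 4.02],
          [2.0, 4.0], [2.0, 4.0], [2.0, 4.0], [2.0, 4.0]]
print(formation_settled_index(series))  # → 3
```

Subtracting the time stamp of the returned index from the moment the formation change begins would give the formation change time.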
(11) Time to avoid obstacle
This index is the time taken by the UAV formation to pass through the obstacle door frame, i.e., the period during which the formation no longer changes relative to the door frame: from about data group 1380 until the end of the test.
(12) Mission planning time
The mission planning time runs from the leader identifying the door frame to the control commands being sent to the wingmen; in the simulation tests it coincides with the formation planning time.
The following is the experimental evaluation procedure:
The test evaluation results are obtained by first analyzing the resulting data of the underlying indicators (as shown in Table 12) and then aggregating the index system with the ER method.
TABLE 12 Calculation results of the underlying index values
Index | Test 1 | Test 2 | Test 3 | Test 4 | Attribute |
Shooting frequency | 0.1s | 0.1s | 0.1s | 0.1s | Cost |
Image processing time | 0.15s | 0.05s | 0.13s | 0.16s | Cost |
Obstacle visibility | 19.686m | 22.54m | 19.3663m | 18.9656m | Benefit |
Maximum endurance time | 30min | 30min | 30min | 30min | Benefit |
Track error | 1.58 | 1.47 | 1.64 | 1.55 | Cost |
Frequency band utilization | 0.5 b/s/Hz | 0.6 b/s/Hz | 0.7 b/s/Hz | 0.8 b/s/Hz | Benefit |
Data throughput | 2M | 4M | 5M | 8M | Benefit |
Data transmission rate | 3072kbps | 2048kbps | 1024kbps | 2048kbps | Benefit |
End position error | 0.05 | 0.06 | 0.08 | 0.04 | Cost |
End attitude error | 0.09 | 0.12 | 0.06 | 0.07 | Cost |
Formation planning time | 17.964s | 17.5s | 13.4s | 7.6s | Cost |
Formation change time | 8.2s | 8s | 7.8s | 5.6s | Cost |
Obstacle avoidance time | 14.916s | 13.7s | 15s | 15.5s | Cost |
Mission planning time | 17.964s | 17.5s | 13.4s | 7.6s | Cost |
The data of Test 1 are obtained by real calculation on the original data of the simulation test; the other three groups are modified on the basis of the first group in order to compare the effect of different cooperative obstacle-avoidance modes, simulating changes of the obstacle-avoidance strategy. Test 1 serves as the baseline test of the invention. Test 2 modifies the picture processing time and the obstacle visibility on the basis of Test 1 to simulate replacing the image recognition algorithm. Test 3 modifies the formation planning time and the formation conversion time on the basis of Test 1 to simulate changing the formation planning algorithm. Test 4 modifies the obstacle avoidance time on the basis of Test 1 to simulate replacing the strategy for passing through the door frame.
Index aggregation is then performed by the ER algorithm. Before aggregation, the reference values of the underlying indicators are determined. In the invention the reference values are given subjectively in combination with the data and existing literature and are used as input to the algorithm, as shown in Table 13.
TABLE 13 reference values
The index weights are also inputs to the ER algorithm; in the invention they are given subjectively in combination with expert experience, as shown in Table 14.
TABLE 14 index weights
p1 | p2 | p3 | p4 | p5 | p6 | p7 | p8 | p9 | p10 | p11 | p12 | p13 |
0.4 | 0.6 | 0.4 | 0.6 | 0.35 | 0.65 | 0.3 | 0.3 | 0.4 | 0.2 | 0.2 | 0.3 | 0.3 |
p14 | p15 | p16 | p17 | p18 | p19 | p20 | p21 | p22 | p23 | p24 | p25 | |
0.4 | 0.6 | 0.4 | 0.6 | 0.3 | 0.7 | 1 | 1 | 0.2 | 0.1 | 0.4 | 0.3 |
Then, the bottom-layer data are input into the ER algorithm implemented in matlab to obtain the evaluation grade of each group of data for the UAV cooperative obstacle-avoidance capability. Because there is an unassigned trust-value part in the calculation process, the raw aggregation results do not sum to 1 and are therefore normalized afterwards.
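The normalization applied to the raw aggregation results can be sketched as below; it reproduces the first-group numbers of Table 15 up to rounding.

```python
GRADES = ["excellent", "good", "medium", "passing", "failing"]

def normalize_and_rate(aggregated):
    """Normalize raw aggregation results over the assigned mass and
    return (normalized list, winning grade)."""
    assigned = sum(aggregated)          # total mass actually assigned to grades
    normalized = [v / assigned for v in aggregated]
    winner = GRADES[max(range(len(GRADES)), key=lambda i: normalized[i])]
    return normalized, winner

# Raw aggregation results of the first simulation group (Table 15)
raw = [0.0806, 0.1102, 0.1940, 0.1264, 0.0000]
norm, grade = normalize_and_rate(raw)
print(grade)  # → medium (normalized values match Table 15 up to rounding)
```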
The evaluation results of the first group of test data (the baseline test data), i.e., the aggregated results of the first set of simulation data, are shown in Table 15:
TABLE 15 Aggregation results of the first group
Grade | Aggregation result | Normalized result |
Excellent | 0.0806 | 0.1576 |
Good | 0.1102 | 0.2156 |
Medium | 0.1940 | 0.3796 |
Passing | 0.1264 | 0.2472 |
Failing | 0.0000 | 0.0000 |
For the first set of simulation data, the proportion of "medium" is the highest, so the conclusion for the first set of simulation data is medium.
The second group of test data simulates updating the image recognition algorithm; this group improves the image processing and the obstacle visibility and is used to analyze the influence of improved situation awareness capability.
The results after aggregation of the second set of simulation data are shown in table 16:
TABLE 16 Aggregation results of the second group
Grade | Aggregation result | Normalized result |
Excellent | 0.1930 | 0.3743 |
Good | 0.1339 | 0.2597 |
Medium | 0.1148 | 0.2226 |
Passing | 0.0369 | 0.0717 |
Failing | 0.0369 | 0.0717 |
For the second set of simulation data the proportion evaluated as excellent is the highest, so the conclusion of the second set can be considered excellent. After the situation awareness capability is improved, the evaluation result improves, so situation awareness can be considered to influence cooperative obstacle avoidance.
Relative to the first group of simulations, the third group adjusts the formation-change strategy, reducing the formation planning and mission planning times.
The results after aggregation of the third set of simulation data are shown in table 17:
TABLE 17 Aggregation results of the third group
Grade | Aggregation result | Normalized result |
Excellent | 0.1212 | 0.2353 |
Good | 0.0969 | 0.1883 |
Medium | 0.0680 | 0.1321 |
Passing | 0.2289 | 0.4443 |
Failing | 0.0000 | 0.0000 |
For the third set of simulation data the proportion of "passing" is the highest, so the conclusion of the third set is passing. The evaluation result is lower than that of the first group, demonstrating the influence of the planning time on the cooperative obstacle-avoidance behavior.
Relative to the first group, the fourth group adjusts the strategy by which the cluster passes through the door frame, reducing the mission planning time and improving the task completion time.
The results after aggregation of the fourth set of simulation data are shown in table 18:
TABLE 18 Aggregation results of the fourth group
Grade | Aggregation result | Normalized result |
Excellent | 0.2567 | 0.4942 |
Good | 0.1483 | 0.2855 |
Medium | 0.0651 | 0.1254 |
Passing | 0.0330 | 0.0636 |
Failing | 0.0162 | 0.0312 |
The proportion of the fourth set of simulation data finally evaluated as excellent is the highest, so the conclusion of the fourth set is excellent. The evaluation result improves relative to the first group, demonstrating that system capability and coordination capability have positive effects on the overall cooperative obstacle-avoidance behavior.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (7)
1. An unmanned aerial vehicle cluster collaborative obstacle avoidance capability test evaluation method based on evidence reasoning is characterized by comprising the following steps:
constructing an index system of unmanned aerial vehicle cluster cooperative obstacle avoidance; the index system comprises a plurality of bottom layer evaluation indexes and middle evaluation indexes;
acquiring data of each bottom layer evaluation index in the index system;
setting a reference value of each bottom layer evaluation index, and converting the data of each bottom layer evaluation index into a reliability structure required by an evidence reasoning algorithm according to the reference value;
constructing a basic credibility, and obtaining the joint confidence of the bottom-layer evaluation indexes according to the basic credibility and the converted confidence structure of each bottom-layer evaluation index;
and taking the obtained joint confidence coefficient of the bottom-layer evaluation index as the input of the upper-layer middle evaluation index to continue to carry out index aggregation, and finally obtaining the top-layer evaluation index, namely the confidence coefficient of the unmanned aerial vehicle cluster cooperative obstacle avoidance capability, which is taken as the final test evaluation result.
2. The unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning according to claim 1, wherein the index system comprises individual capability, interaction capability, cooperative capability and system capability;
the bottom-layer evaluation indexes of the individual ability comprise maximum endurance time, image processing time, flight path errors, obstacle visibility, image resolution and shooting frequency;
the upper-layer middle evaluation index of the image resolution and the shooting frequency is the reliability of the detector;
the upper-layer middle evaluation index of the image processing time and the obstacle visibility is image processing;
the reliability of the detector and the upper-layer middle evaluation index of the image processing are environmental information acquisition capacity;
the upper-layer middle evaluation index of the maximum endurance time and the track error is the track retention capacity;
the upper-layer evaluation index of the environmental information acquisition capacity and the track retention capacity is the individual capacity;
the bottom-layer evaluation index of the interaction capacity comprises data transmission rate, frequency band utilization rate and data throughput; the upper-layer middle evaluation index of the data transmission rate, the frequency band utilization rate and the data throughput is the communication equipment capacity;
the bottom-layer evaluation indexes of the cooperative capability comprise tail end position error, tail end attitude error, formation planning time and formation transformation time consumption; the upper-layer middle evaluation index of the tail end position error, the tail end attitude error, the formation planning time and the formation transformation time is the obstacle avoidance path transformation capability;
the bottom-level evaluation indexes of the system capacity comprise obstacle avoidance time and task planning time.
3. The unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning according to claim 1, wherein the step of obtaining data of each bottom layer evaluation index in the index system comprises the following steps:
if part of the bottom-layer evaluation indexes are calculated from data, the reliability of those indexes is set to 1;
part of the bottom-layer evaluation indexes cannot be obtained through calculation, belong to statistical or assumed data, and have uncertainty, i.e., the index reliability is not 1;
all index data need to be converted into a reliability structure for further calculation during processing.
4. The unmanned aerial vehicle cluster collaborative obstacle avoidance capability test evaluation method based on evidence reasoning according to claim 1, wherein the confidence structure after data conversion is the confidence of the index data in each of the five levels of excellent, good, medium, passing and failing.
5. The unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning according to claim 1, wherein a reference value of each bottom layer evaluation index is set, and data of each bottom layer evaluation index is converted into a reliability structure required by an evidence reasoning algorithm according to the reference value, and the method comprises the following steps:
suppose the input data is represented as (x_i, r_i), where x_i denotes the input value of the i-th bottom-layer evaluation index and r_i denotes the reliability corresponding to the input data of the i-th bottom-layer evaluation index, reflecting the uncertainty of the input value;
the input data is converted using the following formula:
S(x_i) = {(h_{n,i}, α_{n,i}), n = 1, …, N},
wherein α_{n,i} represents the degree to which the input data x_i matches the n-th reference value h_{n,i}, and h_{n,i} represents the n-th reference value of the i-th index in the confidence rule;
a matching function for calculating the matching degree is selected according to the property of the evaluation index; if the reference values are discrete values, the following formula is adopted for the reference values h_{1,i}, …, h_{N,i} of an index (assumed, for convenience of description, to form a monotonic sequence):
α_{n,i} = (h_{n+1,i} − x_i)/(h_{n+1,i} − h_{n,i}), α_{n+1,i} = 1 − α_{n,i} when h_{n,i} ≤ x_i ≤ h_{n+1,i}, and α_{k,i} = 0 for k ≠ n, n+1.
6. the unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning according to claim 1, wherein a basic confidence number is constructed, and a joint confidence degree of bottom layer evaluation indexes is obtained according to the basic confidence number and a confidence degree structure after data conversion of each bottom layer evaluation index, and the method comprises the following steps:
the basic credibility numbers are different types of trust degrees, wherein, for the i-th index with weight w_i, m_{n,i} = w_i β_{n,i}; m_{H,i} = 1 − w_i Σ_n β_{n,i} is the basic confidence number indicating the probability mass assigned to the result set; m̄_{H,i} = 1 − w_i represents the uncertainty associated with the activation weight; m̃_{H,i} = w_i (1 − Σ_n β_{n,i}) represents the uncertainty arising from incomplete input information; β_{n,i} represents the reliability structure; w_i indicates the weight of the index; i is the label of the input index and n is the label of the reference values;
the calculation results of the constructed basic credibility numbers are stored in a matrix to facilitate the calculation in the evidence combination; for example, each row of the matrix represents one index i, i.e., there are L rows in total, and each column represents the label of one activated rule; H represents the entire output result set.
7. The unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method based on evidence reasoning according to claim 1, wherein a basic confidence number is constructed, and a joint confidence degree of bottom layer evaluation indexes is obtained according to the basic confidence number and a confidence degree structure after data conversion of each bottom layer evaluation index, and the method comprises the following steps:
m_n^{I(i+1)} = K_{I(i+1)} [m_n^{I(i)} m_{n,i+1} + m_n^{I(i)} m_{H,i+1} + m_H^{I(i)} m_{n,i+1}];
m̃_H^{I(i+1)} = K_{I(i+1)} [m̃_H^{I(i)} m̃_{H,i+1} + m̃_H^{I(i)} m̄_{H,i+1} + m̄_H^{I(i)} m̃_{H,i+1}];
m̄_H^{I(i+1)} = K_{I(i+1)} m̄_H^{I(i)} m̄_{H,i+1};
K_{I(i+1)} = [1 − Σ_{t=1}^{N} Σ_{j=1, j≠t}^{N} m_t^{I(i)} m_{j,i+1}]^{−1};
β_n = m_n^{I(L)} / (1 − m̄_H^{I(L)});
wherein L indicates the number of input index data; n labels all the lower-layer indexes corresponding to the input index; m_H indicates the probability mass assigned to the result set; m̄_H represents the uncertainty associated with the activation weight; m̃_H represents the uncertainty due to incomplete input information; H represents the whole output result set; m_n^{I(i)}, m̄_H^{I(i)}, m̃_H^{I(i)} and K_{I(i)} are intermediate process quantities; and β_n is the final joint confidence, i.e., the confidence value of the combat effectiveness.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211290927.4A CN115374652A (en) | 2022-10-21 | 2022-10-21 | Evidence reasoning-based unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115374652A true CN115374652A (en) | 2022-11-22 |
Family
ID=84072711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211290927.4A Pending CN115374652A (en) | 2022-10-21 | 2022-10-21 | Evidence reasoning-based unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115374652A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115576358A (en) * | 2022-12-07 | 2023-01-06 | 西北工业大学 | Unmanned aerial vehicle distributed control method based on machine vision |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106597840A (en) * | 2017-01-16 | 2017-04-26 | 杭州电子科技大学 | PID parameter setting method based on production rule reasoning |
CN112749882A (en) * | 2020-12-28 | 2021-05-04 | 广东电网有限责任公司佛山供电局 | Transformer state evaluation method based on cloud model and fuzzy evidence reasoning |
CN113239244A (en) * | 2021-07-12 | 2021-08-10 | 中国人民解放军国防科技大学 | Multi-uncertainty preference obtaining method and device based on credibility structure and electronic equipment |
2022-10-21: Application CN202211290927.4A filed in China (publication CN115374652A); status: active, Pending
Non-Patent Citations (2)
Title |
---|
JIANG Jiang: "Research on Evidential Network Modeling, Reasoning and Learning Methods", China Doctoral Dissertations Full-text Database, Basic Sciences * |
CHENG Ben et al.: "Evidence reasoning-based evaluation method for capability requirement satisfaction degree of weapon equipment system-of-systems", Systems Engineering - Theory & Practice * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112131786B (en) | Target detection and distribution method and device based on multi-agent reinforcement learning | |
CN107479368A (en) | Artificial intelligence-based method and system for training an unmanned aerial vehicle (UAV) control model | |
CN106873628A (en) | Cooperative path planning method for multiple unmanned aerial vehicles tracking multiple maneuvering targets | |
CN109960148B (en) | Autonomy evaluation method and system for intelligent unmanned system | |
CN110703802A (en) | Automatic bridge detection method and system based on multi-unmanned aerial vehicle cooperative operation | |
CN113791634A (en) | Multi-aircraft air combat decision method based on multi-agent reinforcement learning | |
CN110322104A (en) | Performance indicators in interactive computer simulation | |
CN111767789A (en) | Crowd evacuation method and system based on multi-carrier intelligent guidance | |
CN112712193A (en) | Multi-unmanned aerial vehicle local route planning method and device based on improved Q-Learning | |
CN115374652A (en) | Evidence reasoning-based unmanned aerial vehicle cluster cooperative obstacle avoidance capability test evaluation method | |
CN112198892A (en) | Multi-unmanned aerial vehicle intelligent cooperative penetration countermeasure method | |
CN115900433A (en) | Decision method of multi-agent unmanned countermeasure system based on SWOT analysis and behavior tree | |
CN116560409A (en) | Unmanned aerial vehicle cluster path planning simulation method based on MADDPG-R | |
CN115293022A (en) | Aviation soldier intelligent agent confrontation behavior modeling method based on OptiGAN and spatiotemporal attention | |
Ghosh et al. | A deep ensemble method for multi-agent reinforcement learning: A case study on air traffic control | |
CN117150757A (en) | Simulation deduction system based on digital twin | |
Yang et al. | Learning graph-enhanced commander-executor for multi-agent navigation | |
Roldán et al. | A proposal of multi-UAV mission coordination and control architecture | |
CN106250663A (en) | Architecture simulation method based on quantum electronics description | |
CN110322098A (en) | S.O.P. feedback during interactive computer simulation | |
Kang et al. | Application and analysis of computer network technology in electronic information engineering | |
Rincon et al. | Rapid deployment of mobile robots under temporal, performance, perception, and resource constraints | |
Sun et al. | Learning-based perception contracts and applications | |
Zhang et al. | Evaluation method of UAV cluster simulation models with different granularity based on sensitivity analysis | |
Hao et al. | Flight Trajectory Prediction Using an Enhanced CNN-LSTM Network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20221122 ||