CN113177583B - Aerial target clustering method - Google Patents
- Publication number: CN113177583B (application CN202110413139.9A)
- Authority: CN (China)
- Prior art keywords: clustering, class, attribute, SWBWP, value
- Legal status: Expired - Fee Related (the status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
Abstract
The invention discloses an aerial target clustering method comprising the following steps. S1: based on comprehensive weighting theory, combine the subjective and objective weights of each attribute to generate the attribute comprehensive weights that influence the target grouping result. S2: considering that different attributes influence clustering differently, introduce the comprehensive weights into the similarity calculation to optimize the similarity measure, construct the SWBWP index for determining the optimal cluster number of the target grouping, and build the model that determines the optimal cluster number c_opt. S3: coarsely search the value interval [P_min, P_max] of the bias parameter P by bisection, find the values P(i), i = 1, 2, ..., c_max - 1, corresponding to the cluster numbers in [2, c_max], calculate the SWBWP indexes for the different cluster numbers, and determine the optimal cluster number c_opt from them. S4: using the SWBWP value of the clustering result as the fitness function value, finely search the bias parameter subspace [P_n, P_x] with the ABC algorithm to determine the optimal bias parameter P_b. Aimed at the target grouping problem in which the number of target groups is unknown, the method achieves efficient and accurate target grouping in a highly adversarial aerial environment.
Description
Technical Field
The invention relates to the technical field of assistant decision-making, in particular to an aerial target clustering method.
Background
In recent years, as aerial confrontation has become increasingly complex and variable, cluster confrontation has become an important mode of future air combat. Targets in the same confrontation cluster share a common overall objective. To achieve it, the opposing commander divides the aerial targets in the cluster into several groups of different sizes, each group realizing one sub-objective of the overall objective; each group consists of several mutually cooperating formations, each of which executes a specific combat task; a formation consists of one or more aerial targets, and the targets must maintain certain altitude and distance separations in flight, so targets in the same formation have similar maneuvering states.
Grouping targets by the maneuvering state of the aerial targets, such as distance, azimuth angle, heading angle, speed, and altitude, yields the group division of the opposing targets. The opposing commander establishes cooperative relationships among targets when building formations, and the formations conceal the intended actions. Target grouping is the inverse of formation building: it recovers the cooperative relationships among targets and is the basis for identifying the action intent of the opposing formations.
The grouping problem is essentially a clustering problem with an unknown number of classes: a process of assigning targets with highly similar characteristic attributes to the same class, where one side cannot obtain accurate information on the number of the other side's target groups before clustering. Current research generally assumes that the number of opposing target groups is known. For example, Yuan Deping et al. proposed a multi-formation target clustering method: target clustering is realized with a chameleon algorithm under certain constraints according to the geometric situation of the opposing targets; the opponent's attack advantage function is then calculated from the group geometry, an attack matrix is derived using subjective and objective weights, and the target groups are divided accordingly. Liu Jijun et al. proposed an improved ISODATA algorithm that optimizes the selection of initial class centers based on the sample distribution, reducing the clustering computation and improving the clustering effect. Wu Wenlong et al. used k-dist descending-order diagrams to divide targets of different densities, realizing the grouping of targets with varying density and improving the applicability of the ISODATA algorithm. Zhang Xuliang et al. improved the k-means algorithm by introducing modularity and used it to cluster land-battlefield targets.
These works study aerial target grouping under confrontation conditions based on the clustering idea, converting the target grouping problem into a clustering problem for modeling and optimization, but they assume that the number of opposing target groups is fixed, which limits their applicability.
Disclosure of Invention
Aiming at the above problems, the invention provides an aerial target clustering method that, for the target grouping problem with an unknown number of target groups, achieves efficient and accurate target grouping in a highly adversarial aerial environment.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
An aerial target clustering method is characterized by comprising the following steps:
S1: based on comprehensive weighting theory, combine the subjective and objective weights of each attribute of the aerial target motion process to generate the attribute comprehensive weights that influence the target grouping result;
S2: considering that different attributes influence clustering differently, introduce the comprehensive weights into the similarity calculation to optimize the similarity measure, construct the SWBWP index for determining the optimal cluster number of the target grouping, and build the model that determines the optimal cluster number c_opt;
S3: coarsely search the value interval [P_min, P_max] of the bias parameter P by bisection, find the values P(i), i = 1, 2, ..., c_max - 1, corresponding to the cluster numbers in [2, c_max], calculate the SWBWP indexes for the different cluster numbers, and determine the optimal cluster number c_opt from them;
S4: using the SWBWP value of the clustering result as the fitness function value, finely search the bias parameter subspace [P_n, P_x] with the ABC algorithm to determine the optimal bias parameter P_b.
Further, the specific operation of step S1 comprises the following steps:
S101: let the subjective weight of the jth attribute of the aerial target motion process be T_j; then the subjective weight vector of the aerial target motion attributes is T = (T_1, T_2, ..., T_m), where the T_j sum to 1 and m is the number of target motion attributes; when no domain expert evaluates the attributes, T = (1/m, 1/m, ..., 1/m);
S102: regularize the attribute set B = (b_ij)_{n×m} to D = (d_ij)_{n×m}; the entropy of the jth attribute is E_j = -(1/ln n) Σ_{i=1}^{n} d'_ij ln d'_ij, where n is the total number of aerial targets, d'_ij = d_ij / Σ_{i=1}^{n} d_ij, and when d'_ij = 0 the term d'_ij ln d'_ij is taken as 0;
S103: let the objective weight of the jth attribute be U_j; then the objective weight of the jth attribute of the attribute set B = (b_ij)_{n×m} is U_j = (1 - E_j) / Σ_{k=1}^{m} (1 - E_k), where k indexes the attributes;
S104: let the comprehensive weight of the jth attribute be w_j; considering the subjective and objective weights of the attribute together and weighting them, the comprehensive weight of the jth attribute of the attribute set B = (b_ij)_{n×m} is w_j = (1 - α)T_j + αU_j, where 0 ≤ α ≤ 1 is a preference coefficient giving the share of the objective weight in the comprehensive weight.
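As an illustration, steps S101-S104 can be sketched in Python. This is a sketch, not the patent's exact procedure: the patent's formula images are not reproduced, so the column-sum normalization used for the regularized matrix D and the uniform fallback for T are assumptions consistent with the surviving text.

```python
import numpy as np

def composite_weights(B, T=None, alpha=0.5):
    """Attribute comprehensive weights per S101-S104 (a sketch; the
    regularisation to D is assumed to be column-sum normalisation)."""
    n, m = B.shape
    if T is None:                            # S101: no expert evaluation
        T = np.full(m, 1.0 / m)
    D = B / B.sum(axis=0)                    # d'_ij = d_ij / sum_i d_ij
    logD = np.where(D > 0, np.log(np.where(D > 0, D, 1.0)), 0.0)  # 0*ln 0 := 0
    E = -(D * logD).sum(axis=0) / np.log(n)  # S102: entropy of each attribute
    U = (1.0 - E) / (1.0 - E).sum()          # S103: objective (entropy) weights
    return (1.0 - alpha) * T + alpha * U     # S104: comprehensive weight

# an attribute that differs more across targets gets a higher objective weight
B = np.array([[1.0, 200.0], [2.0, 220.0], [9.0, 210.0]])
w = composite_weights(B)
```

The weights sum to 1 by construction, and the first (more dispersed) attribute receives the larger weight, matching the entropy-weight rationale stated in S103.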
Further, the specific operation of step S2 comprises the following steps:
S201: optimize the similarity measure; introduce the comprehensive weights generated in step S1 into the calculation of the similarity matrix S, so that s(i,j) = -(d_i - d_j)^T W (d_i - d_j), i ≠ j, where W is the diagonal matrix with the attribute weights w_j (j = 1, 2, ..., m) as diagonal elements;
S202: calculate the weighted minimum inter-class distance swbd(j,i); let K = {D} be the clustering space, where D = {d_1, d_2, ..., d_n}, and assume all samples are clustered into c classes; define swbd(j,i) of the ith sample of the jth class as the minimum of the average weighted distances from the sample to the samples of the other classes: swbd(j,i) = min_{1≤k≤c, k≠j} (1/n_k) Σ_{p=1}^{n_k} d_W(d_i(j), d_p(k)), where d_W(a, b) = (a - b)^T W (a - b), k and j index classes, d_i(j) is the ith sample of the jth class, d_p(k) is the pth sample of the kth class, n_k is the number of samples in the kth class, and W is the attribute weight diagonal matrix;
S203: calculate the weighted minimum intra-class distance swwd(j,i); with K and D as above and all samples clustered into c classes, define swwd(j,i) of the ith sample of the jth class as the average weighted distance from the sample to the other samples of the jth class: swwd(j,i) = (1/(n_j - 1)) Σ_{q=1, q≠i}^{n_j} d_W(d_i(j), d_q(j)), where d_q(j) is the qth sample of the jth class, q ≠ i, and n_j is the number of samples of the jth class;
S204: calculate the weighted clustering distance swbawd(j,i), defined as the sum of the weighted minimum inter-class and intra-class distances of the sample: swbawd(j,i) = swbd(j,i) + swwd(j,i);
S205: calculate the weighted clustering dispersion distance swbswd(j,i), defined as the difference between the weighted minimum inter-class and intra-class distances of the sample: swbswd(j,i) = swbd(j,i) - swwd(j,i);
S206: calculate the weighted between-class within-class partition index SWBWP(j,i), defined as the ratio of the weighted clustering dispersion distance to the weighted clustering distance: SWBWP(j,i) = swbswd(j,i)/swbawd(j,i) = (swbd(j,i) - swwd(j,i))/(swbd(j,i) + swwd(j,i)); the SWBWP index reflects the clustering quality of a single sample, and the larger its value, the better that sample is clustered; for the attribute set, the larger the average SWBWP value of all samples, the better the clustering effect on the attribute set;
S207: construct the optimal cluster number model c_opt = argmax_{2≤c≤c_max} avgSWBWP(c), where avgSWBWP(c) = (1/n) Σ_{j=1}^{c} Σ_{i=1}^{n_j} SWBWP(j,i).
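The SWBWP construction of S202-S207 can be sketched as follows. The patent's formula images are not reproduced, so the weighted distance is assumed to be the quadratic form (a - b)^T W (a - b), matching the similarity of S201; whether the patent additionally takes a square root is not shown.

```python
import numpy as np

def wdist(a, b, w):
    # weighted distance d_W(a, b) = (a - b)^T W (a - b) with W = diag(w)
    return float(np.dot((a - b) ** 2, w))

def swbwp_mean(X, labels, w):
    """Average SWBWP of a partition (S202-S207); larger means better clustering."""
    n = len(X)
    classes = np.unique(labels)
    vals = []
    for i in range(n):
        j = labels[i]
        # swbd: minimum average weighted distance to the samples of another class
        swbd = min(np.mean([wdist(X[i], X[p], w)
                            for p in range(n) if labels[p] == k])
                   for k in classes if k != j)
        # swwd: average weighted distance to the other samples of its own class
        own = [q for q in range(n) if labels[q] == j and q != i]
        swwd = np.mean([wdist(X[i], X[q], w) for q in own]) if own else 0.0
        vals.append((swbd - swwd) / (swbd + swwd))
    return float(np.mean(vals))

X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 0.0], [10.0, 1.0]])
labels = np.array([0, 0, 1, 1])
v = swbwp_mean(X, labels, np.ones(2))
```

For the well-separated toy partition above the average SWBWP is close to 1, while a mixed labeling drives it down, illustrating why S207 maximizes the average over candidate cluster numbers.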
Further, the specific operation of step S3 comprises the following steps:
S301: search the value interval [P_min, P_max] of the bias parameter P by bisection to obtain the values P(i), i = 1, 2, ..., c_max - 1, corresponding to the cluster numbers in [2, c_max], where c_max is the maximum candidate cluster number, P_min = min s(i,j) and P_max = max s(i,j), taken over i ≠ j, j = 1, 2, ..., n;
S302: calculate the SWBWP value of the clustering result corresponding to each P(i), i = 1, 2, ..., c_max - 1, and take the cluster number corresponding to the maximum value as the optimal cluster number c_opt;
S303: search the bias parameter subspace [P(c_opt - 1), P(c_opt)] by bisection to determine the lower bound P_n of the bias parameter subspace corresponding to the optimal cluster number c_opt; search the bias parameter subspace [P(c_opt), P(c_opt + 1)] by bisection to determine the corresponding upper bound P_x.
Further, in step S301 an improved AP algorithm is used for clustering: all samples are regarded as potential class representatives with the same probability of being selected, i.e. every s(i,i) equals the bias parameter P, with s(i,i) set to the median of the corresponding row of the similarity matrix S. To select suitable class representatives among the samples, the responsibility r and the availability a must be searched iteratively, where r(i,j) represents the degree to which sample d_j is suitable to serve as the class representative of d_i, and a(i,j) represents the degree to which d_i is suitable to take d_j as its class representative.
For each sample d_i, the sum of the responsibility r(i,j) and the availability a(i,j) over the other samples is calculated, and the sample d_j with the maximum sum is selected as the class representative of d_i. The updates of r(i,j) and a(i,j) follow the damped recursions r_{t+1}(i,j) = λ r_t(i,j) + (1 - λ)[s(i,j) - max_{j'≠j}(a_t(i,j') + s(i,j'))] and, for i ≠ j, a_{t+1}(i,j) = λ a_t(i,j) + (1 - λ) min{0, r_t(j,j) + Σ_{i'∉{i,j}} max(0, r_t(i',j))}, with a_{t+1}(j,j) = λ a_t(j,j) + (1 - λ) Σ_{i'≠j} max(0, r_t(i',j)), where t is the iteration number and λ is a damping factor, 0.5 < λ < 1.
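A minimal sketch of the damped responsibility/availability iteration is given below. It follows the standard affinity-propagation recursions that the update process above describes; the improved variant's per-row median initialization of s(i,i) is left to the caller, which here passes a single bias value P written onto the diagonal of S.

```python
import numpy as np

def ap_cluster(S, p, lam=0.8, iters=200):
    """Affinity-propagation sketch: damped r/a updates on similarity S with
    bias parameter p on the diagonal; returns each sample's exemplar index."""
    n = S.shape[0]
    S = S.copy()
    np.fill_diagonal(S, p)
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # responsibility: r(i,k) = s(i,k) - max_{k'!=k}(a(i,k') + s(i,k'))
        AS = A + S
        idx = np.argmax(AS, axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = lam * R + (1 - lam) * Rnew
        # availability: a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0.0)
        np.fill_diagonal(Rp, R.diagonal())
        Anew = Rp.sum(axis=0)[None, :] - Rp
        dA = Anew.diagonal().copy()
        Anew = np.minimum(Anew, 0.0)
        np.fill_diagonal(Anew, dA)
        A = lam * A + (1 - lam) * Anew
    return np.argmax(A + R, axis=1)   # exemplar with the largest a + r sum

X = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.3])       # two 1-D blobs
S = -(X[:, None] - X[None, :]) ** 2                 # negative squared distance
p = np.median(S[~np.eye(6, dtype=bool)])            # median-based bias
labels = ap_cluster(S, p)
```

With a median bias the two blobs each elect one exemplar, i.e. the number of classes emerges from P rather than being fixed in advance, which is what the bisection search over P exploits.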
Further, the specific operation of step S4 comprises the following steps:
S401: initialization; the initial honey sources are generated randomly in the feasible interval; the number of rows of the matrix is the number of honey sources N_fs and its columns hold the bias parameter values; the element x_ij in row i, column j is computed as x_ij = x_j^min + rand · (x_j^max - x_j^min), where x_j^max and x_j^min are the upper and lower bounds of the jth column variable and rand is a random number in the interval (0,1);
S402: employed-bee phase; after the initial honey source positions are generated, each employed bee searches for a better honey source near the position held in its memory, with the update formula x'_ij = x_ij + φ · (x_ij - x_{i'j}) when rand < RAND and x'_ij = x_ij otherwise, where x'_ij is the element of the newly generated honey source, φ is a random coefficient, x_{i'j} is the corresponding element of another randomly selected honey source, and RAND is an update threshold, generally 0.5;
S403: onlooker-bee phase; after all employed bees finish searching, they exchange honey-source position information with the onlooker bees, which perform selection with a with-replacement tournament operator: 2 honey sources are selected at random from the N_fs sources and their SWBWP values F_1 and F_2 are calculated; if F_1 is better than F_2, i.e. F_1 > F_2, the 1st honey source is selected for updating, otherwise the 2nd is;
S404: scout-bee phase; if the solution quality of a honey source has not improved after iter_limit iterations, the employed bee of that source becomes a scout, the original honey source is abandoned, and a new honey source position is generated with the initialization formula of S401;
S405: repeat steps S402-S404 until the maximum iteration number maxcycle is reached, thereby finely searching the bias parameter subspace [P_n, P_x] and determining the optimal bias parameter P_b.
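Steps S401-S405 can be sketched as follows for a scalar bias parameter. This is a sketch under assumptions: the per-dimension update threshold RAND degenerates for a single variable and is omitted, and `fitness` is a hypothetical helper that would run the AP clustering for a given P and return the SWBWP value of the result (a toy quadratic stands in for it below).

```python
import random

def abc_search(fitness, p_n, p_x, n_fs=20, maxcycle=80, iter_limit=8):
    """ABC fine search over [p_n, p_x] (S401-S405) for a scalar bias P."""
    def perturb(i):
        # move source i toward/away from a randomly chosen other source
        k = random.choice([j for j in range(n_fs) if j != i])
        cand = sources[i] + random.uniform(-1.0, 1.0) * (sources[i] - sources[k])
        cand = min(max(cand, p_n), p_x)            # clamp to the subspace
        f = fitness(cand)
        if f > fits[i]:                            # greedy acceptance
            sources[i], fits[i], stale[i] = cand, f, 0
        else:
            stale[i] += 1

    sources = [random.uniform(p_n, p_x) for _ in range(n_fs)]   # S401
    fits = [fitness(s) for s in sources]
    stale = [0] * n_fs
    best_p, best_f = sources[0], fits[0]
    for _ in range(maxcycle):
        for i in range(n_fs):                      # S402: employed bees
            perturb(i)
        a, b = random.sample(range(n_fs), 2)       # S403: onlooker bees update
        perturb(a if fits[a] > fits[b] else b)     # the tournament winner
        for i in range(n_fs):                      # S404: scout bees
            if stale[i] >= iter_limit:
                sources[i] = random.uniform(p_n, p_x)
                fits[i], stale[i] = fitness(sources[i]), 0
        for i in range(n_fs):                      # remember the best source
            if fits[i] > best_f:
                best_p, best_f = sources[i], fits[i]
    return best_p, best_f                          # S405: optimal bias P_b

random.seed(1)
best_p, best_f = abc_search(lambda p: -(p - 0.3) ** 2, -1.0, 1.0)
```

The global best is remembered across scout resets, so abandoning a stagnant source (S404) cannot lose the best bias parameter found so far.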
The beneficial effects of the invention are as follows:
Compared with the prior art, the method generates attribute comprehensive weights that influence the target grouping result, effectively combining subjective and objective information in the grouping of opposing targets under aerial confrontation, so that the attribute comprehensive weights are determined more scientifically and reasonably; the improved AP algorithm clusters in two phases, a coarse search and a fine search, balancing clustering quality against time cost; and the fine search of the P subspace by the ABC algorithm shrinks the solution space and improves search efficiency. In short, the method groups the opposing targets quickly and accurately, ensuring better situation awareness.
Drawings
FIG. 1 is a flow chart of an aerial target clustering method of the present invention;
FIG. 2 is a plot of cluster number against BWP/SWBWP index value for the attribute set Pid in the SWBWP index feasibility verification experiment;
FIG. 3 shows the relationship between the cluster number of the Data attribute set and the SWBWP index value in the coarse search stage of the clustering effect comparison experiment of the invention;
FIG. 4 shows the fitness evolution when the APBMABC (Affinity Propagation Based on the Bisection Method and Artificial Bee Colony algorithm, which combines the two optimization methods with the AP algorithm) algorithm searches for the optimal bias parameter in the clustering effect comparison experiment of the invention;
FIG. 5(a) shows the clustering result of the APBWMMP algorithm in the clustering effect comparison experiment of the invention;
FIG. 5(b) shows the clustering result of the adAP algorithm in the clustering effect comparison experiment of the invention;
FIG. 5(c) shows the clustering result of the APBMABC algorithm in the clustering effect comparison experiment of the invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the following further describes the technical solution of the present invention with reference to the drawings and the embodiments.
An aerial target clustering method, as shown in FIG. 1, comprises the following steps.
S1: based on comprehensive weighting theory, combine the subjective and objective weights of each attribute of the aerial target motion process to generate the attribute comprehensive weights that influence the target grouping result.
Specifically, S101: let the subjective weight of the jth attribute of the aerial target motion process be T_j; domain experts evaluate the clustering attributes according to the specific characteristics of the attributes in the attribute domain, combined with their own experience, and give the subjective weight vector T = (T_1, T_2, ..., T_m) of the aerial target motion attributes, where the T_j sum to 1 and m is the number of attributes; when no domain expert evaluates the attributes, T = (1/m, 1/m, ..., 1/m).
S102: regularize the attribute set B = (b_ij)_{n×m} to D = (d_ij)_{n×m}; the entropy of the jth attribute is E_j = -(1/ln n) Σ_{i=1}^{n} d'_ij ln d'_ij, where n is the total number of aerial targets, d'_ij = d_ij / Σ_{i=1}^{n} d_ij, and when d'_ij = 0 the term d'_ij ln d'_ij is taken as 0.
S103: let the objective weight of the jth attribute be U_j; then the objective weight of the jth attribute of the attribute set B = (b_ij)_{n×m} is U_j = (1 - E_j) / Σ_{k=1}^{m} (1 - E_k), where k indexes the attributes; when a given attribute differs more across the samples, its entropy is smaller and the weight obtained is higher.
S104: let the comprehensive weight of the jth attribute be w_j; considering the subjective and objective weights together and weighting them, the comprehensive weight of the jth attribute of the attribute set B = (b_ij)_{n×m} is w_j = (1 - α)T_j + αU_j, where 0 ≤ α ≤ 1 is a preference coefficient giving the share of the objective weight in the comprehensive weight.
Further, step S2: based on the difference of cluster influence caused by different attributes in the air target motion process, the comprehensive weight is introduced into the similarity calculation, the similarity measurement is optimized, an SWBWP index for determining the optimal cluster number of the target grouping is constructed, and the optimal cluster number c is determined opt The model of (2);
specifically, S201: optimizing similarity measurement; most clustering algorithms take the Euclidean distance between samples as the similarity measurement, the default weights of all attributes are the same, and the difference of the influence of different attributes on clustering is not considered, namely s (i, j) = - | d i -d j || 2 ,(i≠j);
In the invention, the difference of the influence of the attributes on clustering is considered, and the similarity measurement is optimized; introducing the comprehensive weight generated in the step S1 into the calculation of the similarity matrix S, and optimizing the similarity metric, so that S (i, j) = - (d) i -d j ) T W(d i -d j ) (i ≠ j), where W is an attribute weight W j (j =1,2,... M) is a diagonal matrix of diagonal elements;
s202: calculating weighted minimum inter-class distance swbd (j, i); let K = { D } be a clustering space, where D = { D = { D = 1 ,d 2 ,...,d n And assuming that all samples are clustered into class c, defining the weighted minimum inter-class distance swbd (j, i) of the ith sample of the jth class as the minimum value of the average weighted distances from the sample to the samples in other classes, thenWhere k and j are the class of the sample, d i (j) Is the ith sample of the jth class, d p (k) For the p sample of the k class, n k The number of samples contained in the kth class is W, and the W is an attribute weight diagonal matrix;
s203: calculating weighted minimum intra-class distance swwd (j, i); let K = { D } be a clustering space, where D = { D = { D = 1 ,d 2 ,...,d n And assuming that all samples are clustered into c classes, defining weighted minimum intra-class distance swwd (j, i) of ith sample of jth class as an average weighted distance value from the sample to other samples of jth classIn the formula (d) q (j) For the qth sample of class j, q ≠ i, n j The number of samples of the jth class;
s204: calculating a weighted clustering distance swbawd (j, i); let K = { D } be a clustering space, where D = { D = { D = 1 ,d 2 ,...,d n And assuming that all samples are clustered into c classes, defining weighted clustering distance swbawd (j, i) of the ith sample of the jth class as the sum of weighted minimum inter-class distance and intra-class distance of the samples
S205: calculating a weighted clustering dispersion distance swbswd (j, i); let K = { D } be the clustering space, where D = { D = 1 ,d 2 ,...,d n And assuming that all samples are clustered into c classes, defining weighted cluster dispersion distance swbswd (j, i) of ith sample of jth class as the difference between weighted minimum inter-class distance and intra-class distance of the samples, then
S206: calculating weighted inter-cluster intra-class division indexes SWBWP (j, i); let K = { D } be a clustering space, whichIn, D = { D = { (D) 1 ,d 2 ,...,d n Assuming that all samples are clustered into c classes, defining weighted inter-cluster intra-class partition index SWBWP (j, i) of ith sample of j class as the ratio of weighted cluster dispersion distance to cluster distance, thenThe SWBWP index can reflect the clustering condition of a single sample, and the larger the SWBWP index value is, the better the clustering effect of the sample is; for the attribute set, the larger the average SWBWP value of all samples is, the better the clustering effect of the attribute set is. (ii) a
S207: construction of the optimal clustering number c opt Model (2)Wherein the content of the first and second substances,
Further, step S3: coarsely search the value interval [P_min, P_max] of the bias parameter P by bisection, find the values P(i), i = 1, 2, ..., c_max - 1, corresponding to the cluster numbers in [2, c_max], calculate the SWBWP indexes for the different cluster numbers, and determine the optimal cluster number c_opt from them.
Specifically, S301: search the value interval [P_min, P_max] of the bias parameter P by bisection to obtain the values P(i), i = 1, 2, ..., c_max - 1, corresponding to the cluster numbers in [2, c_max], where c_max is the maximum candidate cluster number, P_min = min s(i,j) and P_max = max s(i,j), taken over i ≠ j, j = 1, 2, ..., n.
An improved AP algorithm is used for clustering: all samples are regarded as potential class representatives with the same probability of being selected, i.e. every s(i,i) equals the bias parameter P, with s(i,i) set to the median of the corresponding row of the similarity matrix S. To select suitable class representatives among the samples, the responsibility r and the availability a must be searched iteratively, where r(i,j) represents the degree to which sample d_j is suitable to serve as the class representative of d_i, and a(i,j) represents the degree to which d_i is suitable to take d_j as its class representative.
For each sample d_i, the sum of the responsibility r(i,j) and the availability a(i,j) over the other samples is calculated, and the sample d_j with the maximum sum is selected as the class representative of d_i. The updates of r(i,j) and a(i,j) follow the damped recursions r_{t+1}(i,j) = λ r_t(i,j) + (1 - λ)[s(i,j) - max_{j'≠j}(a_t(i,j') + s(i,j'))] and, for i ≠ j, a_{t+1}(i,j) = λ a_t(i,j) + (1 - λ) min{0, r_t(j,j) + Σ_{i'∉{i,j}} max(0, r_t(i',j))}, with a_{t+1}(j,j) = λ a_t(j,j) + (1 - λ) Σ_{i'≠j} max(0, r_t(i',j)), where t is the iteration number and λ is a damping factor, 0.5 < λ < 1.
S302: calculate the SWBWP value of the clustering result corresponding to each P(i), i = 1, 2, ..., c_max - 1, and take the cluster number corresponding to the maximum value as the optimal cluster number c_opt.
S303: search the bias parameter subspace [P(c_opt - 1), P(c_opt)] by bisection to determine the lower bound P_n of the bias parameter subspace corresponding to the optimal cluster number c_opt; search the bias parameter subspace [P(c_opt), P(c_opt + 1)] by bisection to determine the corresponding upper bound P_x.
Further, step S4: using SWBWP value of clustering result as fitness function value, adopting ABC algorithm to bias parameter subspace [ P n ,P x ]Performing a fine search to determine an optimal bias parameter P b 。
Specifically, S401: an initialization stage; initial honey sources are all randomly generated in a feasible interval, and the number of rows of the matrix is the number N of the honey sources fs The columns of the matrix are biased parameter values, momentsRow i and column j elements x in the array ij Is calculated by the formulaIn the formula (I), the compound is shown in the specification,andrespectively representing the upper and lower bounds of j-th column variables in the matrix, and rand representing a random number with the value in a (0,1) interval;
s402: a bee hiring stage; after the initial honey source position is generated, the hiring bee searches for a better honey source nearby the honey source according to the honey source position in the memory, and the update formula is as followsIn the formula, x ij ' is an element for newly generating a honey source,for randomly selecting the corresponding position element, x, of the honey source i'j Corresponding position elements of other honey sources, and RAND is an updating threshold value which is generally 0.5;
S403: a bee observation stage; after all hired bees finish the search, honey source position information is exchanged with the observation bees, and the observation bees adopt returned championship selection operators to perform selection operation; at random in N fs Selecting 2 honey sources from the honey sources, calculating SWBWP values which are F respectively 1 And F 2 If F is 1 Is superior to F 2 I.e. satisfy F 1 >F 2 When the user wants to update the honey source, the user selects the 1 st honey source to update, otherwise, the user selects the 2 nd honey source to updateUpdating;
S404: scout bee stage; if the solution quality of a honey source has not improved after iter_limit iterations, the employed bee corresponding to that source becomes a scout bee, the original honey source is abandoned, and a new honey source position is generated with the initialization formula x_ij = x_j^min + rand · (x_j^max - x_j^min);
S405: repeating steps S402-S404 until the maximum iteration number maxcycle is reached, thereby performing the fine search over the deviation parameter subspace [P_n, P_x] and determining the optimal deviation parameter P_b.
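Steps S401-S405 can be sketched as the following one-dimensional artificial bee colony search. This is a hedged illustration, not the patented implementation: the update and selection rules follow the generic ABC scheme rather than the exact RAND-threshold variant above, and `fitness` stands in for re-running the clustering at preference p and computing the mean SWBWP value.

```python
import random

def abc_search(fitness, p_lo, p_hi, n_sources=20, max_cycle=80, iter_limit=8, seed=0):
    """Minimal artificial-bee-colony sketch for a 1-D deviation parameter;
    `fitness` plays the role of the SWBWP value (larger is better)."""
    rng = random.Random(seed)
    sources = [p_lo + rng.random() * (p_hi - p_lo) for _ in range(n_sources)]
    stagnation = [0] * n_sources
    best_p, best_f = max(((p, fitness(p)) for p in sources), key=lambda t: t[1])
    for _ in range(max_cycle):
        # Employed-bee phase: perturb each source toward/away from a random peer.
        for i in range(n_sources):
            k = rng.randrange(n_sources)
            phi = rng.uniform(-1.0, 1.0)
            cand = min(max(sources[i] + phi * (sources[i] - sources[k]), p_lo), p_hi)
            if fitness(cand) > fitness(sources[i]):
                sources[i], stagnation[i] = cand, 0
            else:
                stagnation[i] += 1
        # Onlooker phase: binary tournament keeps the better of two random sources.
        for _ in range(n_sources):
            i, j = rng.randrange(n_sources), rng.randrange(n_sources)
            winner = i if fitness(sources[i]) > fitness(sources[j]) else j
            if fitness(sources[winner]) > best_f:
                best_p, best_f = sources[winner], fitness(sources[winner])
        # Scout phase: abandon sources that stopped improving for iter_limit trials.
        for i in range(n_sources):
            if stagnation[i] >= iter_limit:
                sources[i] = p_lo + rng.random() * (p_hi - p_lo)
                stagnation[i] = 0
            if fitness(sources[i]) > best_f:
                best_p, best_f = sources[i], fitness(sources[i])
    return best_p, best_f
```

In the full method the tournament winner's neighbourhood would be searched again by the observation bee; here the winner is only tracked against the running best to keep the sketch short.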
In summary, in the aerial target clustering method of the present invention, the preference coefficient α, the total number of honey sources N_fs, the maximum iteration number maxcycle and the maximum honey-source stay count iter_limit are set first, after which the aerial target clustering can be carried out.
Simulation test:
To verify the effectiveness of the method proposed by the present invention, the following computer simulation experiments were performed.
The experimental environment is as follows: the simulation experiments were run on a computer with an Intel Core i7-6700HQ quad-core processor, 8 GB of memory and the Windows 7 operating system, and the algorithm was implemented in MATLAB 2016a. The preference coefficient α was set to 0.5, the damping factor λ to 0.8, the total number of honey sources N_fs to 20, the maximum iteration number maxcycle to 80, and the maximum honey-source stay count iter_limit to 8.
(1) SWBWP index feasibility verification experiment
The experiment used three real attribute sets from the University of California, Irvine (UCI) database, namely Pima-indians-diabetes (Pid), Breast-cancer-wisconsin (Bcw) and Wine, together with two artificial attribute sets, Model1 and Model2, as the test attribute sets. The clustering results were evaluated with the Between-Within Proportion (BWP) index and with the SWBWP index of the present invention, respectively, to determine the optimal cluster number. Table 1 below lists the optimal cluster numbers evaluated on the above 5 attribute sets, FIG. 2 plots the BWP and SWBWP index values against the cluster number for the Pid attribute set, and Table 2 below gives the BWP and SWBWP index values of the Model2 attribute set at different cluster numbers.
TABLE 1 Optimal cluster numbers of the attribute sets evaluated by the BWP and SWBWP indexes
TABLE 2 Cluster index values for Model2 attribute set
Number of clusters | BWP | SWBWP
---|---|---
2 | 0.5090 | 0.6369
3 | 0.5350 | 0.7630
4 | 0.3439 | 0.5764
5 | 0.3443 | 0.5106
6 | 0.1996 | 0.5344
7 | 0.2059 | 0.4965
8 | 0.2141 | 0.5134
9 | 0.2233 | 0.5241
10 | 0.2266 | 0.5253
As can be seen from Table 1, the proposed SWBWP index obtains the cluster number matching the actual number of classes on all three UCI attribute sets and both artificial attribute sets, whereas the BWP index obtains the correct cluster number on the attribute sets Pid, Bcw and Model2 but fails to do so on the Wine and Model1 attribute sets.
The attribute set Pid has 2 sample classes, and as can be seen from FIG. 2, both the BWP and SWBWP index values reach their maximum at a cluster number of 2, matching the true number of classes.
As can be seen from Table 2, the BWP index reaches its maximum value of 0.5350 at a cluster number of 3, and the SWBWP index reaches its maximum value of 0.7630 also at a cluster number of 3; both indexes therefore obtain the optimal cluster number for the Model2 attribute set.
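As a concrete illustration of how index values like those in Table 2 can be produced, the SWBWP computation can be sketched as below. This is a simplified reading of the index definition (per-sample weighted Euclidean distances averaged per class), assuming `w` holds the comprehensive attribute weights; it is not the exact patented computation.

```python
import numpy as np

def swbwp_index(X, labels, w):
    """Sketch of the SWBWP index: for each sample, the weighted minimum
    between-class distance swbd minus the within-class distance swwd,
    divided by their sum, averaged over all samples (larger is better)."""
    X, w, labels = np.asarray(X, float), np.asarray(w, float), np.asarray(labels)
    classes = np.unique(labels)
    vals = []
    for i in range(len(X)):
        d = np.sqrt(((X - X[i]) ** 2 * w).sum(axis=1))  # weighted distances to all samples
        # minimum over other classes of the average distance to that class
        swbd = min(d[labels == c].mean() for c in classes if c != labels[i])
        own = labels == labels[i]
        own[i] = False                                   # exclude the sample itself
        swwd = d[own].mean() if own.any() else 0.0
        vals.append((swbd - swwd) / (swbd + swwd))
    return float(np.mean(vals))
```

On a well-separated labelling this average approaches 1, while a mixed labelling drives it toward 0 or below, which is why the correct cluster number maximizes the index in Table 2.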
(2) Clustering effect contrast experiment
Assume that the enemy dispatches 300 aircraft in total to fight in formation, divided into four functional groups: reconnaissance, attack, penetration and surveillance. The attribute set Data consists of the characteristic values of the motion attributes of the 300 aircraft, such as azimuth angle, distance, horizontal speed, course angle and height; part of the Data is shown in Table 3 below. The subjective attribute weights are T = (0.35, 0.35, 0.5, 0.65, 0.65). Based on the attribute sets Bcw, Wine, Model1 and Data, the clustering precision of the proposed clustering method (the APBMABC algorithm) is compared with that of the adaptive affinity propagation clustering algorithm (AdAP) and the affinity propagation clustering algorithm based on weighted Mahalanobis distance and membership optimization (APBWMMP).
TABLE 3 Target motion attribute eigenvalues and cluster classes
FIG. 3 shows the SWBWP index values for the different cluster numbers searched in the coarse search stage on the attribute set Data, and FIG. 4 shows the process of searching for the optimal deviation parameter with the APBMABC algorithm on the attribute set Data. As can be seen from FIG. 3 and FIG. 4, the proposed APBMABC algorithm effectively determines the optimal cluster number of the attribute set through the coarse search process and thereby fixes the search range for the fine search process; the fine search process using the ABC algorithm stabilizes after about 40 iterations and obtains the optimal clustering result.
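The improved AP clustering underlying the coarse search updates attraction and attribution degrees iteratively. The sketch below is a minimal illustration assuming the standard damped affinity propagation update rules, with `ap_cluster` and its arguments being hypothetical names rather than the patented implementation.

```python
import numpy as np

def ap_cluster(S, p, lam=0.8, iters=200):
    """Damped affinity propagation sketch: S is the weighted similarity matrix,
    p the deviation (preference) parameter placed on the diagonal, and lam the
    damping factor (0.5 < lam < 1). Returns a class-representative index per sample."""
    n = S.shape[0]
    S = S.copy()
    np.fill_diagonal(S, p)            # equal chance of becoming a class representative
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # attraction (responsibility) update via the two largest values per row
        AS = A + S
        idx = np.argmax(AS, axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = lam * R + (1 - lam) * Rnew
        # attribution (availability) update from the positive responsibilities
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        Anew = Rp.sum(axis=0)[None, :] - Rp
        dA = Anew.diagonal().copy()
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, dA)
        A = lam * A + (1 - lam) * Anew
    return np.argmax(A + R, axis=1)   # class representative of each sample
```

Raising p toward the largest similarities yields more representatives (more clusters), which is the monotone behaviour the bisection-based coarse search relies on.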
Table 4 below lists the clustering results of the three algorithms on the attribute set Data, and FIG. 5(a)-FIG. 5(c) compare their clustering results on the attribute set Data. From FIG. 5(a)-FIG. 5(c) and Table 4, the APBWMMP, AdAP and APBMABC algorithms all obtain the optimal cluster number matching the actual number of classes, but the clustering effect of the APBMABC algorithm, with its combined coarse and fine search processes, is better overall than that of the APBWMMP and AdAP algorithms.
TABLE 4 Comparison of the clustering effect of the APBWMMP, AdAP and APBMABC algorithms
In summary, the aerial target clustering method provided by the present invention, based on an algorithm combining coarse and fine search, yields a better target clustering scheme and achieves higher grouping precision than the compared methods.
The foregoing shows and describes the general principles, essential features and advantages of the invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which are given in the specification and drawings only to illustrate the principle of the invention; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.
Claims (6)
1. An aerial target clustering method is characterized by comprising the following steps,
S1: based on comprehensive weighting theory, combining the subjective and objective weights of each attribute during aerial target motion to generate the attribute comprehensive weights that influence the target grouping result;
S2: based on the differences in clustering influence caused by the different attributes during aerial target motion, introducing the comprehensive weight into the similarity calculation to optimize the similarity measurement, and constructing the SWBWP index for determining the optimal cluster number of the target grouping, i.e. the model for determining the optimal cluster number c_opt;
s3: roughly searching a value interval [ P ] of a deviation parameter P by adopting a half-division method min ,P max ]If the cluster number is found [2,c max ]All corresponding P (i), i =1,2 max 1, calculating SWBWP indexes corresponding to different cluster numbers, and determining the optimal cluster number c by using the SWBWP indexes opt ;
S4: taking the SWBWP value of the clustering result as the fitness function value, performing a fine search over the deviation parameter subspace [P_n, P_x] with the ABC algorithm to determine the optimal deviation parameter P_b.
2. The aerial target clustering method according to claim 1, wherein the specific operation of step S1 comprises the following steps,
S101: let the subjective weight of the j-th attribute during aerial target motion be T_j; the subjective weight vector of the aerial target motion attributes is then T = (T_1, T_2, ..., T_m), where m is the number of target motion attributes; when no domain expert evaluation of the attributes is available, T = (1/m, 1/m, ..., 1/m);
S102: regularize the attribute set B = (b_ij)_{n×m} into D = (d_ij)_{n×m}; the entropy value of the j-th attribute is then E_j = -(1/ln n) Σ_{i=1}^{n} d'_ij ln d'_ij, where n is the total number of aerial targets, d'_ij = d_ij / Σ_{i=1}^{n} d_ij, and d'_ij ln d'_ij = 0 is taken when d'_ij = 0;
S103: let the objective weight of the j-th attribute be U_j; the objective weight of the attribute set B = (b_ij)_{n×m} is then U_j = (1 - E_j) / Σ_{k=1}^{m} (1 - E_k), where E_k is the entropy value of the k-th attribute;
S104: let the comprehensive weight of the j-th attribute be w_j; considering both the subjective and the objective weight of the attribute and combining them by weighting, the comprehensive weight of the j-th attribute of the attribute set B = (b_ij)_{n×m} is w_j = (1 - α)T_j + αU_j, where α ∈ [0,1] is the preference coefficient, i.e. the proportion of the objective weight in the comprehensive weight.
3. The aerial target clustering method according to claim 2, wherein the specific operation of step S2 comprises the following steps,
S201: optimizing the similarity measurement; introducing the comprehensive weight generated in step S1 into the calculation of the similarity matrix S to optimize the similarity metric, giving s(i,j) = -(d_i - d_j)^T W (d_i - d_j), i ≠ j, where W is the diagonal matrix whose diagonal elements are the attribute weights w_j, j = 1, 2, ..., m;
S202: calculating the weighted minimum inter-class distance swbd(j,i); let K = {D} be the clustering space, where D = {d_1, d_2, ..., d_n}, and assume all samples are clustered into c classes; the weighted minimum inter-class distance swbd(j,i) of the i-th sample of the j-th class is defined as the minimum of the average weighted distances from this sample to the samples of the other classes, i.e. swbd(j,i) = min_{k≠j} (1/n_k) Σ_{p=1}^{n_k} sqrt((d_i(j) - d_p(k))^T W (d_i(j) - d_p(k))), where k and j index the classes, d_i(j) is the i-th sample of the j-th class, d_p(k) is the p-th sample of the k-th class, n_k is the number of samples contained in the k-th class, and W is the attribute weight diagonal matrix;
S203: calculating the weighted minimum intra-class distance swwd(j,i); let K = {D} be the clustering space, where D = {d_1, d_2, ..., d_n}, and assume all samples are clustered into c classes; the weighted minimum intra-class distance swwd(j,i) of the i-th sample of the j-th class is defined as the average weighted distance from this sample to the other samples of the j-th class, i.e. swwd(j,i) = (1/(n_j - 1)) Σ_{q=1,q≠i}^{n_j} sqrt((d_i(j) - d_q(j))^T W (d_i(j) - d_q(j))), where d_q(j) is the q-th sample of the j-th class, q ≠ i, and n_j is the number of samples of the j-th class;
S204: calculating the weighted clustering distance swbawd(j,i); let K = {D} be the clustering space, where D = {d_1, d_2, ..., d_n}, and assume all samples are clustered into c classes; the weighted clustering distance swbawd(j,i) of the i-th sample of the j-th class is defined as the sum of the weighted minimum inter-class distance and the intra-class distance of the sample, i.e. swbawd(j,i) = swbd(j,i) + swwd(j,i);
S205: calculating the weighted clustering dispersion distance swbswd(j,i); let K = {D} be the clustering space, where D = {d_1, d_2, ..., d_n}, and assume all samples are clustered into c classes; the weighted clustering dispersion distance swbswd(j,i) of the i-th sample of the j-th class is defined as the difference between the weighted minimum inter-class distance and the intra-class distance of the sample, i.e. swbswd(j,i) = swbd(j,i) - swwd(j,i);
S206: calculating the weighted between-within partition index SWBWP(j,i); let K = {D} be the clustering space, where D = {d_1, d_2, ..., d_n}, and assume all samples are clustered into c classes; the weighted between-within partition index SWBWP(j,i) of the i-th sample of the j-th class is defined as the ratio of the weighted clustering dispersion distance to the weighted clustering distance, i.e. SWBWP(j,i) = swbswd(j,i) / swbawd(j,i) = (swbd(j,i) - swwd(j,i)) / (swbd(j,i) + swwd(j,i)); the SWBWP index reflects the clustering quality of a single sample, and the larger its value, the better the sample is clustered; for an attribute set, the larger the average SWBWP value over all samples, the better the clustering effect on that attribute set;
4. the aerial target clustering method according to claim 3, wherein the specific operation of step S3 comprises the following steps,
S301: searching the value interval [P_min, P_max] of the deviation parameter P by the bisection method to obtain the P(i), i = 1, 2, ..., c_max - 1, corresponding to the cluster numbers [2, c_max]; wherein c_max is the preset maximum cluster number, P_min = min s(i,j) and P_max = max s(i,j), i ≠ j, i, j = 1, 2, ..., n;
S302: calculating the SWBWP value of the clustering result corresponding to each P(i), i = 1, 2, ..., c_max - 1, and taking the cluster number corresponding to the maximum value as the optimal cluster number c_opt;
S303: searching the deviation parameter subspace [p(c_opt - 1), p(c_opt)] by the bisection method to determine the lower bound P_n of the deviation parameter subspace corresponding to the optimal cluster number c_opt; searching the deviation parameter subspace [p(c_opt), p(c_opt + 1)] by the bisection method to determine the corresponding upper bound P_x.
5. The aerial target clustering method according to claim 4, wherein in step S301, clustering is performed with an improved AP algorithm in which all samples are regarded as potential class representatives with the same probability of being selected, i.e. the diagonal elements s(i,i) all equal the deviation parameter P, taken as the median of the corresponding row of the similarity matrix S; to select suitable class representatives among the samples, the attraction degree r and the attribution degree a are searched continuously, where r(i,j) represents the degree to which sample d_j is suited to serve as the class representative point of d_i, and a(i,j) represents the degree to which d_i is suited to select d_j as its class representative point;
For sample d_i, the sum r(i,j) + a(i,j) of the attraction degree and the attribution degree is calculated over the other samples, and the sample d_j with the largest sum is selected as the class representative of d_i; the update process of the attraction degree r(i,j) and the attribution degree a(i,j) is r_{t+1}(i,j) = (1-λ)[s(i,j) - max_{j'≠j}{a_t(i,j') + s(i,j')}] + λ·r_t(i,j); if i ≠ j, a_{t+1}(i,j) = (1-λ)·min{0, r_{t+1}(j,j) + Σ_{i'∉{i,j}} max{0, r_{t+1}(i',j)}} + λ·a_t(i,j), else a_{t+1}(j,j) = (1-λ)·Σ_{i'≠j} max{0, r_{t+1}(i',j)} + λ·a_t(j,j); in the formulas, t is the iteration number and λ is the damping factor, 0.5 < λ < 1.
6. The aerial target clustering method according to claim 4, wherein the specific operation of step S4 comprises the following steps,
S401: initialization stage; the initial honey sources are all randomly generated within the feasible interval, the number of rows of the matrix being the number of honey sources N_fs and the columns of the matrix being the deviation parameter values; the element x_ij in row i and column j of the matrix is calculated as x_ij = x_j^min + rand · (x_j^max - x_j^min), where x_j^max and x_j^min are the upper and lower bounds of the j-th column variable in the matrix, and rand is a random number in the interval (0,1);
S402: employed bee stage; after the initial honey source positions are generated, each employed bee searches for a better honey source near the source recorded in its memory, with the neighbourhood update formula x'_ij = x_ij + φ · (x_ij - x_i'j) applied per dimension according to the update threshold, where x'_ij is an element of the newly generated honey source, φ is a random factor used with the randomly selected source, x_i'j is the corresponding position element of another randomly selected honey source, and RAND is the update threshold, generally taken as 0.5;
S403: observation bee stage; after all employed bees finish their searches, they exchange honey source position information with the observation bees, and the observation bees perform selection using a tournament selection operator with replacement: 2 honey sources are randomly selected from the N_fs sources and their SWBWP values F_1 and F_2 are calculated; if F_1 is better than F_2, i.e. F_1 > F_2, the 1st honey source is selected for updating, otherwise the 2nd honey source is selected for updating;
S404: scout bee stage; if the solution quality of a honey source has not improved after iter_limit iterations, the employed bee corresponding to that source becomes a scout bee, the original honey source is abandoned, and a new honey source position is generated with the initialization formula x_ij = x_j^min + rand · (x_j^max - x_j^min);
S405: repeating steps S402-S404 until the maximum iteration number maxcycle is reached, thereby performing the fine search over the deviation parameter subspace [P_n, P_x] and determining the optimal deviation parameter P_b.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110413139.9A CN113177583B (en) | 2021-04-16 | 2021-04-16 | Aerial target clustering method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113177583A CN113177583A (en) | 2021-07-27 |
CN113177583B true CN113177583B (en) | 2022-10-18 |
Family
ID=76923522
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110413139.9A Expired - Fee Related CN113177583B (en) | 2021-04-16 | 2021-04-16 | Aerial target clustering method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113177583B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115658990B (en) * | 2022-09-27 | 2023-04-21 | 中国人民解放军军事科学院战略评估咨询中心 | Data processing method and device for target space grouping |
CN117291603B (en) * | 2023-09-08 | 2024-04-05 | 湖北谊嘉金融仓储有限公司 | Risk assessment system with large data ratio corresponding receipt confirming right |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109086831A (en) * | 2018-08-16 | 2018-12-25 | 李宏伟 | Hybrid Clustering Algorithm based on Fuzzy C-Means Algorithm and artificial bee colony clustering algorithm |
CN111275132A (en) * | 2020-02-24 | 2020-06-12 | 电子科技大学 | Target clustering method based on SA-PFCM + + algorithm |
Non-Patent Citations (1)
Title |
---|
Kernel fuzzy clustering algorithm based on improved artificial bee colony; Liang Bing et al.; 《计算机应用》 (Journal of Computer Applications); 2017-09-10 (No. 09); full text *
Also Published As
Publication number | Publication date |
---|---|
CN113177583A (en) | 2021-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113177583B (en) | Aerial target clustering method | |
CN110544011B (en) | Intelligent system combat effectiveness evaluation and optimization method | |
Xue et al. | A multi-objective particle swarm optimisation for filter-based feature selection in classification problems | |
CN107886184B (en) | Multi-type air defense weapon mixed-programming fire group target distribution optimization method | |
CN114841055B (en) | Unmanned aerial vehicle cluster task pre-allocation method based on generation countermeasure network | |
CN115420294A (en) | Unmanned aerial vehicle path planning method and system based on improved artificial bee colony algorithm | |
CN112070418B (en) | Weapon target distribution method of multi-target whale optimization algorithm | |
CN112861257A (en) | Aircraft fire control system precision sensitivity analysis method based on neural network | |
CN114926026B (en) | Target distribution optimization method for multi-dimensional feature deep learning | |
Mao et al. | Secure deep neural network models publishing against membership inference attacks via training task parallelism | |
Lei et al. | A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking | |
Yan et al. | A novel clustering algorithm based on fitness proportionate sharing | |
Wan et al. | Operation loop-based optimization model for resource allocation to military countermeasures versus probabilistic threat | |
CN116739428A (en) | Method for analyzing target value of fire striking of subjective and objective weighted TOPSIS (top-down sequence of steps of analysis) soldier | |
CN116088586B (en) | Method for planning on-line tasks in unmanned aerial vehicle combat process | |
CN113919425B (en) | Autonomous aerial target allocation method and system | |
Zhang et al. | Weapon–target assignment using a whale optimization algorithm | |
CN112612300B (en) | Multi-platform intelligent decision-making multi-target game method and device | |
CN110930054A (en) | Data-driven battle system key parameter rapid optimization method | |
CN113110595A (en) | Heterogeneous unmanned aerial vehicle group cooperation method for target verification | |
CN106910148B (en) | Collaborative filtering-based adaptive pushing method for command elements | |
Cai | Exploring characteristics of an effective mix of precision and volume indirect fire in urban operations using agent-based simulation | |
CN117590757B (en) | Multi-unmanned aerial vehicle cooperative task allocation method based on Gaussian distribution sea-gull optimization algorithm | |
CN116739431B (en) | Aircraft real-time threat assessment method based on analytic hierarchy process | |
Tianhan et al. | Target Threat Assessment Using Particle Swarm Optimization and BP Neural Network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20221018 |