CN112800999A - Intelligent control system target identification fusion method based on intelligent networking - Google Patents


Info

Publication number
CN112800999A
Authority
CN
China
Prior art keywords
identification information
target identification
target
intelligent
correlation
Prior art date
Legal status
Granted
Application number
CN202110159125.9A
Other languages
Chinese (zh)
Other versions
CN112800999B (en)
Inventor
刘庆利
王文广
Current Assignee
Dalian University
Original Assignee
Dalian University
Priority date
Filing date
Publication date
Application filed by Dalian University filed Critical Dalian University
Priority to CN202110159125.9A priority Critical patent/CN112800999B/en
Publication of CN112800999A publication Critical patent/CN112800999A/en
Application granted granted Critical
Publication of CN112800999B publication Critical patent/CN112800999B/en
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G06F18/24 - Classification techniques
    • G06F18/25 - Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a target identification fusion method for an intelligent command-and-control system based on intelligent agent networking. The method first constructs a target identification information fusion framework based on intelligent agent networking; on this basis it calculates the correlation among pieces of target identification information using the Pearson correlation coefficient, introduces a classification idea focusing on the probability assignment of the maximum target, modifies the target identification information using the correlation and the maximum-target importance as weights, and finally fuses the modified target identification information with the DS fusion rule. The results show that the method distributes probability over targets more reasonably, reduces the support for unknown targets, and increases the support for the correct target.

Description

Intelligent control system target identification fusion method based on intelligent networking
Technical Field
The invention relates to the field of data mining, in particular to a target identification fusion method for an intelligent command-and-control system based on intelligent agent networking.
Background
Identifying the attribute or type of a target under detection is a precondition for the command-and-control system to achieve a precise strike; it embodies the degree of intelligence of the system and is key to improving core competitiveness in the military field. In practical target identification, joint identity reports given by multiple sensors are mostly used, and many methods have emerged, but they still have shortcomings.
Specifically, the document "Application of the improved DS evidence theory in the traditional Chinese medicine diagnosis model" proposes a traditional Chinese medicine diagnosis model based on an improved DS evidence theory. The model has high prediction accuracy and an acceptable prediction time, verifying its feasibility, but the method assigns belief to the unknown field, and in certain scenes with highly conflicting evidence the fusion effect of the traditional DS evidence theory is poor.
The document "Intrusion detection algorithm based on improved DS evidence fusion and ELM" notes that when the conflict among evidence is too large, the similarity of the evidence should be considered: a weight coefficient is obtained by calculating the similarity between pieces of evidence, and its influence is incorporated into a correspondingly modified evidence combination formula. However, improving the combination rule brings new problems: it not only increases the computation in the evidence combination process, but also destroys the commutative and associative laws of the evidence combination rule.
In the document "An Ensemble Deep Convolutional Neural Network Model with Improved D-S Evidence Fusion for Bearing Fault Diagnosis", the similarity of evidence bodies is evaluated by calculating their 1-norm distance, the weights of the evidence bodies are distributed according to this similarity, and 3 of the 4 common paradoxes are handled by the corrected combination rule, obtaining results superior to other classical improved methods.
In the document "Infrared and visible light image fusion algorithm based on improved guided filtering and a dual-channel pulse transmission cortical model", a weighted fusion algorithm is used to fuse infrared and visible light images, effective feature vectors of the fused images are then extracted with an improved deep network, and classification and identification are carried out.
The document "Scene Matching for Infrared and Visible Images with Compressive SIFT Feature Representation in Bandelet Domain" proposes an infrared and visible light fusion algorithm based on multi-scale transformation, in which features of different scales are fused on the basis of mutual-information registration. The algorithm has high fusion precision but is easily affected by jitter and noise.
In the document "Visible light-near infrared HSV image fusion scene dictionary sparse identification method", a class-dictionary sparse identification method is used to classify scenes in the fused visible light-near infrared HSV image. Because the method uses search-tree features and hierarchical gradient-orientation histograms for feature extraction, its scene classification performance is only suitable for typical targets in simple scenes.
Disclosure of Invention
In view of the technical problem that fusing conflicting identification information makes target identification ambiguous or even wrong, a target identification fusion method for an intelligent command-and-control system based on intelligent agent networking is provided. The technical means adopted by the invention are as follows:
A target identification fusion method for an intelligent command-and-control system based on intelligent agent networking comprises the following steps:
S1, calculating the correlation between every two pieces of target identification information according to the Pearson correlation coefficient and constructing a correlation matrix;
S2, uniformly setting the values less than or equal to 0 in the correlation matrix to a preset value to obtain the correlation weight among all pieces of target identification information;
S3, constructing an information matrix based on the target identification information and the identified target identities in the identification framework, counting the maximum-target number of each piece of target identification information, and meanwhile recording the position of the maximum target;
S4, defining the counted maximum-target information category as a correction factor, calculating a representative factor of each piece of target identification information by combining the statistical characteristics of the target identification information, and obtaining the allocation weight of each piece of target identification information from the fuzzy entropy of the representative factor, the correction factor and the maximum-target number;
S5, reassigning the target identification information according to its correlation weight and target allocation weight, and fusing the reassigned target identification information by the DS fusion rule.
Further, a step S0 is provided before step S1, specifically: constructing a target identification information fusion framework based on intelligent agent networking, and transmitting the monitored battlefield target information to the identification framework in real time through sensors.
Further, the correlation rij between two pieces of target identification information mi and mj is calculated according to the Pearson correlation coefficient as:

rij = cov(mi, mj)/(σmi·σmj)  (1)

In formula (1), cov is the covariance, σ is the standard deviation, and E is the mathematical expectation; cov(mi, mj) and σmi are calculated respectively as:

cov(mi, mj) = E[(mi - E(mi))(mj - E(mj))]  (2)

σmi = sqrt(E[(mi - E(mi))^2])  (3)

The constructed correlation matrix is

R = (rij), i, j = 1, 2, …, n  (4)
Further, in the step S2, a correlation weight cred(mi) between pieces of target identification information is defined, with cred(mi) ∈ [0, 1] and the weights over all pieces summing to 1. The correlation weight is calculated as:

[formula (5): shown only as an image in the original]
Further, in step S3, the constructed information matrix is:

M = [mi(cj)], i = 1, 2, …, n; j = 1, 2, …, c  (6)

where n is the number of pieces of target identification information and c is the number of identified target identities.

The maximum-target number τi of each piece of target identification information and the position of the maximum target, i.e. its target class pi, are expressed respectively by the following relations:

τi = |max(mi)|, i = 1, 2, …, n  (7)

pi(cj) = cj(max(mi)), j = 1, 2, …, c  (8)

where |max(mi)| denotes the number of elements of max(mi), i.e. the cardinality of max(mi); mi denotes the i-th piece of identification information; and cj denotes the class to which the target belongs.

The category of the maximum-target information appearing in all pieces of target identification information is counted, and the number of pieces of target identification information supporting each category is calculated and recorded as nc(cj):

nc(cj) = |{i | pi = cj}|, j = 1, 2, …, c  (9)
Further, in step S4, the correction factor is specifically:

[formula (10): shown only as an image in the original]

where vari is the variance of the target identification information mi.

The representative factor σi of the target identification information is calculated, and the fuzzy entropy of each piece of target identification information is obtained as:

Hi = -σi·log σi - (1 - σi)·log(1 - σi)  (12)

The allocation weight ω(mi) of each piece of target identification information is obtained from:

Hti = Hi·τi + 1/nc(cj)^α  (13)

[formula (14): ω(mi), shown only as an image in the original]

where α is the importance parameter of the correction factor nc(cj), and Hti is the fuzzy entropy value after the interference-removal factor is added.
Further, in step S5, the reassigned target identification information is obtained by:

[formula (15): shown only as an image in the original]

The fusion formula for two pieces of target identification information is the DS combination rule:

m(A) = Σ_{B∩C=A} m1(B)·m2(C)/(1 - K), A ≠ ∅  (16)

where K is the conflict coefficient:

K = Σ_{B∩C=∅} m1(B)·m2(C)  (17)
the method comprises the steps of firstly constructing a target identification information fusion framework based on intelligent conjuncted networking, secondly calculating the correlation among target identification information according to Pearson correlation coefficients on the basis of the framework, meanwhile introducing classification ideas and paying attention to probability assignment of a maximum target, modifying the target identification information by taking the correlation and the maximum target importance of the target identification information as weights, and finally fusing the modified target identification information by using a DS fusion rule. The result shows that the probability distribution of the method to the target is more reasonable, the support degree of the unknown target is reduced, and the support degree of the correct target is improved. By using the improved DS evidence theory method, the probability of identifying the wrong target is reduced under various conflict conditions, the identification of the correct target is improved, and the target identification capability of the system is effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic diagram of an information fusion framework based on the intelligent agent networking of the present invention.
FIG. 2 is a graph comparing different methods for correct targets in different amounts of evidence.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, aiming at the problem that the target identification information fusion result in a battlefield environment is ambiguous or even identifies the wrong target, the embodiment of the present invention provides an improved DS fusion algorithm based on modified evidence sources. The target identification fusion method for the intelligent command-and-control system based on intelligent agent networking mainly comprises the following steps:
S0, constructing a target identification information fusion framework based on intelligent agent networking, and transmitting the monitored battlefield target information to the identification framework in real time through sensors.
S1, calculating the correlation between every two pieces of target identification information according to the Pearson correlation coefficient and constructing a correlation matrix;
S2, uniformly setting the values less than or equal to 0 in the correlation matrix to a preset value to obtain the correlation weight among all pieces of target identification information;
S3, constructing an information matrix based on the target identification information and the identified target identities in the identification framework, counting the maximum-target number of each piece of target identification information, and meanwhile recording the position of the maximum target;
S4, defining the counted maximum-target information category as a correction factor, calculating a representative factor of each piece of target identification information by combining the statistical characteristics of the target identification information, and obtaining the allocation weight of each piece of target identification information from the fuzzy entropy of the representative factor, the correction factor and the maximum-target number;
S5, reassigning the target identification information according to its correlation weight and target allocation weight, and fusing the reassigned target identification information by the DS fusion rule.
Specifically, the correlation rij between two pieces of target identification information mi and mj is calculated from the Pearson correlation coefficient as:

rij = cov(mi, mj)/(σmi·σmj)  (1)

In formula (1), cov is the covariance, σ is the standard deviation, and E is the mathematical expectation; cov(mi, mj) and σmi are calculated respectively as:

cov(mi, mj) = E[(mi - E(mi))(mj - E(mj))]  (2)

σmi = sqrt(E[(mi - E(mi))^2])  (3)

The constructed correlation matrix is

R = (rij), i, j = 1, 2, …, n  (4)
The Pearson correlation coefficient reflects the degree of linear correlation between two pieces of target identification information, and its range is [-1, 1]. A negative value indicates that the two pieces of target identification information are negatively correlated, i.e. their degree of conflict is large; the smaller the negative value, the higher the negative correlation and the larger the conflict. In order to control the weight of non-positively correlated target identification information in the identification framework and reduce its influence on the whole framework as much as possible, every value less than or equal to 0 in the correlation matrix is set to 0.001. In the step S2, a correlation weight cred(mi) between pieces of target identification information is defined, with cred(mi) ∈ [0, 1] and the weights over all pieces summing to 1. The correlation weight is calculated as:

[formula (5): shown only as an image in the original]
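Since formula (5) appears only as an image in the source, the sketch below makes an assumption for the final step: each piece of evidence's credibility weight is taken as its total clamped correlation with the others, normalized to sum to 1. The function name and the example evidence are illustrative, not the patent's exact formula.

```python
import numpy as np

def credibility_weights(evidence, floor=0.001):
    """Correlation-based weights for pieces of evidence (steps S1-S2).

    The normalization is an assumption: formula (5) is not reproduced
    in the text, so each evidence's weight is its summed correlation
    with the others, normalized to sum to 1.
    """
    r = np.corrcoef(np.asarray(evidence, dtype=float))  # Pearson matrix, eqs. (1)-(4)
    r = np.where(r <= 0.0, floor, r)  # step S2: non-positive entries -> preset 0.001
    np.fill_diagonal(r, 0.0)          # ignore self-correlation
    support = r.sum(axis=1)
    return support / support.sum()

# Three BPAs over targets {c1, c2, c3}; m3 conflicts with m1 and m2
evidence = [[0.60, 0.30, 0.10],
            [0.55, 0.35, 0.10],
            [0.10, 0.20, 0.70]]
w = credibility_weights(evidence)
```

On this toy input the conflicting third piece of evidence receives a much smaller weight than the two mutually consistent pieces, which is the intended effect of step S2.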
in step S3, the constructed information matrix specifically includes:
Figure BDA0002934953320000076
in the formula, n is target identification information, and c is identification target identity;
since the object with the highest probability assignment is the most important object in the object identification information, which represents the bias of the object identification information to the object in the classification concept, the maximum number of objects τ per piece of object identification information is countediAnd the position of the statistical maximum object, i.e. its object class piRespectively expressed by the following relations:
τi=|max(mi)|,i=1,2,…,n (7)
pi(cj)=cj(max(mi)),j=1,2,…,c (8)
where, | max (m)i) | represents max (m)i) Is the number of (c), i.e. max (m)i) Potential of (c) miIndicating the i-th item identification information, cjIndicating the class to which the object belongs
Counting the category of the maximum target information appearing in all the target identification information, calculating the number of the target identification information supporting each category, and recording as nc (c)j)
Figure BDA0002934953320000081
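The counting in step S3, i.e. the cardinality of the maximal set (7), its position (8), and the per-class support count nc(cj) (9), can be sketched as follows; function and variable names are illustrative:

```python
import numpy as np
from collections import Counter

def max_target_stats(evidence):
    """Step S3 statistics: for each BPA, the maximum-target number tau_i
    (how many classes receive the maximal mass) and its position p_i,
    plus the per-class support count nc(c_j)."""
    taus, positions = [], []
    for m in np.asarray(evidence, dtype=float):
        top = np.flatnonzero(np.isclose(m, m.max()))
        taus.append(len(top))                         # tau_i = |max(m_i)|, eq. (7)
        positions.append(tuple(int(j) for j in top))  # p_i, eq. (8)
    nc = Counter(j for pos in positions for j in pos)  # nc(c_j), eq. (9)
    return taus, positions, dict(nc)

evidence = [[0.60, 0.30, 0.10],
            [0.55, 0.35, 0.10],
            [0.10, 0.20, 0.70]]
taus, positions, nc = max_target_stats(evidence)
```

Here the first two pieces of evidence place their maximum on class c1 and the third on c3, so nc records 2 supporters for c1 and 1 for c3.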
The importance of a piece of target identification information is measured by whether its maximum-target position coincides with those of the other pieces. When the maximum-target positions, i.e. the categories, are the same across pieces of evidence, the target identification information is highly consistent and closely connected; the more pieces of target identification information support a category, the more support that information has relative to the others, and the more weight can be allocated to it. Therefore the maximum-target information category is defined as a correction factor, and the variance of each piece of target identification information is modified by the correction factor. In step S4, the correction factor is specifically:

[formula (10): shown only as an image in the original]

where vari is the variance of the target identification information mi.

The target identification information is evaluated according to its range and modified variance to obtain the representative factor σi. Since the fuzzy entropy changes monotonically on the intervals [0, 0.5] and [0.5, 1], the representative factor σi is transformed into the interval [0, 0.5]; the transformation formula is:

[formula (11): shown only as an image in the original]

where a is an adjustment coefficient with value 0.5, and Ri is the range (difference between the extreme values) of the target identification information mi.
The representative factor σi of the target identification information is calculated, and the fuzzy entropy of each piece of target identification information is obtained as:

Hi = -σi·log σi - (1 - σi)·log(1 - σi)  (12)

The larger the fuzzy entropy of the representative factor σi, the vaguer and less distinct that piece of identification information is and the smaller the role it plays among all the target identification information, so a smaller weight should be assigned to it.

According to the fuzzy entropy Hi of the representative factor, the correction factor nc(cj) and the maximum-target number τi of each piece of target identification information, the allocation weight ω(mi) of each piece of target identification information is calculated by:

Hti = Hi·τi + 1/nc(cj)^α  (13)

[formula (14): ω(mi), shown only as an image in the original]

where α is the importance parameter of the correction factor nc(cj); the larger its value, the more important the correction factor nc(cj). In this embodiment the selected empirical value is 5. Hti is the fuzzy entropy value after the interference-removal factor is added.
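Equations (12) and (13) can be sketched directly; formula (14) appears only as an image, so the final normalization below is an assumption (inverse normalization of Hti, so that fuzzier evidence gets less weight, matching the stated intent). All names and the example values are illustrative:

```python
import math

def fuzzy_entropy(sigma):
    """Binary fuzzy entropy of a representative factor, eq. (12)."""
    if sigma <= 0.0 or sigma >= 1.0:
        return 0.0
    return -sigma * math.log(sigma) - (1.0 - sigma) * math.log(1.0 - sigma)

def allocation_weights(sigmas, taus, ncs, alpha=5.0):
    """Step S4 sketch: Ht_i = H_i*tau_i + 1/nc(c_j)**alpha, eq. (13).
    Formula (14) is not reproduced in the text, so the final step is an
    assumed inverse normalization: larger Ht_i (fuzzier) -> smaller weight."""
    hts = [fuzzy_entropy(s) * t + 1.0 / nc ** alpha
           for s, t, nc in zip(sigmas, taus, ncs)]
    inv = [1.0 / h for h in hts]
    total = sum(inv)
    return [v / total for v in inv]

# Illustrative values: sigma_i lies in [0, 0.5] after the eq. (11) transform
w = allocation_weights(sigmas=[0.1, 0.5, 0.3], taus=[1, 1, 1], ncs=[2, 2, 1])
```

The default alpha=5.0 mirrors the empirical value chosen in this embodiment; in the example the least fuzzy piece of evidence (sigma = 0.1) receives the largest weight.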
In step S5, the target identification information is reassigned based on its correlation weight cred(mi) and target classification weight ω(mi); the reassigned target identification information is obtained by:

[formula (15): shown only as an image in the original]

The fusion formula for two pieces of target identification information is the DS combination rule:

m(A) = Σ_{B∩C=A} m1(B)·m2(C)/(1 - K), A ≠ ∅  (16)

where K is the conflict coefficient:

K = Σ_{B∩C=∅} m1(B)·m2(C)  (17)
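The DS combination of (16)-(17), restricted for illustration to BPAs over singleton classes only (an assumption; the general rule runs over all subsets of the frame), can be sketched as:

```python
def ds_combine(m1, m2):
    """Dempster's rule for two BPAs over the same singleton classes:
    K gathers the conflicting mass (eq. 17) and the agreeing mass is
    renormalized by 1 - K (eq. 16)."""
    n = len(m1)
    # K: total mass that the two sources assign to incompatible classes
    k = sum(m1[i] * m2[j] for i in range(n) for j in range(n) if i != j)
    if k >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return [m1[i] * m2[i] / (1.0 - k) for i in range(n)]

# Two reassigned BPAs that agree on target c1
fused = ds_combine([0.60, 0.30, 0.10], [0.55, 0.35, 0.10])
```

Combining two consistent BPAs concentrates the fused mass on the commonly supported class, which is the behavior the modified evidence sources are designed to exploit.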
the basic probability assignment of three common conflict evidences is shown in table 1, the fusion results of the three common evidence conflict types through different methods are shown in table 2, the initial target identification information is shown in table 3, the influence of a plurality of evidence bodies on the fusion results is shown in table 4, the comparison graph of different methods on correct targets under different numbers of finally obtained evidence bodies is shown in fig. 2, and it can be seen from fig. 2 that compared with other existing methods, the improved DS evidence theoretical method reduces the probability of identifying wrong targets under various conflict conditions, improves the identification of correct targets, and effectively improves the target identification capability of the system.
TABLE 1 basic probability assignments for three common conflicting evidences
[table shown as an image in the original]
TABLE 2 fusion results of three common evidence conflict types
[table shown as images in the original]
TABLE 3 initial object identification information
[table shown as an image in the original]
TABLE 4 Effect of multiple evidence bodies on fusion results
[table shown as an image in the original]
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (7)

1. A target identification fusion method for an intelligent command-and-control system based on intelligent agent networking, characterized by comprising the following steps:
S1, calculating the correlation between every two pieces of target identification information according to the Pearson correlation coefficient and constructing a correlation matrix;
S2, uniformly setting the values less than or equal to 0 in the correlation matrix to a preset value to obtain the correlation weight among all pieces of target identification information;
S3, constructing an information matrix based on the target identification information and the identified target identities in the identification framework, counting the maximum-target number of each piece of target identification information, and meanwhile recording the position of the maximum target;
S4, defining the counted maximum-target information category as a correction factor, calculating a representative factor of each piece of target identification information by combining the statistical characteristics of the target identification information, and obtaining the allocation weight of each piece of target identification information from the fuzzy entropy of the representative factor, the correction factor and the maximum-target number;
S5, reassigning the target identification information according to its correlation weight and target allocation weight, and fusing the reassigned target identification information by the DS fusion rule.
2. The target identification fusion method according to claim 1, characterized in that a step S0 is further provided before step S1, specifically: constructing a target identification information fusion framework based on intelligent agent networking, and transmitting the monitored battlefield target information to the identification framework in real time through sensors.
3. The target identification fusion method according to claim 1 or 2, characterized in that the correlation rij between two pieces of target identification information mi and mj is calculated according to the Pearson correlation coefficient as:

rij = cov(mi, mj)/(σmi·σmj)  (1)

In formula (1), cov is the covariance, σ is the standard deviation, and E is the mathematical expectation; cov(mi, mj) and σmi are calculated respectively as:

cov(mi, mj) = E[(mi - E(mi))(mj - E(mj))]  (2)

σmi = sqrt(E[(mi - E(mi))^2])  (3)

The constructed correlation matrix is

R = (rij), i, j = 1, 2, …, n  (4)
4. The target identification fusion method according to claim 3, characterized in that in the step S2, a correlation weight cred(mi) between pieces of target identification information is defined, with cred(mi) ∈ [0, 1] and the weights over all pieces summing to 1; the correlation weight is calculated as:

[formula (5): shown only as an image in the original]
5. The target identification fusion method according to claim 4, characterized in that in the step S3, the constructed information matrix is:

M = [mi(cj)], i = 1, 2, …, n; j = 1, 2, …, c  (6)

where n is the number of pieces of target identification information and c is the number of identified target identities;

the maximum-target number τi of each piece of target identification information and the position of the maximum target, i.e. its target class pi, are expressed respectively by the following relations:

τi = |max(mi)|, i = 1, 2, …, n  (7)

pi(cj) = cj(max(mi)), j = 1, 2, …, c  (8)

where |max(mi)| denotes the number of elements of max(mi), i.e. the cardinality of max(mi); mi denotes the i-th piece of identification information; and cj denotes the class to which the target belongs;

the category of the maximum-target information appearing in all pieces of target identification information is counted, and the number of pieces of target identification information supporting each category is calculated and recorded as nc(cj):

nc(cj) = |{i | pi = cj}|, j = 1, 2, …, c  (9)
6. The target identification fusion method according to claim 5, characterized in that in the step S4, the correction factor is specifically:

[formula (10): shown only as an image in the original]

where vari is the variance of the target identification information mi;

the representative factor σi of the target identification information is calculated, and the fuzzy entropy of each piece of target identification information is obtained as:

Hi = -σi·log σi - (1 - σi)·log(1 - σi)  (12)

the allocation weight ω(mi) of each piece of target identification information is obtained from:

Hti = Hi·τi + 1/nc(cj)^α  (13)

[formula (14): ω(mi), shown only as an image in the original]

where α is the importance parameter of the correction factor nc(cj), and Hti is the fuzzy entropy value after the interference-removal factor is added.
7. The target identification fusion method according to claim 6, characterized in that in step S5, the reassigned target identification information is obtained by:

[formula (15): shown only as an image in the original]

the fusion formula for two pieces of target identification information is the DS combination rule:

m(A) = Σ_{B∩C=A} m1(B)·m2(C)/(1 - K), A ≠ ∅  (16)

where K is the conflict coefficient:

K = Σ_{B∩C=∅} m1(B)·m2(C)  (17)
CN202110159125.9A 2021-02-04 2021-02-04 Intelligent command system target identification fusion method based on intelligent networking Active CN112800999B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110159125.9A CN112800999B (en) 2021-02-04 2021-02-04 Intelligent command system target identification fusion method based on intelligent networking

Publications (2)

Publication Number Publication Date
CN112800999A true CN112800999A (en) 2021-05-14
CN112800999B CN112800999B (en) 2023-12-01

Family

ID=75814364


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104077601A (en) * 2014-07-08 2014-10-01 中国航空无线电电子研究所 Method for carrying out synthetic target recognition through information of different types
CN105243667A (en) * 2015-10-13 2016-01-13 中国科学院自动化研究所 Target re-identification method based on local feature fusion
CN106778847A (en) * 2016-12-02 2017-05-31 河南大学 The fusion method of evidences conflict is weighed based on logarithmic function
CN107622275A (en) * 2017-08-21 2018-01-23 西安电子科技大学 A kind of Data Fusion Target recognition methods based on combining evidences
US20180060758A1 (en) * 2016-08-30 2018-03-01 Los Alamos National Security, Llc Source identification by non-negative matrix factorization combined with semi-supervised clustering
CN109785064A (en) * 2019-01-14 2019-05-21 南京信息工程大学 A kind of mobile e-business recommended method and system based on Multi-source Information Fusion
CN110533091A (en) * 2019-08-22 2019-12-03 贵州大学 A kind of more evident information fusion methods for improving DS evidence theory
CN111064706A (en) * 2019-11-25 2020-04-24 大连大学 Method for detecting spatial network data stream of mRMR-SVM
CN111667193A (en) * 2020-06-12 2020-09-15 中国矿业大学(北京) Coal mine gas safety evaluation method based on D-S evidence theory
US20210248822A1 (en) * 2018-04-23 2021-08-12 The Regents Of The University Of Colorado, A Body Corporate Mobile And Augmented Reality Based Depth And Thermal Fusion Scan


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG, Huan; LU, Jianguang; TANG, Xianghong: "Improved D-S evidence theory algorithm for conflicting evidence", Journal of Beijing University of Aeronautics and Astronautics, no. 03 *

Also Published As

Publication number Publication date
CN112800999B (en) 2023-12-01

Similar Documents

Publication Publication Date Title
CN113378632B (en) Pseudo-label optimization-based unsupervised domain adaptive pedestrian re-identification method
CN109558823B (en) Vehicle identification method and system for searching images by images
CN109190544B (en) Human identity recognition method based on sequence depth image
CN112906770A (en) Cross-modal fusion-based deep clustering method and system
CN108537790B (en) Different-source image change detection method based on coupling translation network
CN113157678B (en) Multi-source heterogeneous data association method
CN110276746B (en) Robust remote sensing image change detection method
CN110751027B (en) Pedestrian re-identification method based on deep multi-instance learning
CN112802054A (en) Mixed Gaussian model foreground detection method fusing image segmentation
CN115309860B (en) False news detection method based on pseudo twin network
WO2023174304A1 (en) Systems, methods, and storage devices for data clustering
CN110580510A (en) clustering result evaluation method and system
Iqbal et al. Mitochondrial organelle movement classification (fission and fusion) via convolutional neural network approach
CN110852292B (en) Sketch face recognition method based on cross-modal multi-task depth measurement learning
CN113808166B (en) Single-target tracking method based on clustering difference and depth twin convolutional neural network
Zhang et al. Adaptive image segmentation based on color clustering for person re-identification
CN115861715A (en) Knowledge representation enhancement-based image target relation recognition algorithm
Hossny et al. Enhancing keyword correlation for event detection in social networks using SVD and k-means: Twitter case study
Pang et al. Federated Learning for Crowd Counting in Smart Surveillance Systems
Haraksim et al. Validation of likelihood ratio methods for forensic evidence evaluation handling multimodal score distributions
Zhou et al. Partial fingerprint indexing: a combination of local and reconstructed global features
CN113065395A (en) Radar target new class detection method based on generation countermeasure network
CN112800999A (en) Intelligent control system target identification fusion method based on intelligent networking
CN111832402A (en) Face recognition method and device
CN116935057A (en) Target evaluation method, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant