CN109478228A - Method and device for fusing classification results, and electronic device


Info

Publication number
CN109478228A
Authority
CN
China
Legal status
Pending
Application number
CN201680087583.3A
Other languages
Chinese (zh)
Inventor
伍健荣
刘晓青
白向晖
谭志明
东明浩
Current Assignee
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Application filed by Fujitsu Ltd
Publication of CN109478228A


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition


Abstract

A method and device for fusing classification results, and an electronic device. The device includes: an acquiring unit configured to obtain classification results at different times within a predetermined time period; and a fusion unit configured to fuse the classification results at the different times to obtain a fused classification result. According to this embodiment, the classification results at different times within the predetermined time period can be fused, thereby improving the accuracy of the classification result.

Description

Method and device for fusing classification results, and electronic device

Technical Field
The present invention relates to the field of information technologies, and in particular, to a method and an apparatus for fusing classification results, and an electronic device.
Background
With the continuous development of information technology, computer vision and intelligent traffic systems are increasingly widely applied. Based on the requirements of these applications, it is necessary to identify and classify target objects in the acquired video images. When the video image is blurred, distorted or incomplete due to the influence of the imaging environment such as illumination, smoke, rain and snow, and motion blur, the accuracy of the single classification result for the target object is greatly reduced. Therefore, a plurality of classification results of the target object are obtained and fused, and the accuracy of identification and classification of the target object can be improved compared with a single classification result.
Currently, there is a method of fusing a plurality of classification results, for example, searching for a classifier with the lowest error rate among a plurality of classifiers and outputting the classification result of the classifier.
It should be noted that the above background description is only for the sake of clarity and complete description of the technical solutions of the present invention and for the understanding of those skilled in the art. Such solutions are not considered to be known to the person skilled in the art merely because they have been set forth in the background section of the invention.
Disclosure of Invention
When the fusion of multiple classification results is performed using the above-mentioned existing methods, the accuracy of the obtained classification results is still low.
The embodiment of the invention provides a classification result fusion device and method, which are used for fusing classification results at different moments in a preset time period and can effectively improve the accuracy of the classification results.
According to a first aspect of the embodiments of the present invention, there is provided an apparatus for fusing classification results, the apparatus including: the device comprises an acquisition unit, a classification unit and a classification unit, wherein the acquisition unit is used for acquiring classification results at different moments in a preset time period; and the fusion unit is used for fusing the classification results at different moments to obtain a fused classification result.
According to a second aspect of embodiments of the present invention, there is provided an electronic apparatus, including: the fusion device for the classification results according to the first aspect of the embodiment of the present invention.
According to a third aspect of the embodiments of the present invention, there is provided a fusion method of classification results, the method including: obtaining classification results at different moments in a preset time period; and fusing the classification results at different moments to obtain a fused classification result.
The invention has the beneficial effects that: the classification results at different moments in a preset time period are fused, so that the accuracy of the classification results can be effectively improved.
Specific embodiments of the present invention are disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the invention may be employed. It should be understood that the embodiments of the invention are not so limited in scope. The embodiments of the invention include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments, in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
Fig. 1 is a schematic view of a classification result fusion device according to embodiment 1 of the present application;
Fig. 2 is a schematic diagram of fusing classification results at different times according to embodiment 1 of the present application;
Fig. 3 is a schematic view of a fusion unit according to embodiment 1 of the present application;
Fig. 4 is another schematic view of the fusion unit of embodiment 1 of the present application;
Fig. 5 is another schematic view of the fusion unit of embodiment 1 of the present application;
Fig. 6 is a schematic diagram of a third fusion subunit of embodiment 1 of the present application;
Fig. 7 is a schematic diagram of a fourth fusion subunit of embodiment 1 of the present application;
Fig. 8 is a schematic view of a configuration of an electronic device according to embodiment 2 of the present application;
Fig. 9 is a schematic diagram of a fusion method of classification results according to embodiment 3 of the present application;
Fig. 10 is a flowchart illustrating a method for fusing classification results according to embodiment 4 of the present application.
Detailed Description
The foregoing and other features of the invention will become apparent from the following description taken in conjunction with the accompanying drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the embodiments in which the principles of the invention may be employed, it being understood that the invention is not limited to the embodiments described, but, on the contrary, is intended to cover all modifications, variations, and equivalents falling within the scope of the appended claims.
Embodiment 1
Fig. 1 is a schematic view of a classification result fusion device according to embodiment 1 of the present invention. As shown in fig. 1, the apparatus 100 includes an acquisition unit 101 and a fusion unit 102.
The acquiring unit 101 is configured to acquire classification results at different times within a predetermined time period; the fusion unit 102 is configured to fuse the classification results at different times to obtain a fused classification result.
According to the embodiment, the classification results at different moments in the preset time period can be fused, so that the accuracy of the classification results is improved.
In this embodiment, the obtaining unit 101 may obtain the classification results at different times in any of several existing manners. For example, the obtaining unit 101 may contain a deep convolutional neural network (DCNN): an image acquisition device inputs the image data acquired at different times to the obtaining unit 101, which computes a classification result for each time. Alternatively, the obtaining unit 101 may receive the classification results at different times directly from a classifier. The embodiment of the present application does not limit the manner in which the obtaining unit 101 obtains the classification results at different times.
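As an illustrative sketch only (the patent does not prescribe any particular implementation), such an acquiring unit can be modeled as a function that runs a classifier over the images captured at successive times; the names acquire_results and classifier are hypothetical:

```python
import numpy as np

def acquire_results(frames, classifier):
    """Acquiring-unit sketch: apply a classifier (e.g. a DCNN that outputs
    per-category confidences) to the image captured at each time instant
    within the predetermined time period, and stack the results.

    frames:     iterable of images, ordered from earliest to latest time.
    classifier: any callable mapping one image to a length-Q vector of
                per-category confidences.
    Returns an (M, Q) array; row t holds the classification result at time t.
    """
    return np.vstack([np.asarray(classifier(f)) for f in frames])
```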
Fig. 2 is a schematic diagram of fusing the classification results at different times by using the apparatus 100 of the present embodiment. As shown in fig. 2, 201 to 210 represent the classification results at different times within the predetermined time period, and these classification results are fused by the fusion unit 102 to generate a fused classification result 211.
Fig. 3 is a schematic diagram of the fusion unit 102 in this embodiment 1. As shown in fig. 3, the fusion unit 102 may have a first fusion subunit 301, where the first fusion subunit 301 performs weighted summation on the classification results at different times according to the first weight of the classification result at each time, to obtain the fused classification result.
In this embodiment, the classification result at each time may include a confidence for each category, and the confidence may be a score. For example, the classification result Result(n) at the nth time includes a confidence Result(p, n) for each category, where p and Q are natural numbers with p ≤ Q, Q represents the total number of categories in Result(n), and p denotes one of the categories.
In this embodiment, the first fusion subunit 301 may fuse the classification results of M times in the time period from the (n−M+1)th time to the nth time, where M is a natural number, and the fused classification result may be obtained by, for example, the following equation (1):

RFresult(p, n) = Σ_{t=n−M+1}^{n} w(t) · Result(p, t)   (1)

where RFresult(p, n) represents the confidence corresponding to category p in the fused classification result, and w(t) represents the first weight of the classification result at the tth time, t being an integer.
In this embodiment, the category of the object to be classified may be determined according to the confidence corresponding to each category in the fused classification result.
In this embodiment, the classification result at a later time has a larger first weight than the classification result at an earlier time. For example, the first weight w(t) of the classification result at the tth time can be obtained by the following formula (2) or (3):

w(t) = 2^(−(n−t+1))   (2)

w(t) = 1 / ((n−t+1)^2 + 0.5)   (3)
in addition, the present embodiment is not limited to the above equation (2) or (3), and the first weight w (t) of the classification result at the t-th time may be calculated according to other manners.
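A minimal sketch of the first fusion subunit 301, assuming the recency weight of formula (2); the function names are chosen here for illustration:

```python
import numpy as np

def first_weight(t: int, n: int) -> float:
    """First weight of formula (2): w(t) = 2^-(n - t + 1),
    so later classification results receive larger weights."""
    return 2.0 ** -(n - t + 1)

def fuse_weighted(results: np.ndarray) -> np.ndarray:
    """Weighted-summation fusion of equation (1).

    results: (M, Q) array; the rows hold the confidences Result(p, t) of
             the Q categories, ordered from time n-M+1 to time n.
    Returns RFresult(p, n) for every category p, shape (Q,).
    """
    n = results.shape[0]
    w = np.array([first_weight(t, n) for t in range(1, n + 1)])
    return w @ results  # RFresult(p, n) = sum over t of w(t) * Result(p, t)

# The category of the object is the one with the highest fused confidence.
fused = fuse_weighted(np.array([[0.2, 0.8],
                                [0.4, 0.6],
                                [0.7, 0.3]]))
print(fused.argmax())  # -> 0: the later, more confident results dominate
```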
Fig. 4 is another schematic diagram of the fusion unit 102 in this embodiment 1, and as shown in fig. 4, the fusion unit 102 may have a second fusion sub-unit 401, and the second fusion sub-unit 401 votes for the classification results at different time instants to obtain the fused classification result.
In this embodiment, the classification result at each time may include a voting result for each category based on a threshold, where the voting result may be an integer. For example, in the classification result Result_1(n) at the nth time, if the confidence of category p is not smaller than the threshold, the voting result Result_1(p, n) of that category takes the value 1; otherwise it takes the value 0.
In this embodiment, the second fusion subunit 401 may vote on the classification results of M times in the time period from the (n−M+1)th time to the nth time according to the following equation (4) to obtain the fused classification result:

RFresult(p, n) = Σ_{t=n−M+1}^{n} Result_1(p, t)   (4)

where RFresult(p, n) represents the sum of the voting results corresponding to category p in the fused classification result.
In this embodiment, the category of the object to be classified may be determined according to the sum of the voting results corresponding to each category in the fused classification results.
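A corresponding sketch of the second fusion subunit 401; the threshold value 0.5 is illustrative:

```python
import numpy as np

def fuse_by_voting(confidences: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Voting fusion of equation (4).

    confidences: (M, Q) array of per-category confidences at M times.
    A category earns one vote, Result_1(p, t) = 1, at each time where its
    confidence reaches the threshold; the fused result RFresult(p, n) is
    the per-category sum of the votes over the M times.
    """
    votes = (confidences >= threshold).astype(int)  # Result_1(p, t)
    return votes.sum(axis=0)                        # RFresult(p, n)

fused_votes = fuse_by_voting(np.array([[0.7, 0.2],
                                       [0.6, 0.9],
                                       [0.8, 0.1]]))
print(fused_votes, fused_votes.argmax())  # -> [3 1] 0
```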
Fig. 5 is another schematic diagram of the fusion unit 102 in this embodiment 1, and as shown in fig. 5, the fusion unit 102 may have a third fusion sub-unit 501, and the third fusion sub-unit 501 is capable of fusing the classification results at different time instants according to the difference between the classification results at different time instants to obtain the fused classification result.
Fig. 6 is a schematic diagram of the third fusion subunit 501 of the present embodiment 1, and as shown in fig. 6, the third fusion subunit 501 includes:
a first calculating unit 601, configured to calculate a difference between classification results at any two times among the classification results at different times;
a second calculating unit 602, configured to calculate an average difference between the classification result at each time and the classification results at other times according to the difference between the classification results at any two times and the second weight of the classification result at each time;
a fourth fusion subunit 603, configured to synthesize the classification results at each time according to the average difference between the classification result at each time and the classification results at other times, and fuse the synthesis results to obtain a fusion result of the classification results at different times.
In this example, the first calculation unit 601 calculates a degree of difference between the classification results at any two times among the classification results at different times, wherein the degree of difference can be characterized using an existing method, for example, the degree of difference can be characterized by calculating a Jousselme distance between any two classification results. In this embodiment, the Jousselme distance is taken as an example to characterize the difference between any two classification results.
In this embodiment, assuming that there are M classification results at M times, where M is an integer greater than or equal to 2, the classification results at the M times, sorted in time order, are m_1, m_2, …, m_M.
In the present example, the Jousselme distance (hereinafter referred to as the J distance) between the classification results at any two times can be calculated, for example, according to the following formulas (5) and (6):

d_BPA(i, j) = sqrt((‖m_i‖^2 + ‖m_j‖^2 − 2⟨m_i, m_j⟩) / 2)   (5)

⟨m_i, m_j⟩ = Σ_{p=1}^{Q} Σ_{q=1}^{Q} m_i(A_p) · m_j(A_q) · |A_p ∩ A_q| / |A_p ∪ A_q|   (6)

where d_BPA(i, j) represents the J distance between the classification result m_i at the ith time and the classification result m_j at the jth time, ‖m_i‖^2 = ⟨m_i, m_i⟩, and A_p and A_q respectively represent the pth and qth categories; i, j, p and q are positive integers, with i and j not greater than M and p and q not greater than Q.
In this embodiment, the second calculating unit 602 calculates the average degree of difference between the classification result at each time and the classification results at the other times according to the degree of difference between the classification results at any two times and the second weight of the classification result at each time. For example, a corrected value of the average J distance corresponding to the classification result at each time may be calculated according to the following formulas (7a) and (7b), and the corrected value may be used as the average degree of difference:

Sumo_m(i) = (1 / (M − 1)) · Σ_{j=1, j≠i}^{M} d_BPA(i, j)   (7a)

Sum_m(i) = Sumo_m(i) · w(i)   (7b)

where Sumo_m(i) represents the average J distance between the classification result at the ith time and the classification results at the other times, d_BPA(i, j) represents the J distance between the classification results at the ith and jth times, Sum_m(i) represents the corrected value of the average J distance for the classification result at the ith time, and w(i) represents the second weight of the classification result at the ith time.
In this embodiment, the classification result at a later time has a larger second weight than the classification result at an earlier time; the embodiment is not limited thereto, however, and the second weight may also be set in other manners.
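The first and second calculating units can be sketched as follows, assuming singleton categories so that the Jaccard factor of formula (6) reduces to the identity matrix; the helper names are illustrative:

```python
import numpy as np

def jousselme_distance(mi: np.ndarray, mj: np.ndarray) -> float:
    """J distance of formulas (5)-(6) between two classification results.
    For singleton categories A_p the factor |A_p ∩ A_q| / |A_p ∪ A_q| is 1
    when p == q and 0 otherwise, so <m_i, m_j> is a plain dot product."""
    diff = mi - mj
    return float(np.sqrt(0.5 * diff @ diff))

def average_differences(results: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Corrected average J distances Sum_m(i) of formulas (7a)-(7b).

    results: (M, Q) classification results m_1 .. m_M in time order.
    w:       the M second weights w(i), larger for later times.
    """
    m = len(results)
    sumo = np.array([
        sum(jousselme_distance(results[i], results[j])
            for j in range(m) if j != i) / (m - 1)   # Sumo_m(i), (7a)
        for i in range(m)
    ])
    return sumo * w                                  # Sum_m(i), (7b)
```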
In this embodiment, after obtaining the average degree of difference between the classification result at each time and the classification results at other times, the fourth fusion subunit 603 synthesizes the classification results at each time according to the average degree of difference between the classification result at each time and the classification results at other times, and fuses the synthesis results to obtain the fusion result of the classification results at different times.
For example, the fourth fusion subunit 603 synthesizes the classification result at each time with the classification result at the previous time of the classification result or the fusion results of the classification results at all times before the classification result based on the average degree of difference between the classification result at each time and the classification results at other times, and fuses the synthesis results to obtain the fusion result of the classification results at different times.
For example, for the classification results at the first two times, the fourth fusion sub-unit 603 synthesizes the classification result at the 1 st time and the classification result at the 2 nd time, for the classification result at the 3 rd time, the fourth fusion sub-unit 603 synthesizes the classification result at the 3 rd time and the fusion result of the classification results at the first two times, and so on for the classification results after the classification result at the 3 rd time.
In this embodiment, the fourth fusing subunit 603 may set a synthesis weight for synthesizing according to a comparison result between the average difference between the classification result at each time and the classification results at other times and a preset threshold, synthesize the classification results according to the synthesis weight, and fuse the synthesis results to obtain a fusion result of the classification results at different times. For example, when the average degree of difference between the classification result at a certain time and the classification results at other times is greater than the preset threshold, the synthesis weight used in the synthesis of the classification result at the certain time is set to be smaller.
In this embodiment, the preset threshold may be set according to actual needs, for example, the threshold is set to a value between 0 and 1, for example, the threshold is 0.5.
The configuration of the fourth fusion subunit 603 of the present embodiment and the method of synthesizing and fusing according to the average degree of difference are exemplarily described below.
Fig. 7 is a schematic diagram of the fourth fusion subunit 603 in embodiment 1 of the present invention. When the classification results at the first two of the M times, namely the classification result m_1 at the 1st time and the classification result m_2 at the 2nd time, need to be synthesized and fused, as shown in fig. 7, the fourth fusion subunit 603 includes:
a first setting unit 701, configured to calculate a 1 st composite weight of the classification result at the 1 st time according to the 1 st average difference when the 1 st average difference between the classification result at the 1 st time and the classification results at other times is greater than a preset threshold and a 2 nd average difference between the classification result at the 2 nd time and the classification results at other times is less than the preset threshold, and calculate a 2 nd composite weight of the classification result at the 2 nd time according to the 1 st composite weight, where the 1 st composite weight is less than the 2 nd composite weight;
a second setting unit 702, configured to, when the 1 st average difference between the classification result at the 1 st time and the classification results at other times is smaller than the preset threshold and the 2 nd average difference between the classification result at the 2 nd time and the classification results at other times is greater than the preset threshold, calculate a 2 nd synthetic weight of the classification result at the 2 nd time according to the 2 nd average difference, and calculate a 1 st synthetic weight of the classification result at the 1 st time according to the 2 nd synthetic weight, where the 2 nd synthetic weight is smaller than the 1 st synthetic weight;
a third setting unit 703, configured to set a 1 st composite weight of the classification result at the 1 st time and a 2 nd composite weight of the classification result at the 2 nd time to be equal to each other when the 1 st average difference between the classification result at the 1 st time and the classification result at the other times and the 2 nd average difference between the classification result at the 2 nd time and the classification result at the other times are both greater than or equal to the preset threshold or both less than the preset threshold;
a first synthesizing unit 704, configured to synthesize the 1 st classification result and the 2 nd classification result according to the 1 st synthesis weight and the 2 nd synthesis weight;
a fifth fusing subunit 705, configured to fuse the combination result of the classification result at the 1 st time and the classification result at the 2 nd time, and obtain a fused result of the classification result at the 1 st time and the classification result at the 2 nd time.
In this embodiment, when the 1st average difference between the classification result at the 1st time and the classification results at the other times is greater than the preset threshold and the 2nd average difference between the classification result at the 2nd time and the classification results at the other times is less than the preset threshold, the first setting unit 701 calculates the 1st synthesis weight of the classification result at the 1st time according to the 1st average difference, for example using formula (8), in which the weight decreases as the average difference grows; there, w_1 represents the 1st synthesis weight, Sum_m(1) represents the 1st average difference of the classification result at the 1st time, and α represents a synthesis weight adjustment factor, which can be set according to actual requirements, for example to a value of 2 to 5.

After calculating the 1st synthesis weight, the first setting unit 701 calculates the 2nd synthesis weight of the classification result at the 2nd time from the 1st synthesis weight, for example as w_2 = 1 − w_1.
In this embodiment, when the 1st average difference between the classification result at the 1st time and the classification results at the other times is smaller than the preset threshold and the 2nd average difference between the classification result at the 2nd time and the classification results at the other times is larger than the preset threshold, the second setting unit 702 calculates the 2nd synthesis weight of the classification result at the 2nd time according to the 2nd average difference, for example using a formula (9) similar to formula (8), where w_2 represents the 2nd synthesis weight, Sum_m(2) represents the 2nd average difference of the classification result at the 2nd time, and α represents the weight adjustment factor, taking the same value as before.

After calculating the 2nd synthesis weight, the second setting unit 702 calculates the 1st synthesis weight of the classification result at the 1st time from the 2nd synthesis weight, for example as w_1 = 1 − w_2.
In this embodiment, when the 1 st average difference between the classification result at the 1 st time and the classification result at the other times and the 2 nd average difference between the classification result at the 2 nd time and the classification result at the other times are both greater than or equal to the preset threshold or both less than the preset threshold, the third setting unit 703 sets the 1 st composite weight of the classification result at the 1 st time and the 2 nd composite weight of the classification result at the 2 nd time to be equal, for example, sets the 1 st composite weight and the 2 nd composite weight to be 0.5.
In this embodiment, the first synthesizing unit 704 synthesizes the classification result at the 1st time and the classification result at the 2nd time according to the calculated 1st and 2nd synthesis weights. For example, with the 1st synthesis weight w_1 and the 2nd synthesis weight w_2 set by the first setting unit 701, the synthesis can be performed according to the following equation (10):

m = w_1 · m_1 + w_2 · m_2   (10)

where m represents the synthesis result of the classification result at the 1st time and the classification result at the 2nd time, m_1 represents the classification result at the 1st time, m_2 represents the classification result at the 2nd time, w_1 represents the 1st synthesis weight, and w_2 represents the 2nd synthesis weight.

With the 1st and 2nd synthesis weights set by the second setting unit 702, the synthesis can likewise be performed according to the following equation (11):

m = w_1 · m_1 + w_2 · m_2   (11)

with m, m_1, m_2, w_1 and w_2 defined as above.

With the 1st and 2nd synthesis weights set by the third setting unit 703, the synthesis can be performed according to the following equation (12):

m = 0.5 · m_1 + 0.5 · m_2   (12)

where m represents the synthesis result of the classification result at the 1st time and the classification result at the 2nd time, m_1 represents the classification result at the 1st time, and m_2 represents the classification result at the 2nd time.
In the present embodiment, after obtaining the synthesis result of the classification result at the 1st time and the classification result at the 2nd time, the fifth fusion subunit 705 fuses this synthesis result to obtain the fusion result of the classification result at the 1st time and the classification result at the 2nd time, where the synthesis result can be fused with itself using an existing fusion method, for example according to the following equation (13):

m_{12} = m ⊕ m   (13)

where m_{12} represents the fusion result of the classification result at the 1st time and the classification result at the 2nd time, m represents the synthesis result of the classification result at the 1st time and the classification result at the 2nd time, and ⊕ denotes the fusion operation.
In this example, the fusion of the synthetic results can be performed by using an existing fusion method, and for example, the fusion can be performed based on the D-S (Dempster/Shafer) evidence theory.
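The two-time case can be put together as follows. Because formula (8) is not reproduced in the text above, the decreasing weight exp(−α·Sum_m) is used here as an assumed stand-in, and the self-fusion of formula (13) is written out as Dempster's rule for singleton categories:

```python
import numpy as np

def dempster_combine(m1: np.ndarray, m2: np.ndarray) -> np.ndarray:
    """Dempster's rule of combination for BBAs whose focal elements are the
    singleton categories: multiply agreeing masses and normalize away the
    conflict mass K."""
    products = m1 * m2             # mass assigned to agreeing categories
    agreement = products.sum()     # equals 1 - K
    if agreement == 0.0:
        raise ValueError("totally conflicting classification results")
    return products / agreement

def fuse_first_two(m1, m2, s1, s2, th_d=0.5, alpha=3.0):
    """Synthesis and fusion of the classification results at times 1 and 2.
    s1 and s2 are the average differences Sum_m(1) and Sum_m(2); the weight
    np.exp(-alpha * s) is an illustrative stand-in for formula (8)."""
    if s1 > th_d and s2 < th_d:    # first setting unit 701
        w1 = float(np.exp(-alpha * s1))   # large difference -> small weight
        w2 = 1.0 - w1
    elif s1 < th_d and s2 > th_d:  # second setting unit 702
        w2 = float(np.exp(-alpha * s2))
        w1 = 1.0 - w2
    else:                          # third setting unit 703
        w1 = w2 = 0.5
    m = w1 * m1 + w2 * m2          # synthesis, equations (10)-(12)
    return dempster_combine(m, m)  # self-fusion, equation (13)
```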
The above is an exemplary description of synthesizing and fusing the classification results at the first two moments, and when n ≧ 3, the fourth fusing subunit 603 may further include:
a fourth setting unit 706, configured to calculate an ith synthesis weight of the classification result at the ith time according to the ith average difference when the ith average difference between the classification result at the ith time and the classification results at other times is greater than the preset threshold, and calculate an i-1 synthesis weight of the fusion result of the classification results at i-1 times before the classification result at the ith time according to the ith synthesis weight, where the ith synthesis weight is less than the i-1 synthesis weight, i is an integer greater than or equal to 3, and i is not greater than n;
a fifth setting unit 707 for setting, when the ith average degree of difference between the classification result at the ith time and the classification results at other times is less than or equal to the preset threshold, an ith synthesis weight of the classification result at the ith time and an ith-1 synthesis weight of the fusion result of the classification results at i-1 times before the classification result at the ith time according to preset parameters;
In this embodiment, when the ith average degree of difference between the classification result at the ith time and the classification results at the other times is greater than the preset threshold, the fourth setting unit 706 calculates the ith synthesis weight of the classification result at the ith time according to the ith average degree of difference, for example using a formula (14) similar to formula (8) above, where w_i represents the ith synthesis weight of the classification result at the ith time, Sum_m(i) represents the ith average difference of the classification result at the ith time, and α represents the synthesis weight adjustment factor, taking the same value as before.

After calculating the ith synthesis weight, the fourth setting unit 706 calculates the (i−1)th synthesis weight of the fusion result of the classification results at the i−1 times before the ith time according to the ith synthesis weight, for example as w_{i−1} = 1 − w_i.
In this embodiment, when the ith average degree of difference between the classification result at the ith time and the classification results at the other times is less than or equal to the preset threshold, the fifth setting unit 707 sets the ith synthesis weight of the classification result at the ith time and the (i−1)th synthesis weight of the fusion result of the classification results at the i−1 times before it according to a preset parameter: for example, if the ith synthesis weight is β, the (i−1)th synthesis weight is (1 − β), where β is a number between 0 and 1 that can be set according to actual needs.
The first synthesis unit 704 is further configured to synthesize the classification result at the ith time and the fusion result of the classification results at the i−1 times before it according to the ith and (i−1)th synthesis weights. For example, with the weights set by the fourth setting unit 706, the synthesis can be performed according to the following equation (15):

m = w_i · m_i + w_{i−1} · m_f(i−1)   (15)

where m represents the synthesis result of the classification result at the ith time and the fusion result of the classification results at the i−1 times before it, m_i represents the classification result at the ith time, m_f(i−1) represents the fusion result of the classification results at the i−1 times before the ith time, w_i represents the ith synthesis weight, and w_{i−1} represents the (i−1)th synthesis weight.

For example, when i = 3, m represents the synthesis result of the classification result m_3 at the 3rd time and the fusion result of the classification results at the 1st and 2nd times.

With the weights set by the fifth setting unit 707, the synthesis can be performed according to the following equation (16):

m = β · m_i + (1 − β) · m_f(i−1)   (16)

where m, m_i and m_f(i−1) are defined as above, β represents the ith synthesis weight, and (1 − β) represents the (i−1)th synthesis weight.
In this embodiment, after the synthesis result of the classification result at the ith time and the fusion result of the classification results at the i−1 times before it is obtained, the fifth fusion subunit 705 is further configured to fuse this synthesis result to obtain the fusion result of the classification results at the i times. The fusion method is the same as described above and is not repeated here.
In this embodiment, when M is greater than or equal to 3, the fourth fusion subunit 603 performs the above iterative calculation, and finally obtains a fusion result of the classification results at M times.
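Continuing the sketch above for M ≥ 3 (formulas (14)-(16)); β is the preset parameter of the fifth setting unit 707, and exp(−α·Sum_m) again stands in for the unreproduced formula (14):

```python
def fuse_sequence(results, sums, th_d=0.5, alpha=3.0, beta=0.5):
    """Iterative fusion of M >= 2 classification results, reusing
    fuse_first_two() and dempster_combine() from the sketch above.

    results: (M, Q) classification results in time order.
    sums:    the M average differences Sum_m(i).
    """
    fused = fuse_first_two(results[0], results[1], sums[0], sums[1],
                           th_d, alpha)
    for i in range(2, len(results)):          # the 3rd time onwards
        if sums[i] > th_d:                    # fourth setting unit 706
            wi = float(np.exp(-alpha * sums[i]))
            wprev = 1.0 - wi
        else:                                 # fifth setting unit 707
            wi, wprev = beta, 1.0 - beta
        m = wi * results[i] + wprev * fused   # synthesis, (15)-(16)
        fused = dempster_combine(m, m)        # fusion of the synthesis result
    return fused
```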
In the present embodiment, it can be seen from the above embodiments that the classification results at different times within the predetermined time period are fused, thereby improving the accuracy of the classification results.
Embodiment 2
Embodiment 2 of the present application provides an electronic device, including the fusion apparatus for classification results according to embodiment 1.
Fig. 8 is a schematic view of a configuration of an electronic device according to embodiment 2 of the present application. As shown in fig. 8, the electronic device 800 may include: a Central Processing Unit (CPU)801 and a memory 802; the memory 802 is coupled to the central processor 801. Wherein the memory 802 can store various data; a program for performing fusion of the classification results is also stored, and is executed under the control of the central processor 801.
In one embodiment, the functions in the fusion device of the classification results may be integrated into the central processor 801.
Among other things, the central processor 801 may be configured to:
obtaining classification results at different moments in a preset time period; and
and fusing the classification results at different moments to obtain a fused classification result.
Wherein, the central processor 801 may be further configured to:
and according to the first weight of the classification result at each moment, carrying out weighted summation on the classification results at different moments to obtain the fused classification result.
Wherein, the central processor 801 may be further configured to:
and voting the classification results at different moments to obtain the fused classification result.
Wherein, the central processor 801 may be further configured to:
and fusing the classification results at different moments according to the difference between the classification results at different moments to obtain the fused classification result.
Wherein, the central processor 801 may be further configured to:
calculating the difference between any two classification results in the classification results at different moments;
calculating the average difference between the classification result at each moment and the classification results at other moments according to the difference between any two classification results and the second weight of the classification result at each moment; and
and synthesizing the classification results at each moment according to the average difference between the classification result at each moment and the classification results at other moments, and fusing the synthesized results to obtain the fused classification results.
Wherein, the central processor 801 may be further configured to:
and setting a synthesis weight for synthesis according to the comparison result of the average difference between the classification result at each moment and the classification results at other moments and a preset threshold, synthesizing each classification result according to the synthesis weight, and fusing the synthesis results to obtain the fused classification result.
Wherein, the central processor 801 may be further configured to:
synthesizing the classification results at each moment according to the average difference between the classification result at each moment and the classification results at other moments, and fusing the synthesized results comprises:
when the 1st average difference between the classification result at the 1st moment and the classification results at other moments is greater than a preset threshold and the 2nd average difference between the classification result at the 2nd moment and the classification results at other moments is less than the preset threshold, calculating a 1st composite weight of the classification result at the 1st moment according to the 1st average difference, and calculating a 2nd composite weight of the classification result at the 2nd moment according to the 1st composite weight, wherein the 1st composite weight is less than the 2nd composite weight;
when the 1 st average difference between the classification result at the 1 st moment and the classification results at other moments is smaller than the preset threshold and the 2 nd average difference between the classification result at the 2 nd moment and the classification results at other moments is larger than the preset threshold, calculating a 2 nd synthetic weight of the classification result at the 2 nd moment according to the 2 nd average difference, and calculating a 1 st synthetic weight of the classification result at the 1 st moment according to the 2 nd synthetic weight, wherein the 2 nd synthetic weight is smaller than the 1 st synthetic weight;
when the 1 st average difference between the classification result at the 1 st moment and the classification results at other moments and the 2 nd average difference between the classification result at the 2 nd moment and the classification results at other moments are both greater than or equal to the preset threshold or both less than the preset threshold, setting the 1 st synthetic weight of the classification result at the 1 st moment and the 2 nd synthetic weight of the classification result at the 2 nd moment to be equal;
synthesizing the classification result at the 1 st moment and the classification result at the 2 nd moment according to the 1 st synthesis weight and the 2 nd synthesis weight;
and fusing the composite result of the classification result at the 1 st moment and the classification result at the 2 nd moment to obtain a fused result of the classification result at the 1 st moment and the classification result at the 2 nd moment.
Wherein, the central processor 801 may be further configured to:
synthesizing the classification results at each moment according to the average difference between the classification result at each moment and the classification results at other moments, and fusing the synthesized results further comprises:
when the ith average difference between the classification result at the ith moment and the classification results at other moments is larger than the preset threshold, calculating an ith synthesis weight of the classification result at the ith moment according to the ith average difference, and calculating an ith-1 synthesis weight of a fusion result of the classification results at i-1 moments before the classification result at the ith moment according to the ith synthesis weight, wherein the ith synthesis weight is smaller than the ith-1 synthesis weight, i is an integer larger than or equal to 3, and i is not more than n;
when the ith average difference between the classification result at the ith moment and the classification results at other moments is less than or equal to the preset threshold, setting the ith synthesis weight of the classification result at the ith moment and the ith-1 synthesis weight of the fusion result of the classification results at i-1 moments before the classification result at the ith moment according to preset parameters;
synthesizing the classification result at the ith moment and the fusion result of i-1 classification results before the classification result at the ith moment according to the ith synthesis weight and the ith-1 synthesis weight;
and fusing the composite result of the classification result at the ith moment and the fusion result of the classification results at i-1 moments before the classification result at the ith moment to obtain the fusion result of the classification results at the i moments.
Wherein, the central processor 801 may be further configured to:
and when the ith average difference between the classification result at the ith moment and the classification results at other moments is less than or equal to the preset threshold, setting the preset parameter to ensure that the ith synthesis weight is equal to the (i-1) synthesis weight.
Further, as shown in fig. 8, the electronic device 800 may further include: an input/output unit 803 and a display unit 804; the functions of these components are similar to those of the prior art and are not described in detail here. It is noted that the electronic device 800 does not necessarily include all of the components shown in fig. 8; furthermore, the electronic device 800 may also comprise components not shown in fig. 8, for which reference may be made to the prior art.
Embodiment 3
Embodiment 3 of the invention also provides a fusion method of classification results, which corresponds to the fusion device of classification results of embodiment 1. Fig. 9 is a flowchart of the classification result fusion method according to embodiment 3 of the present invention. As shown in fig. 9, the method includes:
step 901: obtaining classification results at different moments in a preset time period; and
step 902: and fusing the classification results at different moments to obtain a fused classification result.
In this embodiment, for specific descriptions of each step in the method, reference may be made to the description of the relevant unit in embodiment 1, and details are not described here again.
In the present embodiment, it can be seen from the above embodiments that the classification results at different times within the predetermined time period are fused, thereby improving the accuracy of the classification results.
Embodiment 4
Embodiment 4 of the invention further provides another fusion method of classification results, which corresponds to the fusion device of classification results of embodiment 1. Fig. 10 is a flowchart of the classification result fusion method according to embodiment 4 of the present invention. As shown in fig. 10, the method includes:
step 1001: calculating the difference between the classification results at any two moments in the classification results at M moments, wherein M is an integer greater than or equal to 2;
step 1002: calculating the average difference between the classification result at each moment and the classification results at other moments according to the difference between the classification results at any two moments;
step 1003: judging whether the 1st average difference Sum_m(1) of the classification result m_1 at the 1st moment among the classification results at the M moments and the 2nd average difference Sum_m(2) of the classification result m_2 at the 2nd moment satisfy the following condition: Sum_m(1) > th_d and Sum_m(2) < th_d, where th_d is a preset threshold; when the judgment result is "yes", proceeding to step 1004, and when the judgment result is "no", proceeding to step 1005;
step 1004: calculating a 1 st synthesis weight of the classification result at the 1 st moment according to the 1 st average difference, and calculating a 2 nd synthesis weight of the classification result at the 2 nd moment according to the 1 st synthesis weight;
step 1005: judging whether the 1st average difference Sum_m(1) of the classification result m_1 at the 1st moment and the 2nd average difference Sum_m(2) of the classification result m_2 at the 2nd moment satisfy the following condition: Sum_m(1) < th_d and Sum_m(2) > th_d; when the judgment result is "yes", proceeding to step 1006, and when the judgment result is "no", proceeding to step 1007;
step 1006: calculating a 2 nd synthesis weight of the classification result at the 2 nd moment according to the 2 nd average difference, and calculating a 1 st synthesis weight of the classification result at the 1 st moment according to the 2 nd synthesis weight;
step 1007: setting the 1 st synthesis weight of the classification result at the 1 st moment to be equal to the 2 nd synthesis weight of the classification result at the 2 nd moment;
step 1008: synthesizing the classification result at the 1 st moment and the classification result at the 2 nd moment according to the 1 st synthesis weight and the 2 nd synthesis weight;
step 1009: fusing the composite results of the classification results at the 1 st moment and the classification results at the 2 nd moment to obtain a fused result of the classification results at the 1 st moment and the classification results at the 2 nd moment;
step 1010: judging whether M is greater than or equal to 3, entering step 1011 when the judgment result is yes, and ending the process when the judgment result is no;
step 1011: judging whether the ith average difference Sum_m(i) of the classification result m_i at the ith moment is greater than the preset threshold th_d, where i is an integer greater than or equal to 3 and not greater than M; when the judgment result is "yes", proceeding to step 1012, and when the judgment result is "no", proceeding to step 1013;
step 1012: calculating an ith synthesis weight of the classification result at the ith moment according to the ith average difference, and calculating an ith-1 synthesis weight of the fusion result of the classification results at i-1 moments before the classification result at the ith moment according to the ith synthesis weight;
step 1013: setting an ith synthesis weight of the classification result at the ith moment and an ith-1 synthesis weight of the fusion result of the classification results at i-1 moments before the classification result at the ith moment according to preset parameters;
step 1014: synthesizing the classification result at the ith moment and the fusion result of the classification results at i-1 moments before the classification result at the ith moment according to the ith synthesis weight and the ith-1 synthesis weight;
step 1015: fusing the synthesis result of the classification result at the ith moment and the fusion result of the classification results at the i−1 moments before it, to obtain the fusion result of the classification results at the i moments;
step 1016: judging whether i is smaller than M, entering a step 1017 when the judgment result is yes, and ending the process when the judgment result is no;
step 1017: adding 1 to i, and returning to step 1011.
In this embodiment, the calculation and setting methods used in the above steps are the same as those described in embodiment 1, and are not described herein again.
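As an end-to-end illustration of the flow of fig. 10, the helpers sketched in embodiment 1 can be chained as follows; the random inputs and the particular choice of second weights are placeholders:

```python
import numpy as np

M, Q = 4, 3
rng = np.random.default_rng(0)
results = rng.dirichlet(np.ones(Q), size=M)          # m_1 .. m_M

# Second weights, larger for later times (one possible choice).
w = np.array([2.0 ** -(M - i) for i in range(1, M + 1)])

sums = average_differences(results, w)               # steps 1001-1002
fused = fuse_sequence(results, sums)                 # steps 1003-1017
print("fused result:", fused, "category:", fused.argmax())
```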
In the present embodiment, it can be seen from the above embodiments that the classification results at different times within the predetermined time period are fused, thereby improving the accuracy of the classification results.
An embodiment of the present invention further provides a computer-readable program, where when the program is executed in a fusion device or an electronic device for classification results, the program causes a computer to execute the fusion method for classification results described in embodiment 3 or embodiment 4 in the fusion device or the electronic device for classification results.
An embodiment of the present invention further provides a storage medium storing a computer-readable program, where the computer-readable program enables a computer to execute the fusion method of classification results described in embodiment 3 or embodiment 4 in a fusion device or an electronic device of classification results.
The fusion means of classification results described in connection with the embodiments of the invention may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. For example, one or more of the functional block diagrams and/or one or more combinations of the functional block diagrams illustrated in fig. 1, 3-7 may correspond to individual software modules of a computer program flow or individual hardware modules. These software modules may correspond to the respective steps shown in embodiment 3. These hardware modules may be implemented, for example, by solidifying these software modules using a Field Programmable Gate Array (FPGA).
A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. A storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium; or the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The software module may be stored in the memory of the mobile terminal or in a memory card that is insertable into the mobile terminal. For example, if the apparatus (e.g., mobile terminal) employs a relatively large capacity MEGA-SIM card or a large capacity flash memory device, the software module may be stored in the MEGA-SIM card or the large capacity flash memory device.
One or more of the functional block diagrams and/or one or more combinations of the functional block diagrams described with respect to fig. 1, 3-7 may be implemented as a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof designed to perform the functions described herein. One or more of the functional block diagrams and/or one or more combinations of the functional block diagrams described with respect to fig. 1, 3-7 may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in communication with a DSP, or any other such configuration.
The present application has been described in conjunction with specific embodiments, but it should be understood by those skilled in the art that these descriptions are intended to be illustrative, and not limiting. Various modifications and adaptations of the present application may occur to those skilled in the art based on the teachings herein and are within the scope of the present application.

Claims (20)

1. A fusion device of classification results, the device comprising:
    the device comprises an acquisition unit, a classification unit and a classification unit, wherein the acquisition unit is used for acquiring classification results at different moments in a preset time period; and
    and the fusion unit is used for fusing the classification results at different moments to obtain a fused classification result.
  2. The fusion device of claim 1, wherein the fusion unit comprises:
    and the first fusion subunit is used for carrying out weighted summation on the classification results at different moments according to the first weight of the classification result at each moment so as to obtain the fused classification result.
  3. The fusion device of claim 2,
    the classification result at a later time in time has a larger first weight than the classification result at a previous time in time.
  4. The fusion device of claim 1, wherein the fusion unit comprises:
    and the second fusion subunit votes the classification results at different moments to obtain the fused classification result.
  5. The fusion device of claim 1, wherein the fusion unit comprises:
    and the third fusion subunit is used for fusing the classification results at different moments according to the difference between the classification results at different moments to obtain the fused classification result.
  6. The fusion device of claim 5, wherein the third fusion subunit comprises:
    the first calculating unit is used for calculating the difference degree between any two classification results in the classification results at different moments;
    the second calculation unit is used for calculating the average difference between the classification result at each moment and the classification results at other moments according to the difference between any two classification results and the second weight of the classification result at each moment; and
    and the fourth fusion subunit is used for synthesizing the classification results at each moment according to the average difference degree between the classification result at each moment and the classification results at other moments, and fusing the synthesis results to obtain the fused classification result.
  7. The fusion device of claim 6,
    the classification result at a later time in time has a second weight greater than the classification result at a previous time in time.
  8. The apparatus of claim 6, wherein,
    the fourth fusion subunit is configured to set a synthesis weight for synthesis according to a comparison result between the average difference between the classification result at each time and the classification results at other times and a preset threshold, synthesize each classification result according to the synthesis weight, and fuse the synthesis results to obtain the fused classification result.
  9. The apparatus of claim 6, wherein the classification results at different time instants comprise a classification result at a time instant 1 and a classification result at a time instant 2,
    the fourth fusion subunit includes:
a first setting unit, configured to, when a 1st average difference between the classification result at the 1st time and the classification results at other times is greater than a preset threshold and a 2nd average difference between the classification result at the 2nd time and the classification results at other times is less than the preset threshold, calculate a 1st composite weight of the classification result at the 1st time according to the 1st average difference, and calculate a 2nd composite weight of the classification result at the 2nd time according to the 1st composite weight, wherein the 1st composite weight is less than the 2nd composite weight;
    a second setting unit, configured to, when a 1 st average difference between the classification result at the 1 st time and the classification results at other times is smaller than the preset threshold and a 2 nd average difference between the classification result at the 2 nd time and the classification results at other times is greater than the preset threshold, calculate a 2 nd synthetic weight of the classification result at the 2 nd time according to the 2 nd average difference, and calculate a 1 st synthetic weight of the classification result at the 1 st time according to the 2 nd synthetic weight, where the 2 nd synthetic weight is smaller than the 1 st synthetic weight;
    a third setting unit, configured to set a 1 st composite weight of the classification result at the 1 st moment and a 2 nd composite weight of the classification result at the 2 nd moment to be equal to each other when the 1 st average difference between the classification result at the 1 st moment and the classification results at other moments and the 2 nd average difference between the classification result at the 2 nd moment and the other classification results are both greater than or equal to or less than the preset threshold;
    a first synthesizing unit, configured to synthesize the classification result at the 1 st time and the classification result at the 2 nd time according to the 1 st synthesis weight and the 2 nd synthesis weight;
    and a fifth fusion subunit, configured to fuse the combined result of the classification result at the 1 st time and the classification result at the 2 nd time, and obtain a fusion result of the classification result at the 1 st time and the classification result at the 2 nd time.
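  The three cases of claim 9 fix only how the two synthesis weights are ordered, not how a weight is derived from an average difference. The mapping w = 1/(2 + d) below is an assumed choice: it stays below 0.5 for d > 0, so the outlier result is guaranteed the smaller weight. The claim's separate synthesis and fusion steps are folded into a single weighted combination for brevity.

    # Sketch of claim 9's weight-setting cases (illustrative; the formula
    # w = 1 / (2 + avg_diff) is an assumption, chosen so a larger average
    # difference yields a smaller weight and the claimed ordering holds).
    def synthesis_weights_two(avg1, avg2, threshold):
        if avg1 > threshold and avg2 < threshold:      # 1st result is the outlier
            w1 = 1.0 / (2.0 + avg1)                    # from the 1st average difference
            w2 = 1.0 - w1                              # from the 1st synthesis weight
        elif avg1 < threshold and avg2 > threshold:    # 2nd result is the outlier
            w2 = 1.0 / (2.0 + avg2)
            w1 = 1.0 - w2
        else:                                          # both at-or-above, or both below
            w1 = w2 = 0.5
        return w1, w2

    def fuse_two(r1, r2, w1, w2):
        """Weighted combination standing in for the synthesize-then-fuse steps."""
        return [w1 * a + w2 * b for a, b in zip(r1, r2)]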
  10. The apparatus of claim 9, wherein the classification results at the different moments include classification results at n moments, including the classification result at the 1st moment and the classification result at the 2nd moment, n being an integer greater than or equal to 3,
    the fifth fusion subunit further comprises:
    a fourth setting unit, configured to, when an ith average difference between the classification result at an ith moment and the classification results at the other moments is greater than the preset threshold, calculate an ith synthesis weight of the classification result at the ith moment according to the ith average difference, and calculate an (i-1)th synthesis weight of a fusion result of the classification results at the i-1 moments before the ith moment according to the ith synthesis weight, wherein the ith synthesis weight is less than the (i-1)th synthesis weight, i is an integer greater than or equal to 3, and i is less than or equal to n;
    a fifth setting unit, configured to, when the ith average difference between the classification result at the ith moment and the classification results at the other moments is less than or equal to the preset threshold, set the ith synthesis weight of the classification result at the ith moment and the (i-1)th synthesis weight of the fusion result of the classification results at the i-1 moments before the ith moment according to a preset parameter;
    the first synthesis unit is further configured to synthesize, according to the ith synthesis weight and the (i-1)th synthesis weight, the classification result at the ith moment and the fusion result of the classification results at the i-1 moments before the ith moment; and
    the fifth fusion subunit is further configured to fuse the synthesis result of the classification result at the ith moment and the fusion result of the classification results at the i-1 moments before the ith moment, so as to obtain a fusion result of the classification results at i moments.
  11. The apparatus of claim 10, wherein,
    the fifth setting unit is configured to set the preset parameter such that the ith synthesis weight is equal to the (i-1)th synthesis weight when the ith average difference between the classification result at the ith moment and the classification results at the other moments is less than or equal to the preset threshold.
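  Claims 10 and 11 fold the 3rd to nth results into the running fusion one at a time. The recurrence below is a sketch under the same assumed weight formula, with the preset parameter of claim 11 taken to make both weights 0.5 in the non-outlier case; the seed uses equal weights for the first two results for brevity (claim 9's case analysis, sketched above, would set them).

    # Sketch of claims 10-11 (illustrative): fold each later result into the
    # running fusion. An outlier (average difference above the threshold)
    # receives the smaller weight; otherwise the assumed preset parameter
    # makes the ith and (i-1)th synthesis weights equal, per claim 11.
    import numpy as np

    def fuse_sequence(results, avg_diffs, threshold):
        scores = np.asarray(results, dtype=float)
        fused = 0.5 * (scores[0] + scores[1])          # seed: the 2-result fusion
        for i in range(2, scores.shape[0]):            # 3rd moment onwards
            if avg_diffs[i] > threshold:
                w_i = 1.0 / (2.0 + avg_diffs[i])       # ith weight < (i-1)th weight
            else:
                w_i = 0.5                              # equal weights (claim 11)
            fused = w_i * scores[i] + (1.0 - w_i) * fused
        return fused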
  12. An electronic device comprising the fusion device for classification results of any one of claims 1 to 11.
  13. A method for fusing classification results, the method comprising:
    obtaining classification results at different moments in a preset time period; and
    fusing the classification results at the different moments to obtain a fused classification result.
  14. The fusion method of claim 13, wherein fusing the classification results at the different moments comprises:
    performing weighted summation on the classification results at the different moments according to a first weight of the classification result at each moment, to obtain the fused classification result; or
    voting on the classification results at the different moments to obtain the fused classification result.
  15. The fusion method of claim 13, wherein fusing the classification results at the different moments comprises:
    fusing the classification results at the different moments according to the degrees of difference between the classification results at the different moments, to obtain the fused classification result.
  16. The fusion method of claim 15, wherein fusing the classification results at the different moments according to the degrees of difference between them comprises:
    calculating the degree of difference between any two of the classification results at the different moments;
    calculating the average difference between the classification result at each moment and the classification results at the other moments, according to the degree of difference between any two classification results and a second weight of the classification result at each moment; and
    synthesizing the classification results at the respective moments according to the average difference between the classification result at each moment and the classification results at the other moments, and fusing the synthesis results to obtain the fused classification result.
  17. The method of claim 16, wherein synthesizing the classification results at each moment according to the average difference between the classification result at each moment and the classification results at the other moments, and fusing the synthesis results comprises:
    setting synthesis weights for synthesis according to a comparison of the average difference between the classification result at each moment and the classification results at the other moments with a preset threshold, synthesizing the classification results according to the synthesis weights, and fusing the synthesis results to obtain the fused classification result.
  18. The method of claim 16, wherein the classification results at the different moments include a classification result at a 1st moment and a classification result at a 2nd moment, and
    synthesizing the classification results at each moment according to the average difference between the classification result at each moment and the classification results at the other moments, and fusing the synthesis results comprises:
    when a 1st average difference between the classification result at the 1st moment and the classification results at the other moments is greater than a preset threshold and a 2nd average difference between the classification result at the 2nd moment and the classification results at the other moments is less than the preset threshold, calculating a 1st synthesis weight of the classification result at the 1st moment according to the 1st average difference, and calculating a 2nd synthesis weight of the classification result at the 2nd moment according to the 1st synthesis weight, wherein the 1st synthesis weight is less than the 2nd synthesis weight;
    when the 1st average difference between the classification result at the 1st moment and the classification results at the other moments is less than the preset threshold and the 2nd average difference between the classification result at the 2nd moment and the classification results at the other moments is greater than the preset threshold, calculating the 2nd synthesis weight of the classification result at the 2nd moment according to the 2nd average difference, and calculating the 1st synthesis weight of the classification result at the 1st moment according to the 2nd synthesis weight, wherein the 2nd synthesis weight is less than the 1st synthesis weight;
    when the 1st average difference between the classification result at the 1st moment and the classification results at the other moments and the 2nd average difference between the classification result at the 2nd moment and the classification results at the other moments are both greater than or equal to the preset threshold or are both less than the preset threshold, setting the 1st synthesis weight of the classification result at the 1st moment and the 2nd synthesis weight of the classification result at the 2nd moment to be equal;
    synthesizing the classification result at the 1st moment and the classification result at the 2nd moment according to the 1st synthesis weight and the 2nd synthesis weight; and
    fusing the synthesis result of the classification result at the 1st moment and the classification result at the 2nd moment to obtain a fusion result of the classification result at the 1st moment and the classification result at the 2nd moment.
  19. The method of claim 18, wherein the classification results at the different moments include classification results at n moments, including the classification result at the 1st moment and the classification result at the 2nd moment, n being an integer greater than or equal to 3, and
    synthesizing the classification results at each moment according to the average difference between the classification result at each moment and the classification results at the other moments, and fusing the synthesis results further comprises:
    when an ith average difference between the classification result at an ith moment and the classification results at the other moments is greater than the preset threshold, calculating an ith synthesis weight of the classification result at the ith moment according to the ith average difference, and calculating an (i-1)th synthesis weight of a fusion result of the classification results at the i-1 moments before the ith moment according to the ith synthesis weight, wherein the ith synthesis weight is less than the (i-1)th synthesis weight, i is an integer greater than or equal to 3, and i is not more than n;
    when the ith average difference between the classification result at the ith moment and the classification results at the other moments is less than or equal to the preset threshold, setting the ith synthesis weight of the classification result at the ith moment and the (i-1)th synthesis weight of the fusion result of the classification results at the i-1 moments before the ith moment according to a preset parameter;
    synthesizing, according to the ith synthesis weight and the (i-1)th synthesis weight, the classification result at the ith moment and the fusion result of the classification results at the i-1 moments before the ith moment; and
    fusing the synthesis result of the classification result at the ith moment and the fusion result of the classification results at the i-1 moments before the ith moment, to obtain a fusion result of the classification results at i moments.
  20. The method of claim 19, wherein,
    when the ith average difference between the classification result at the ith moment and the classification results at the other moments is less than or equal to the preset threshold, the preset parameter is set such that the ith synthesis weight is equal to the (i-1)th synthesis weight.
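  Pulling the method claims together, a hypothetical end-to-end run over three moments could reuse the average_differences and fuse_sequence sketches above; the threshold of 0.5 is an arbitrary example value, not one taken from the patent.

    # Hypothetical end-to-end use of the sketches above (three moments of
    # per-class scores; the threshold value is an arbitrary example).
    scores = [[0.6, 0.3, 0.1],                         # moment 1
              [0.5, 0.4, 0.1],                         # moment 2
              [0.2, 0.7, 0.1]]                         # moment 3
    avg = average_differences(scores)                  # per-moment average differences
    fused = fuse_sequence(scores, avg, 0.5)            # threshold-gated fusion
    print(fused, int(fused.argmax()))                  # fused scores, winning class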
CN201680087583.3A 2016-09-30 2016-09-30 Fusion method, device and the electronic equipment of classification results Pending CN109478228A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/101199 WO2018058571A1 (en) 2016-09-30 2016-09-30 Method, apparatus, and electronic device for integrating classification results

Publications (1)

Publication Number Publication Date
CN109478228A (en) 2019-03-15

Family

ID=61763593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680087583.3A Pending CN109478228A (en) 2016-09-30 2016-09-30 Fusion method, device and the electronic equipment of classification results

Country Status (2)

Country Link
CN (1) CN109478228A (en)
WO (1) WO2018058571A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2402470B (en) * 2003-04-30 2005-11-30 Image Metrics Plc A method of and apparatus for classifying images
CN100595782C (en) * 2008-04-17 2010-03-24 中国科学院地理科学与资源研究所 Classification method for syncretizing optical spectrum information and multi-point simulation space information
CN101814147B (en) * 2010-04-12 2012-04-25 中国科学院自动化研究所 Method for realizing classification of scene images

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750824A (en) * 2012-06-19 2012-10-24 银江股份有限公司 Urban road traffic condition detection method based on voting of network sorter
CN103679190A (en) * 2012-09-20 2014-03-26 富士通株式会社 Classification device, classification method and electronic equipment
US20140307946A1 (en) * 2013-04-12 2014-10-16 Hitachi High-Technologies Corporation Observation device and observation method
CN103810482A (en) * 2014-03-12 2014-05-21 中国矿业大学(北京) Multi-information fusion classification and identification method
CN104134076A (en) * 2014-07-10 2014-11-05 杭州电子科技大学 SAR image target recognition method based on CS and SVM decision fusion
CN104215935A (en) * 2014-08-12 2014-12-17 电子科技大学 Weighted decision fusion based radar cannonball target recognition method
CN105787430A (en) * 2016-01-12 2016-07-20 南通航运职业技术学院 Method for identifying second level human face with weighted collaborative representation and linear representation classification combined

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115083439A (en) * 2022-06-10 2022-09-20 北京中电慧声科技有限公司 Vehicle whistling sound identification method, system, terminal and storage medium

Also Published As

Publication number Publication date
WO2018058571A1 (en) 2018-04-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190315