CN104091038A - Method for weighting multiple example studying features based on master space classifying criterion - Google Patents


Info

Publication number
CN104091038A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310110952.4A
Other languages
Chinese (zh)
Inventor
柴晶
陈宏涛
黄丽霞
孙颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taiyuan University of Technology
Original Assignee
Taiyuan University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taiyuan University of Technology filed Critical Taiyuan University of Technology
Priority to CN201310110952.4A priority Critical patent/CN104091038A/en
Publication of CN104091038A publication Critical patent/CN104091038A/en
Pending legal-status Critical Current


Abstract

The invention discloses a multi-instance learning feature weighting method based on a large-margin classification criterion. The method is realized in three steps: initializing the representative instances of the positive and negative bags, constructing the optimization problem, and updating the three classes of unknown variables of the optimization problem. A heuristic search finds, in each positive bag, a representative instance that correctly reflects the bag's class label, which resolves the ambiguity of the instance labels inside positive bags; iterating with a coordinate ascent method makes the optimization problem converge to a local optimum; and each feature is assigned a relative weight according to the size of its contribution to recognition, so that recognition on the feature-weighted data achieves higher accuracy than recognition on the raw data.

Description

Multi-instance learning feature weighting method based on a large-margin classification criterion
Technical field
The present invention relates to a method for feature weighting of multi-instance learning data based on a large-margin classification criterion, specifically a data preprocessing method that assigns higher weights to discriminative features and lower weights to noisy and redundant features. The method automatically weights each feature according to its contribution to recognition, and then performs recognition on the weighted data, thereby improving the recognition accuracy of multi-instance learning.
Background technology
Multi-instance learning is an important branch of artificial intelligence. The sample it handles is not a single instance but a bag, i.e. a set of instances; only the class label of the bag is known, while the labels of the instances inside the bag are unknown. A bag containing at least one positive instance is labelled positive; otherwise it is labelled negative. Feature weighting is a key technique in artificial intelligence: some criterion assesses the correlation between each feature and the learning task, and each feature receives a weight that measures the relative strength of that correlation. The large-margin classification criterion is an important algorithm-design principle in artificial intelligence that improves the separability of samples from different classes by maximizing the classification margin between them; at present it is used mainly in the design of supervised learning algorithms.
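The weighting idea described above can be made concrete with a small sketch (the function name and toy data are illustrative, not from the patent): each coordinate difference is scaled by its per-feature weight before the distance is computed, so a noisy feature with a small weight barely influences the result.

```python
def weighted_distance(x, y, w):
    """Squared Euclidean distance after per-feature weighting:
    each coordinate difference is scaled by its feature weight."""
    return sum((wi * (xi - yi)) ** 2 for xi, yi, wi in zip(x, y, w))

# Feature 0 is discriminative and gets a high weight; feature 1 is
# noisy and gets a low weight, so its large difference is suppressed.
w = [1.0, 0.1]
print(weighted_distance([0.0, 5.0], [1.0, -5.0], w))  # 1.0 + 1.0 = 2.0
```

With equal weights the noisy feature would dominate this distance; the weighting keeps the discriminative feature in control.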
Scholars at home and abroad have carried out research on the design of multi-instance learning algorithms, but the feature weighting problem in multi-instance learning has not yet been addressed. Closely related to it are the multi-instance feature extraction and feature selection problems: the former projects the multi-instance data into some feature space via a spatial transformation and learns in that space, while the latter selects only a subset of the original features for multi-instance learning. Progress at home and abroad on these two problems is as follows:
Yunyin Sun et al. proposed a multi-instance feature extraction method named MIDR. It seeks a feature space in which the posterior probabilities of positive and negative bags being labelled positive equal 1 and 0 respectively, and iterates over this feature space by gradient descent; it usually converges only to a local optimum and cannot reach the global one. Wei Ping et al. proposed a multi-instance feature extraction method named MidLABS, which builds between-class and within-class scatter matrices from node vectors and edge vectors and computes the projection matrix by maximizing a trace criterion on the between-class and within-class scatter matrices. Its weakness is that when the positive instances in a positive bag are far fewer than the negative instances in that bag, the samples easily become unbalanced and the learning performance suffers. Vikas C. Raykar et al. proposed a Bayesian-MIL method for the feature selection problem in multi-instance learning; it couples classifier design with feature selection and screens the original features with a Bayesian maximum-a-posteriori criterion. Its shortcoming is that it applies only to the logistic classifier: when the data selected by this method are recognized with other classifiers, recognition performance degrades to varying degrees.
Of the above three methods, the first two are multi-instance feature extraction methods and the last is a multi-instance feature selection method; all differ clearly from the multi-instance feature weighting method designed in the present invention.
Summary of the invention
To remedy the lack of an effective feature weighting method in the field of multi-instance learning, the invention provides a multi-instance learning feature weighting method based on a large-margin classification criterion.
The above multi-instance learning feature weighting method based on a large-margin classification criterion provided by the present invention is implemented according to the following steps:
(1) Initialize the representative instances of the positive and negative bags
For the positive bags, the instance whose class label is most likely positive must be selected in each positive bag as the bag's representative instance; this initialization is done with a probability density estimation technique.
For the negative bags, K-means clustering is first performed on the instances of all negative bags, and the resulting cluster centres are taken as the negative-bag representative instances.
(2) Construct the optimization problem
The optimization problem consists of an objective function and constraint functions. The objective function has two terms: the first is the classification margin, and the second is the total loss incurred by all violations of the margin. There are three constraints: the first two require that every element of the weight vector be nonnegative and that the norm of the weight vector equal 1, and the last requires that the margin be nonnegative.
(3) Update the three classes of unknown variables of the optimization problem
The three classes of unknown variables contained in the optimization problem, namely the positive-bag representative instances, the weight vector and the classification margin, are updated iteratively by coordinate ascent until the relative change of the objective function falls below a predefined threshold. In a single update, the other two classes of unknown variables are held fixed and only the class being updated is changed. The method proceeds according to the following steps:
(1) Initialize the representative instances of the positive and negative bags
The positive-bag representatives are initialized with a nonparametric density estimation technique, the Parzen window method: the instances of all negative bags are treated as training samples to estimate the negative-instance probability density function, the density value of every instance in a given positive bag is estimated, and the instance with the minimum density value is chosen as that bag's representative instance.
The negative-bag representatives are initialized with K-means clustering: since the instances in negative bags have no label ambiguity (all of them are negative) and are usually very numerous, K-means clustering is first performed on the instances of all negative bags, and the resulting cluster centres are taken as the negative-bag representative instances.
(2) Construct the optimization problem
Let n⁺ and n⁻ denote the numbers of positive and negative bags; let B_i⁺ (i = 1, …, n⁺), x_{i,j}⁺ and m_i⁺ denote the i-th positive bag, the j-th instance of the i-th positive bag and the number of instances in the i-th positive bag; let B_k⁻ (k = 1, …, n⁻), x_{k,l}⁻ and m_k⁻ denote the corresponding quantities for the negative bags; and let r_i⁺ (i = 1, …, n⁺) and c_k⁻ (k = 1, …, K) denote the positive-bag representative instances and the negative-bag representative instances (the K cluster centres).
The optimization problem can be expressed as
max_{{r_i⁺}, w, ρ}  ρ − Σ_{i=1}^{n⁺} Σ_{k=1}^{K} ℓ( ‖w∘(r_i⁺ − c_k⁻)‖² − ‖w∘(r_i⁺ − r̄⁺)‖² − ρ )
s.t.  w_d ≥ 0 (d = 1, …, D),  ‖w‖² = 1,  ρ ≥ 0    (1)
where ρ is the classification margin, w is the weight vector with components w_d, ∘ denotes the Hadamard (elementwise) product, r̄⁺ is the mean vector of all positive-bag representative instances, and ℓ is the squared Hinge loss function
ℓ(u) = ( max(0, −u) )²    (2)
The first term of the objective in (1) is the large-margin term; the second is the sum of losses incurred by all violations of the large-margin condition. The large-margin condition states that, in the weighted feature space, the distance from any positive-bag representative instance to any negative-bag representative instance, minus the distance from that positive-bag representative instance to the mean vector of the positive-bag representative instances, may not be smaller than the margin ρ. The first constraint requires every element of the weight vector to be nonnegative, the second requires the norm of the weight vector to equal 1, and the last requires the margin to be nonnegative.
(3) Iterative optimization by coordinate ascent
When optimizing (1) by coordinate ascent, the three classes of unknown variables (the positive-bag representative instances, the weight vector and the classification margin) are updated in turn in an iterative manner, and the other two classes are held fixed while one class is updated.
1) When updating the positive-bag representative instances, the squared Hinge loss is non-increasing but not strictly decreasing, so a heuristic search can be used. Specifically, to update r_i⁺, an exhaustive search over all instances of the positive bag B_i⁺ finds the instance x that minimizes
Σ_{k=1}^{K} ℓ( ‖w∘(x − c_k⁻)‖² − ‖w∘(x − r̄⁺)‖² − ρ )    (3)
and sets it as the new r_i⁺, where ∘ denotes the Hadamard product, i.e. the elementwise product. After every r_i⁺ (i = 1, …, n⁺) has been updated once, i.e. after one full round of updates, the mean vector must be recomputed as r̄⁺ = (1/n⁺) Σ_{i=1}^{n⁺} r_i⁺. When two consecutive rounds leave every r_i⁺ unchanged, the update of the positive-bag representative instances can stop.
2) Let u_{ik} denote the vector whose d-th component is u_{ik,d} = (r_{i,d}⁺ − c_{k,d}⁻)² − (r_{i,d}⁺ − r̄_d⁺)², where r_{i,d}⁺, c_{k,d}⁻ and r̄_d⁺ denote the d-th components of r_i⁺, c_k⁻ and r̄⁺; let v = w∘w be the vector of elementwise squares of the weight vector; and let S denote the set of index pairs (i, k) that have triggered the squared Hinge loss. To update the weight vector, the following convex quadratic program is first solved:
min_v  Σ_{(i,k)∈S} ( ρ − u_{ik}ᵀ v )²   s.t.  v ≥ 0,  1ᵀ v = 1    (4)
The weight vector is then set to the elementwise square root of v, i.e. w = √v.
3) To update the classification margin, the following convex quadratic program is solved:
max_{ρ ≥ 0}  ρ − Σ_{i=1}^{n⁺} Σ_{k=1}^{K} ℓ( u_{ik}ᵀ v − ρ )    (5)
When solving (1) iteratively by coordinate ascent, each round updates the positive-bag representative instances, the weight vector and the margin in turn, and the objective value of (1) is recomputed at the end of the round. When the relative change of the objective value of (1) over two consecutive rounds falls below a predefined threshold, the iteration stops and the whole optimization ends.
In the above implementation, heuristic search finds in each positive bag a representative instance that correctly reflects the bag's class label, i.e. a positive instance in the positive bag, thereby resolving the label ambiguity of the instances in positive bags; repeated iteration by coordinate ascent lets the optimization problem converge to a local optimum; and each feature receives a relative weight according to the size of its contribution to recognition, so that recognition on the feature-weighted data achieves higher accuracy than recognition on the raw data.
Compared with existing multi-instance learning techniques, the method increases, through feature weighting, the weight of the feature components that carry discriminative information for recognition and decreases the weight of the components that carry noise and redundant information, automatically assigning each feature a weight according to the relative size of its contribution to recognition; this strengthens the separability of samples from different classes and improves the recognition accuracy of multi-instance data. The invention also designs, based on the non-increasing character of the squared Hinge loss, a heuristic search that finds the representative instance of each positive bag, i.e. the instance whose class label is most likely positive, resolving the label ambiguity inside positive bags and simplifying the subsequent design of the large-margin feature weighting method.
Specific embodiment
The specific embodiment of the present invention is further illustrated below.
The above multi-instance learning feature weighting method based on a large-margin classification criterion is implemented according to the following steps:
Step 1: initialize the representative instances of the positive and negative bags
The instances in a positive bag have ambiguous class labels; finding the positive-bag representative instance, i.e. the instance in the positive bag whose class label is most likely positive, eliminates the inconvenience this ambiguity causes for the subsequent design, so the positive-bag representatives must be initialized. The instances in negative bags have no label ambiguity (all of them are negative), but their number is very large, and designing the feature weighting method on all of them would raise the computational complexity of the subsequent optimization; finding negative-bag representative instances reduces the number of instances involved in the design and thus the complexity of the optimization, so the negative-bag representatives must be initialized as well.
The positive-bag representatives are initialized with a nonparametric density estimation technique, the Parzen window method: the instances of all negative bags are treated as training samples to estimate the negative-instance probability density function, the density value of every instance in a given positive bag is estimated, and the instance with the minimum density value is chosen as that bag's representative instance.
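The Parzen-window initialization described above can be sketched as follows. This is a minimal illustration with a Gaussian kernel and hypothetical function names, not the patent's exact procedure:

```python
import math

def parzen_density(x, train, h=1.0):
    """Parzen-window estimate of the density at x, using a Gaussian
    kernel over the training instances (here: all negative-bag instances)."""
    d = len(x)
    norm = (2 * math.pi) ** (d / 2) * h ** d * len(train)
    s = sum(math.exp(-sum((xi - ti) ** 2 for xi, ti in zip(x, t)) / (2 * h * h))
            for t in train)
    return s / norm

def init_positive_representative(pos_bag, neg_instances, h=1.0):
    """Pick the instance of a positive bag with the lowest estimated
    negative-class density: the one most likely to be a true positive."""
    return min(pos_bag, key=lambda x: parzen_density(x, neg_instances, h))

neg = [[0.0, 0.0], [0.2, -0.1], [-0.1, 0.1]]   # instances from negative bags
bag = [[0.1, 0.0], [3.0, 3.0]]                 # one positive bag
print(init_positive_representative(bag, neg))  # [3.0, 3.0] -- far from negatives
```

The instance far from the negative cloud has the lowest negative-class density, so it is chosen as the bag's representative.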
The negative-bag representatives are initialized with K-means clustering: the instances of all negative bags are taken as training instances and clustered by K-means, and the resulting cluster centres are taken as the negative-bag representative instances.
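A minimal pure-Python sketch of this K-means step follows (the function name and toy data are illustrative; in practice any standard K-means implementation would serve):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means; returns the k cluster centres that serve as the
    negative-bag representative instances."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        # assign every point to its nearest centre
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[j].append(p)
        # move each centre to the mean of its cluster (keep it if empty)
        centres = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centres[j]
            for j, cl in enumerate(clusters)
        ]
    return centres

neg_instances = [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 4.9]]
centres = kmeans(neg_instances, k=2)
print(sorted(centres))  # two centres, near [0.05, 0.0] and [5.05, 4.95]
```

Only the K centres, not the full set of negative instances, enter the later optimization, which is exactly the complexity reduction the step is for.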
Step 2: construct the optimization problem
Let n⁺ and n⁻ denote the numbers of positive and negative bags; let B_i⁺ (i = 1, …, n⁺), x_{i,j}⁺ and m_i⁺ denote the i-th positive bag, the j-th instance of the i-th positive bag and the number of instances in the i-th positive bag; let B_k⁻ (k = 1, …, n⁻), x_{k,l}⁻ and m_k⁻ denote the corresponding quantities for the negative bags; and let r_i⁺ (i = 1, …, n⁺) and c_k⁻ (k = 1, …, K) denote the positive-bag representative instances and the negative-bag representative instances (the K cluster centres).
The optimization problem must build a large classification margin between the positive-bag and negative-bag representative instances while penalizing every violation of the large-margin condition; it can be expressed as
max_{{r_i⁺}, w, ρ}  ρ − Σ_{i=1}^{n⁺} Σ_{k=1}^{K} ℓ( ‖w∘(r_i⁺ − c_k⁻)‖² − ‖w∘(r_i⁺ − r̄⁺)‖² − ρ )
s.t.  w_d ≥ 0 (d = 1, …, D),  ‖w‖² = 1,  ρ ≥ 0    (1)
where ρ is the classification margin, w is the weight vector with components w_d, ∘ denotes the Hadamard (elementwise) product, ‖·‖² is the squared norm, r̄⁺ is the mean vector of all positive-bag representative instances, and ℓ is the squared Hinge loss function
ℓ(u) = ( max(0, −u) )²    (2)
The first term of the objective in (1) is the large-margin term; the second is the sum of squared Hinge losses incurred by all violations of the large-margin condition. The large-margin condition states that, in the weighted feature space, the distance from any positive-bag representative instance to any negative-bag representative instance, minus the distance from that positive-bag representative instance to the mean vector of the positive-bag representative instances, may not be smaller than the margin ρ. The first constraint requires every element of the weight vector to be nonnegative; the second requires the norm of the weight vector to equal 1, so that the scale of the weight vector cannot grow without bound; and the last requires the margin to be nonnegative.
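The large-margin condition and the squared Hinge loss described above can be made concrete with the following sketch; the names are illustrative, and the quantities follow the verbal definitions in this step:

```python
def sq_hinge(u):
    """Squared Hinge loss: zero when the large-margin condition holds
    (u >= 0), quadratic penalty when it is violated (u < 0)."""
    return max(0.0, -u) ** 2

def margin_slack(pos_rep, neg_rep, pos_mean, w, rho):
    """u = (weighted distance to a negative representative)
        - (weighted distance to the positive-representative mean)
        - margin."""
    d_neg = sum((wi * (a - b)) ** 2 for a, b, wi in zip(pos_rep, neg_rep, w))
    d_pos = sum((wi * (a - b)) ** 2 for a, b, wi in zip(pos_rep, pos_mean, w))
    return d_neg - d_pos - rho

u = margin_slack([2.0, 0.0], [0.0, 0.0], [2.0, 0.0], [1.0, 1.0], rho=1.0)
print(u, sq_hinge(u))  # 3.0 0.0 -- condition satisfied, no loss incurred
```

Only representative pairs with negative slack contribute to the second term of the objective, which is why the loss is summed over violations.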
Step 3: iterative optimization by coordinate ascent
Problem (1) contains three classes of unknown variables in all: the positive-bag representative instances, the weight vector and the classification margin. When optimizing (1) by coordinate ascent, the three classes are updated in turn in an iterative manner, and the other two classes are held fixed while one class is updated.
1) When updating the positive-bag representative instances, the squared Hinge loss is non-increasing but not strictly decreasing, so a heuristic search can be used. Specifically, to update r_i⁺, an exhaustive search over all instances of the positive bag B_i⁺ finds the instance x that minimizes
Σ_{k=1}^{K} ℓ( ‖w∘(x − c_k⁻)‖² − ‖w∘(x − r̄⁺)‖² − ρ )    (3)
and sets it as the new r_i⁺. After every r_i⁺ (i = 1, …, n⁺) has been updated once, i.e. after one full round of updates, the mean vector must be recomputed as r̄⁺ = (1/n⁺) Σ_{i=1}^{n⁺} r_i⁺. When two consecutive rounds leave every r_i⁺ unchanged, the update of the positive-bag representative instances can stop.
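The exhaustive search inside one positive bag might look as follows; this is a sketch under the assumption that the loss is the squared hinge applied to the margin condition stated above, with illustrative names:

```python
def update_representative(bag, neg_reps, pos_mean, w, rho):
    """Exhaustive search inside one positive bag: keep the instance that
    minimises the sum of squared-hinge losses against every negative
    representative (weight vector, margin and positive mean held fixed)."""
    def loss(x):
        d_pos = sum((wi * (a - b)) ** 2 for a, b, wi in zip(x, pos_mean, w))
        total = 0.0
        for c in neg_reps:
            d_neg = sum((wi * (a - b)) ** 2 for a, b, wi in zip(x, c, w))
            total += max(0.0, -(d_neg - d_pos - rho)) ** 2
        return total
    return min(bag, key=loss)

bag = [[0.1, 0.1], [4.0, 4.0]]   # candidate instances of one positive bag
neg_reps = [[0.0, 0.0]]          # negative-bag representatives
best = update_representative(bag, neg_reps, pos_mean=[4.0, 4.0],
                             w=[1.0, 1.0], rho=1.0)
print(best)  # [4.0, 4.0] -- far from the negatives, close to the positive mean
```

Because the loss is evaluated for every instance of the bag, the chosen representative is exact for the current values of the other two variable classes.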
2) Let u_{ik} denote the vector whose d-th component is u_{ik,d} = (r_{i,d}⁺ − c_{k,d}⁻)² − (r_{i,d}⁺ − r̄_d⁺)², where r_{i,d}⁺, c_{k,d}⁻ and r̄_d⁺ denote the d-th components of r_i⁺, c_k⁻ and r̄⁺; let v = w∘w be the vector of elementwise squares of the weight vector; and let S denote the set of index pairs (i, k) that have triggered the squared Hinge loss. To update the weight vector, the following convex quadratic program is first solved:
min_v  Σ_{(i,k)∈S} ( ρ − u_{ik}ᵀ v )²   s.t.  v ≥ 0,  1ᵀ v = 1    (4)
The weight vector is then set to the elementwise square root of v, i.e. w = √v.
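Posing the update in v = w∘w makes each margin condition u·v ≥ ρ linear in v. As a rough illustration only (the patent solves the quadratic program exactly, which is not reproduced here), the sketch below approximates it with projected gradient steps, clipping v to be nonnegative and renormalising it to sum to 1 after each step, and maps back via the square root:

```python
def update_weights(pairs, rho, v, lr=0.01, steps=200):
    """Rough projected-gradient sketch of the weight update in v = w*w.
    `pairs` holds the coefficient vectors u of the sample pairs that
    triggered the loss; each condition u . v >= rho is linear in v."""
    d = len(v)
    for _ in range(steps):
        grad = [0.0] * d
        for u in pairs:
            slack = rho - sum(ui * vi for ui, vi in zip(u, v))
            if slack > 0:  # squared-hinge gradient, active pairs only
                for j in range(d):
                    grad[j] += -2.0 * slack * u[j]
        v = [max(0.0, vj - lr * g) for vj, g in zip(v, grad)]  # v >= 0
        s = sum(v) or 1.0
        v = [vj / s for vj in v]                               # sum(v) = 1
    return [vj ** 0.5 for vj in v]  # back to w = sqrt(v)

# Feature 0 separates the pair, feature 1 does not: weight mass moves to 0.
w = update_weights(pairs=[[4.0, 0.0]], rho=3.0, v=[0.5, 0.5])
print(w[0] > w[1])  # True
```

The renormalisation enforces 1ᵀv = 1, i.e. ‖w‖ = 1, so the scale of the weight vector cannot grow without bound.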
3) To update the classification margin, the following convex quadratic program is solved:
max_{ρ ≥ 0}  ρ − Σ_{i=1}^{n⁺} Σ_{k=1}^{K} ℓ( u_{ik}ᵀ v − ρ )    (5)
When solving (1) iteratively by coordinate ascent, each round updates the positive-bag representative instances, the weight vector and the margin in turn, and the objective value of (1) is recomputed at the end of the round. When the relative change of the objective value of (1) over two consecutive rounds falls below a predefined threshold, the iteration stops, the whole optimization ends, and a local optimum has been obtained.
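The outer coordinate-ascent loop and its relative-change stopping rule can be sketched generically; the update callables and the toy objective below are illustrative stand-ins for the three block updates of the method:

```python
def coordinate_ascent(objective, updates, tol=1e-4, max_rounds=100):
    """Skeleton of the stopping rule: run one full round of block updates,
    recompute the objective, and stop when its relative change over a
    round falls below the threshold."""
    prev = objective()
    for rnd in range(1, max_rounds + 1):
        for update in updates:  # each update holds the other blocks fixed
            update()
        cur = objective()
        if abs(cur - prev) <= tol * max(abs(prev), 1e-12):
            return cur, rnd
        prev = cur
    return prev, max_rounds

# Toy state whose objective rises and saturates, mimicking convergence
# to a local optimum.
state = {"x": 0.0}
def bump():  # one block update: move x halfway towards 1
    state["x"] += 0.5 * (1.0 - state["x"])

value, rounds = coordinate_ascent(lambda: state["x"], [bump])
print(rounds < 100)  # True -- stops once the relative change is tiny
```

In the method itself the three updates would be the representative search, the weight-vector program (4) and the margin program (5), in that order.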
The invention is intended for the multi-instance learning field; its performance is demonstrated by the following computer simulation experiments.
The experiments use five public data sets common in the multi-instance learning field to test the recognition performance of the proposed feature weighting method: Musk1, Musk2, Elephant, Fox and Tiger. Musk1 and Musk2 are mainly used for drug activity prediction, while Elephant, Fox and Tiger are mainly used for image retrieval. Since the invention proposes only a feature weighting preprocessing method, the raw data are first weighted with the proposed method and the weighted data are then recognized with the Citation-KNN classifier. To compare the proposed method fully against other multi-instance learning methods, several existing methods are also evaluated. For every method, 10-fold cross-validation is used to compute the average correct recognition rate (%) and the corresponding standard deviation on each data set; the results are listed in Table 1.
Table 1
Algorithm Musk1 Musk2 Elephant Fox Tiger
MI-Kernel 88.0±3.1 89.3±1.5 84.3±1.6 60.3±1.9 84.2±1.0
MI-Graph 90.0±3.8 90.0±2.7 85.1±2.8 61.2±1.7 81.9±1.5
mi-Graph 88.9±3.3 90.3±2.6 86.8±0.7 61.6±2.8 86.0±1.6
Citation-KNN 90.0±2.7 89.1±3.0 87.8±3.2 62.0±2.1 82.5±1.9
CLFDA+Citation-KNN 92.1±3.6 90.3±2.9 89.4±1.8 71.6±3.1 84.4±2.7
Bayesian-MIL 89.1±2.8 92.7±3.3 88.7±2.9 73.5±3.6 88.4±3.0
The present invention+Citation-KNN 96.7±3.1 93.2±2.8 92.5±2.1 77.1±3.2 89.9±2.4
The results in Table 1 show that, after the data are preprocessed with the proposed feature weighting method, the Citation-KNN classifier obtains the highest average correct recognition rate on all five public multi-instance data sets, improving on the best of the other methods by 4.6, 0.5, 3.1, 3.6 and 1.5 percentage points respectively.
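The evaluation protocol above (10-fold cross-validation with mean and standard deviation of the correct recognition rate) can be sketched as follows; the fold-splitting helper is illustrative, and Musk1's bag count of 92 is used only as an example size:

```python
import math

def kfold_indices(n, k=10):
    """Split n sample indices into k folds whose sizes differ by at most one."""
    folds, start = [], 0
    for j in range(k):
        size = n // k + (1 if j < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def mean_and_std(accuracies):
    """Average correct recognition rate and its standard deviation,
    as reported per data set in Table 1."""
    m = sum(accuracies) / len(accuracies)
    var = sum((a - m) ** 2 for a in accuracies) / len(accuracies)
    return m, math.sqrt(var)

folds = kfold_indices(92, k=10)   # e.g. Musk1 has 92 bags
print([len(f) for f in folds])    # [10, 10, 9, 9, 9, 9, 9, 9, 9, 9]
m, s = mean_and_std([90.0, 92.0, 88.0])
print(round(m, 1), round(s, 2))   # 90.0 1.63
```

Each fold plays the test set once while the remaining nine train the classifier; the per-fold accuracies are then summarised as mean ± standard deviation.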

Claims (2)

1. A multi-instance learning feature weighting method based on a large-margin classification criterion, implemented according to the following steps:
(1) Initialize the representative instances of the positive and negative bags
For the positive bags, the instance whose class label is most likely positive is selected in each positive bag as the bag's representative instance, using a probability density estimation technique;
For the negative bags, K-means clustering is first performed on the instances of all negative bags, and the resulting cluster centres are taken as the negative-bag representative instances;
(2) Construct the optimization problem
The optimization problem consists of an objective function and constraint functions; the objective function has two terms, the first being the classification margin and the second the total loss incurred by all violations of the margin; there are three constraints, the first two requiring that every element of the weight vector be nonnegative and that the norm of the weight vector equal 1, and the last requiring that the margin be nonnegative;
(3) Update the three classes of unknown variables of the optimization problem
The three classes of unknown variables contained in the optimization problem, namely the positive-bag representative instances, the weight vector and the classification margin, are updated iteratively by coordinate ascent until the relative change of the objective function falls below a predefined threshold; in a single update, the other two classes of unknown variables are held fixed and only the class being updated is changed.
2. the method for claim 1, described in it, method is carried out according to the following steps:
(1) initialization positive closure represents that example and negative bag represent example
Adopt the nonparametric probability density function estimation technique-Parzen window method to represent that to positive closure example carries out initialization: to regard the example in all negative bags as training sample and estimate negative example probability density function, estimate respectively each probability density value that is exemplified as negative example in given positive closure, and choose that example of probability density value minimum as the representative example of this given positive closure;
Adopt K means Method to represent that to negative bag example carries out initialization: to consider that the example in negative bag does not exist classification mark fuzzy problem, the all examples in negative bag are negative example, and in negative bag, example numbers is very many under normal circumstances, therefore first carry out K mean cluster for the example in all negative bags, then choose the cluster centre obtaining after cluster and represent example as negative bag;
(2) Construct the optimization problem
Let n⁺ and n⁻ denote the numbers of positive and negative bags; let B_i⁺ (i = 1, …, n⁺), x_{i,j}⁺ and m_i⁺ denote the i-th positive bag, the j-th instance of the i-th positive bag and the number of instances in the i-th positive bag; let B_k⁻ (k = 1, …, n⁻), x_{k,l}⁻ and m_k⁻ denote the corresponding quantities for the negative bags; and let r_i⁺ (i = 1, …, n⁺) and c_k⁻ (k = 1, …, K) denote the positive-bag representative instances and the negative-bag representative instances (the K cluster centres);
The optimization problem can be expressed as
max_{{r_i⁺}, w, ρ}  ρ − Σ_{i=1}^{n⁺} Σ_{k=1}^{K} ℓ( ‖w∘(r_i⁺ − c_k⁻)‖² − ‖w∘(r_i⁺ − r̄⁺)‖² − ρ )
s.t.  w_d ≥ 0 (d = 1, …, D),  ‖w‖² = 1,  ρ ≥ 0    (1)
where ρ is the classification margin, w is the weight vector with components w_d, ∘ denotes the Hadamard (elementwise) product, r̄⁺ is the mean vector of all positive-bag representative instances, and ℓ is the squared Hinge loss function
ℓ(u) = ( max(0, −u) )²    (2)
The first term of the objective in (1) is the large-margin term; the second is the sum of losses incurred by all violations of the large-margin condition, which states that, in the weighted feature space, the distance from any positive-bag representative instance to any negative-bag representative instance, minus the distance from that positive-bag representative instance to the mean vector of the positive-bag representative instances, may not be smaller than the margin ρ. The first constraint requires every element of the weight vector to be nonnegative, the second requires the norm of the weight vector to equal 1, and the last requires the margin to be nonnegative;
(3) Iterative optimization by coordinate ascent
When optimizing (1) by coordinate ascent, the three classes of unknown variables (the positive-bag representative instances, the weight vector and the classification margin) are updated in turn in an iterative manner, and the other two classes are held fixed while one class is updated;
1) When updating the positive-bag representative instances, the squared Hinge loss is non-increasing but not strictly decreasing, so a heuristic search can be used; specifically, to update r_i⁺, an exhaustive search over all instances of the positive bag B_i⁺ finds the instance x that minimizes
Σ_{k=1}^{K} ℓ( ‖w∘(x − c_k⁻)‖² − ‖w∘(x − r̄⁺)‖² − ρ )    (3)
and sets it as the new r_i⁺, where ∘ denotes the Hadamard product, i.e. the elementwise product; after every r_i⁺ (i = 1, …, n⁺) has been updated once, i.e. after one full round of updates, the mean vector must be recomputed as r̄⁺ = (1/n⁺) Σ_{i=1}^{n⁺} r_i⁺; when two consecutive rounds leave every r_i⁺ unchanged, the update of the positive-bag representative instances can stop;
2) Let u_{ik} denote the vector whose d-th component is u_{ik,d} = (r_{i,d}⁺ − c_{k,d}⁻)² − (r_{i,d}⁺ − r̄_d⁺)², where r_{i,d}⁺, c_{k,d}⁻ and r̄_d⁺ denote the d-th components of r_i⁺, c_k⁻ and r̄⁺; let v = w∘w be the vector of elementwise squares of the weight vector; and let S denote the set of index pairs (i, k) that have triggered the squared Hinge loss; to update the weight vector, the following convex quadratic program is first solved
min_v  Σ_{(i,k)∈S} ( ρ − u_{ik}ᵀ v )²   s.t.  v ≥ 0,  1ᵀ v = 1    (4)
and the weight vector is then set to the elementwise square root of v, i.e. w = √v;
3) To update the classification margin, the following convex quadratic program is solved
max_{ρ ≥ 0}  ρ − Σ_{i=1}^{n⁺} Σ_{k=1}^{K} ℓ( u_{ik}ᵀ v − ρ )    (5)
When solving (1) iteratively by coordinate ascent, each round updates the positive-bag representative instances, the weight vector and the margin in turn, and the objective value of (1) is recomputed at the end of the round; when the relative change of the objective value of (1) over two consecutive rounds falls below a predefined threshold, the iteration stops and the whole optimization ends.
CN201310110952.4A 2013-04-01 2013-04-01 Method for weighting multiple example studying features based on master space classifying criterion Pending CN104091038A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310110952.4A CN104091038A (en) 2013-04-01 2013-04-01 Method for weighting multiple example studying features based on master space classifying criterion


Publications (1)

Publication Number Publication Date
CN104091038A true CN104091038A (en) 2014-10-08

Family

ID=51638754

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310110952.4A Pending CN104091038A (en) 2013-04-01 2013-04-01 Method for weighting multiple example studying features based on master space classifying criterion

Country Status (1)

Country Link
CN (1) CN104091038A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046269B (en) * 2015-06-19 2019-02-22 鲁东大学 A kind of more example multi-tag scene classification methods based on multi-core integration
CN105046269A (en) * 2015-06-19 2015-11-11 鲁东大学 Multi-instance multi-label scene classification method based on multinuclear fusion
CN106327516A (en) * 2015-06-29 2017-01-11 北京雷动云合智能技术有限公司 Learning-type visual tracking method based on appearance model
CN106327516B (en) * 2015-06-29 2018-12-18 北京雷动云合智能技术有限公司 A kind of learning-oriented visual pursuit method based on display model
CN106250924B (en) * 2016-07-27 2019-07-16 南京大学 A kind of newly-increased category detection method based on multi-instance learning
CN106250924A (en) * 2016-07-27 2016-12-21 南京大学 A kind of newly-increased category detection method based on multi-instance learning
CN108073567A (en) * 2016-11-16 2018-05-25 北京嘀嘀无限科技发展有限公司 A kind of Feature Words extraction process method, system and server
CN108629373A (en) * 2018-05-07 2018-10-09 苏州大学 A kind of image classification method, system, equipment and computer readable storage medium
CN109444840A (en) * 2018-12-04 2019-03-08 南京航空航天大学 A kind of radar clutter suppression method based on machine learning
CN109918505A (en) * 2019-02-26 2019-06-21 西安电子科技大学 A kind of network security incident visualization method based on text-processing
CN109918505B (en) * 2019-02-26 2023-09-19 西安电子科技大学 Network security event visualization method based on text processing
CN110069985A (en) * 2019-03-12 2019-07-30 北京三快在线科技有限公司 Aiming spot detection method based on image, device, electronic equipment
CN111539493A (en) * 2020-07-08 2020-08-14 北京必示科技有限公司 Alarm prediction method and device, electronic equipment and storage medium
CN111539493B (en) * 2020-07-08 2020-11-27 北京必示科技有限公司 Alarm prediction method and device, electronic equipment and storage medium
CN113177608A (en) * 2021-05-21 2021-07-27 河南大学 Neighbor model feature selection method and device for incomplete data
CN113177608B (en) * 2021-05-21 2023-09-05 河南大学 Neighbor model feature selection method and device for incomplete data

Similar Documents

Publication Publication Date Title
CN104091038A (en) Method for weighting multiple example studying features based on master space classifying criterion
US11960568B2 (en) Model and method for multi-source domain adaptation by aligning partial features
CN105701502B (en) Automatic image annotation method based on Monte Carlo data equalization
Eigen et al. Nonparametric image parsing using adaptive neighbor sets
Isa et al. Using the self organizing map for clustering of text documents
CN102314614B (en) Image semantics classification method based on class-shared multiple kernel learning (MKL)
CN103136504B (en) Face identification method and device
CN114841257B (en) Small sample target detection method based on self-supervision comparison constraint
CN113378632A (en) Unsupervised domain pedestrian re-identification algorithm based on pseudo label optimization
CN106611052A (en) Text label determination method and device
CN103258210B (en) A kind of high-definition image classification method based on dictionary learning
CN105320961A (en) Handwriting numeral recognition method based on convolutional neural network and support vector machine
CN110348579A (en) A kind of domain-adaptive migration feature method and system
CN113887643B (en) New dialogue intention recognition method based on pseudo tag self-training and source domain retraining
CN113408605A (en) Hyperspectral image semi-supervised classification method based on small sample learning
CN109492673A (en) A kind of unbalanced data prediction technique based on spectral clustering sampling
CN109871872A (en) A kind of flow real-time grading method based on shell vector mode SVM incremental learning model
CN111160553A (en) Novel field self-adaptive learning method
CN113569895A (en) Image processing model training method, processing method, device, equipment and medium
Li et al. GAN driven semi-distant supervision for relation extraction
Chu et al. Co-training based on semi-supervised ensemble classification approach for multi-label data stream
CN105512675B (en) A kind of feature selection approach based on the search of Memorability multiple point crossover gravitation
Wang et al. A novel sparse boosting method for crater detection in the high resolution planetary image
CN113535947A (en) Multi-label classification method and device for incomplete data with missing labels
CN106203508A (en) A kind of image classification method based on Hadoop platform

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20141008

WD01 Invention patent application deemed withdrawn after publication