CN109947938A - Multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment - Google Patents
Multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment Download PDFInfo
- Publication number
- CN109947938A CN109947938A CN201910081373.9A CN201910081373A CN109947938A CN 109947938 A CN109947938 A CN 109947938A CN 201910081373 A CN201910081373 A CN 201910081373A CN 109947938 A CN109947938 A CN 109947938A
- Authority
- CN
- China
- Prior art keywords
- label
- semi
- matrix
- correlation
- classification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 40
- 238000002372 labelling Methods 0.000 title abstract 8
- 239000011159 matrix material Substances 0.000 claims abstract description 62
- 230000006870 function Effects 0.000 claims description 25
- 239000013598 vector Substances 0.000 claims description 18
- 239000003550 marker Substances 0.000 claims description 9
- 238000004590 computer program Methods 0.000 claims description 8
- 238000012549 training Methods 0.000 claims description 4
- 238000006243 chemical reaction Methods 0.000 claims description 3
- 230000009467 reduction Effects 0.000 abstract description 3
- 230000009286 beneficial effect Effects 0.000 description 3
- 238000005457 optimization Methods 0.000 description 3
- 238000010801 machine learning Methods 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 238000013145 classification model Methods 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 239000013307 optical fiber Substances 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 238000011160 research Methods 0.000 description 1
Landscapes
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The present invention provides a kind of multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment, this method comprises: converting two class classification problems for multiple labeling classification problem, each binary classifier corresponds to a label in multiple labeling data set;According to the correlation between preset feature selecting and binary classifier, semi-supervised Multi-label learning model is constructed;Semi-supervised Multi-label learning model is solved using preset algorithm, to obtain the model parameter and label correlation matrix of semi-supervised multiple labeling classification;According to model parameter and label correlation matrix, label sets belonging to unknown mark are predicted.Multiple labeling classification method in the present invention, using the semi-supervised Multi-label learning method of combined mark correlation and feature selecting, not only efficiently utilize a large amount of unmarked multiple labeling sample, and the correlation between label is automatically obtained also by study, furthermore dimensionality reduction is carried out to high dimensional data, help to obtain the better multiple labeling classifier of Generalization Capability.
Description
Technical Field
The invention relates to the technical field of multi-label learning, in particular to a multi-label classification method, a multi-label classification system, a readable storage medium and computer equipment.
Background
Machine learning is an important research area in computer science, which is specialized in studying how a computer simulates or realizes human learning behaviors. Supervised learning is the most studied and most widely used learning framework in machine learning, and it is assumed that each learning object is only attached to one label.
In the real world, however, one marker is often not enough to accurately describe some complex semantic objects. Therefore, complex learning objects with multiple marks are ubiquitous, and the traditional supervised learning method has difficulty in well processing the complex learning objects.
For this reason, multi-label learning comes along with the task of learning known multi-label data sets to predict the label set to which the unknown sample belongs. Among them, semi-supervised multi-labeled learning is a typical type of multi-labeled learning, which utilizes a small amount of labeled samples and a large amount of unlabeled samples to train and obtain a multi-labeled classifier.
However, in the prior art, although the correlation between labels is considered in the current semi-supervised multi-label learning method, the label correlation cannot be estimated and the multi-label feature selection is not considered, and some semi-supervised multi-label learning methods consider the multi-label feature selection, but the generalization performance of the trained multi-label classifier is poor because the multi-label correlation cannot be automatically mined and utilized through learning.
Disclosure of Invention
Based on this, the present invention provides a multi-label classification method, system, readable storage medium and computer device, so as to solve the technical problem of poor generalization performance of the multi-label classifier trained by the semi-supervised multi-label learning method in the prior art.
The multi-label classification method comprises the following steps:
converting the multi-label classification problem into two classification problems, wherein each two classifier corresponds to one label in the multi-label data set;
constructing a semi-supervised multi-label learning model according to the preset feature selection and the correlation between the two types of classifiers;
solving the semi-supervised multi-label learning model by adopting a preset algorithm to obtain model parameters and a label correlation matrix of the semi-supervised multi-label classification;
and predicting a label set to which the unknown label belongs according to the model parameters and the label correlation matrix of the semi-supervised multi-label classification.
In addition, the multi-label classification method according to the above embodiment of the present invention may further have the following additional technical features:
further, the step of converting the multi-label classification problem into two classification problems comprises:
converting the multi-label classification problem into Q two types of classifiers, wherein the discriminant function of the Q two types of classifiers is defined as
Wherein Q represents the number of markers in the multi-marker dataset,andand respectively representing the weight and the deviation corresponding to the discriminant functions of the qth classifiers and the two classifiers.
Further, the correlation between the two classes of classifiers depends on the original feature vectors and other labeled variables simultaneously for the learning of each of the two classes of classifiers.
Further, the discriminant functions of the two classes of classifiers can be converted into:
wherein the weight matrixDeviation matrixMark up correlation vector
Further, the feature selection is a term that constrains the weight matrix WwiRow i of W.
Further, the objective function expression of the semi-supervised multi-labeled learning model is as follows:
wherein n and Q respectively represent the number of samples and the number of marks,a marking matrix representing the correspondence of marked samples, the marking matrixFrom n tolMarking matrix corresponding to marked multi-marked samplesLabel matrix F corresponding to unlabeled multi-labeled samplesuComposition FuThe initial state is an all-zero matrix, Fi·Representing the marker vector, p, corresponding to the ith sampleiThe importance of the ith training sample is defined,in order to mark the correlation matrix, the correlation matrix is marked,is to avoid over-fittingThen the term is changed into a more complete term,wirow i of W, λ and η are balance parameters,in order to be a multi-label data matrix,tr (-) denotes the trace of the matrix,representing a vector with elements all 1.
Further, the preset algorithm is an alternating iteration solving algorithm.
A multi-label classification system according to an embodiment of the present invention includes:
the problem conversion module is used for converting the multi-label classification problem into two types of classification problems, and each two types of classifiers corresponds to one label in the multi-label data set;
the model building module is used for building a semi-supervised multi-label learning model according to the preset feature selection and the correlation between the two types of classifiers;
the model solving module is used for solving the semi-supervised multi-label learning model by adopting a preset algorithm so as to obtain model parameters and a label correlation matrix of the semi-supervised multi-label classification;
and the mark prediction module predicts a mark set to which the unknown mark belongs according to the model parameters of the semi-supervised multi-mark classification and the mark correlation matrix.
Another method of the invention also proposes a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as described above
Another method of the present invention also provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the program, the method as described above is implemented.
The multi-label classification method adopts a semi-supervised multi-label learning method combining label correlation and feature selection, not only considers utilizing a large number of unmarked data sets, but also considers label correlation and multi-label feature selection when designing the multi-label classifier, thereby not only effectively utilizing a large number of unmarked multi-label samples, but also automatically obtaining the correlation among labels through learning, and in addition, reducing the dimension of high-dimensional data, and being beneficial to obtaining the multi-label classifier with better generalization performance.
Drawings
FIG. 1 is a flow chart of a multi-label classification method according to a first embodiment of the invention;
FIG. 2 is a flow chart of a multi-label classification system in a second embodiment of the invention;
fig. 3 is a block diagram of a computer apparatus in a third embodiment of the present invention.
Description of the main element symbols:
problem transformation module | 11 | Model building module | 12 |
Model solving module | 13 | Mark prediction module | 14 |
Memory device | 10 | Processor with a memory having a plurality of memory cells | 20 |
Computer program | 30 |
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Several embodiments of the invention are presented in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like as used herein are for illustrative purposes only.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, a multi-label classification method according to a first embodiment of the present invention is shown, including steps S01 through S04.
Step S01, the multi-label classification problem is converted into two classes of classification problems, each two class of classifiers corresponding to one label in the multi-label dataset.
Specifically, the step of converting the multi-label classification problem into two classification problems comprises:
converting the multi-label classification problem into Q two types of classifiers, wherein the discriminant function of the Q two types of classifiers is defined asWherein Q represents the number of markers in the multi-marker dataset,andand respectively representing the weight and the deviation corresponding to the discriminant functions of the qth classifiers.
And step S02, constructing a semi-supervised multi-label learning model according to the preset feature selection and the correlation between the two types of classifiers.
Wherein the correlation between the two types of classifiers is: the learning of each of the two classes of classifiers depends on both the original feature vectors and other labeled variables.
It can be understood that since the learning of the classifier corresponding to each label depends on the original feature vector and other label variables, the two classes of classifiers independent from each other can learn at the same time, and the multi-label classification and the label correlation can also learn at the same time. Based on this, the corresponding discriminant functions of the qth two classes of classifiers can be converted into:
wherein the weight matrixDeviation matrixMark up correlation vector
Furthermore, to reduce the dimensionality of the high-dimensional data feature space, the features are selected as: terms constraining the weight matrix WwiRow i of W.
Specifically, the purpose of step S02 is to: and unifying parameters, mark correlation and semi-supervised multi-mark feature selection of the semi-supervised multi-mark classification model into the same model frame for joint learning so as to construct a semi-supervised multi-mark learning model.
Wherein the objective function of the semi-supervised multi-label learning model is expressed as:
wherein n and Q respectively represent the number of samples and the number of marks,a marking matrix representing the correspondence of marked samples, the marking matrixFrom n tolMarking matrix corresponding to marked multi-marked samplesLabel matrix F corresponding to unlabeled multi-labeled samplesuComposition FuThe initial state is an all-zero matrix, Fi·Representing the marker vector, p, corresponding to the ith sampleiThe importance of the ith training sample is defined,in order to mark the correlation matrix, the correlation matrix is marked,in order to avoid over-fitting the regularization term,wiline i of W, λ and η are balance parameters.
Based on this, the model can be further represented as:
wherein,in order to be a multi-label data matrix,tr (-) denotes the trace of the matrix,representing vectors with elements all being 1.
And step S03, solving the semi-supervised multi-label learning model by adopting a preset algorithm to obtain model parameters and a label correlation matrix of the semi-supervised multi-label classification.
The preset algorithm is an alternating iteration solving algorithm.
The specific alternating iterative solution algorithm comprises the following steps (A) to (C):
step (I): fixing C and F, solving W and b
When C and F are given, W and b can be obtained by solving the following objective function by optimization.
Because | | W | purple light2,1The invention converts the target function into an equivalent constraint smooth convex optimization problem and adopts an acceleration projection gradient method based on Nesterov to solve.
Wherein Θ ∈ { θ ∈ E(d+1)×Q|θ=(WT,b)T,||W||2,1≤r,W∈Rd×Q,b∈RQIs a convex set, r is l2,1The radius of the sphere.
Order toSearch points representing the t-th iteration, wherein αtTo adjust the parameters.
For theDefinition of
Wherein,representing the function g (-) at a point θThe upper derivative of the signal is taken as the upper derivative, is strongly convex with respect to theta. Thus, in the t +1 th iteration Θt+1The approximate solution of (c) can be sub-calculated according to the following equation.
Thus, wt+1The following equation can be calculated:
wherein Ω ∈ { W ∈ Rd×Q|||W||2,1≤r]Is a closed convex set, piΩ(. cndot.) is the Euclidean projection onto the convex set Ω.
bt+1The following equation can be calculated:
namely, it is
Step (II): fixing W, b and F, and solving for C
When W, b, and F are given, C can be obtained by solving the following objective function by optimization.
Each component in C can be obtained by optimizing the following objective function:
step (three): fixing W, b and C, and obtaining F
After optimizing W, b and C, one can calculateTherefore, the label values corresponding to the unlabeled samples in the label matrix F can be adjusted according to the formula defined below during each iterative solution process.
And step S04, predicting a label set to which the unknown label belongs according to the model parameters and the label correlation matrix of the semi-supervised multi-label classification.
For example, suppose an unknown sample x is givenuThe discrimination function value on the qth mark is:the predicted tag vector is: h (x)u)=sign(f1(xu),...,fQ(xu) Sign () is a sign function.
In summary, in the multi-label classification method in the above embodiments of the present invention, a semi-supervised multi-label learning method combining label correlation and feature selection is adopted, and not only a large number of unlabeled data sets are considered, but also label correlation and multi-label feature selection are considered while designing the multi-label classifier, so that not only a large number of unlabeled multi-label samples are effectively utilized, but also correlation between labels is automatically obtained through learning, and in addition, dimension reduction is performed on high-dimensional data, which is beneficial to obtaining a multi-label classifier with better generalization performance.
Referring to fig. 2, a multi-label classification system according to a third embodiment of the present invention is shown, and includes:
the problem conversion module 11 is configured to convert the multi-label classification problem into two types of classification problems, where each two types of classifiers corresponds to one label in the multi-label data set;
the model construction module 12 is used for constructing a semi-supervised multi-label learning model according to the preset feature selection and the correlation between the two types of classifiers;
the model solving module 13 is configured to solve the semi-supervised multi-labeled learning model by using a preset algorithm to obtain model parameters and a label correlation matrix of the semi-supervised multi-labeled classification;
and the mark prediction module 14 is used for predicting a mark set to which the unknown mark belongs according to the model parameters and the mark correlation matrix of the semi-supervised multi-mark classification.
Further, the problem transformation module 11 may be further configured to transform the multi-labeled classification problem into Q two-class classifiers, where a discriminant function of the qth two-class classifier is defined as
Wherein Q represents the number of markers in the multi-marker dataset,andand respectively representing the weight and the deviation corresponding to the discriminant functions of the qth classifiers and the two classifiers.
Further, the correlation between the two classes of classifiers depends on the original feature vectors and other labeled variables simultaneously for the learning of each of the two classes of classifiers.
Further, the discriminant functions of the two classes of classifiers can be converted into:
wherein the weight matrixDeviation matrixMark up correlation vector
Further, the features are selected as: terms constraining the weight matrix WwiRow i of W.
Further, the objective function expression of the semi-supervised multi-labeled learning model is as follows:
wherein n and Q respectively represent the number of samples and the number of marks,a marking matrix representing the correspondence of marked samples, the marking matrixFrom n tolMarking matrix corresponding to marked multi-marked samplesLabel matrix F corresponding to unlabeled multi-labeled samplesuComposition FuThe initial state is an all-zero matrix, Fi·Representing the marker vector, p, corresponding to the ith sampleiThe importance of the ith training sample is defined,in order to mark the correlation matrix, the correlation matrix is marked,in order to avoid over-fitting the regularization term,wirow i of W, λ and η are balance parameters,in order to be a multi-label data matrix,tr (-) denotes the trace of the matrix,representing a vector with elements all 1.
Further, the preset algorithm is an alternating iteration solving algorithm.
In summary, in the multi-label classification system in the above embodiment of the present invention, a semi-supervised multi-label learning method combining label correlation and feature selection is adopted, and not only a large number of unlabeled data sets are considered, but also label correlation and multi-label feature selection are considered while designing the multi-label classifier, so that not only a large number of unlabeled multi-label samples are effectively utilized, but also correlation between labels is automatically obtained through learning, and in addition, dimension reduction is performed on high-dimensional data, which is beneficial to obtaining a multi-label classifier with better generalization performance.
The invention also proposes a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements a multi-label classification method as described above
Referring to fig. 3, a computer device according to a third embodiment of the present invention is shown, which includes a memory 10, a processor 20, and a computer program 30 stored in the memory 20 and executable on the processor 10, wherein the processor 20 implements the multi-label classification method as described above when executing the computer program 30.
Those of skill in the art will understand that the logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be viewed as implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (10)
1. A multi-label classification method, comprising:
converting the multi-label classification problem into two classification problems, wherein each two classifier corresponds to one label in the multi-label data set;
constructing a semi-supervised multi-label learning model according to the preset feature selection and the correlation between the two types of classifiers;
solving the semi-supervised multi-label learning model by adopting a preset algorithm to obtain model parameters and a label correlation matrix of the semi-supervised multi-label classification;
and predicting a label set to which unknown labels belong according to the model parameters and the label correlation matrix of the semi-supervised multi-label classification.
2. The multi-label classification method according to claim 1, wherein said step of converting the multi-label classification problem into a two-class classification problem comprises:
converting the multi-label classification problem into Q two types of classifiers, wherein the discriminant function of the Q two types of classifiers is defined as
Wherein Q represents the number of markers in the multi-marker dataset,andand respectively representing the weight and the deviation corresponding to the discriminant functions of the qth classifiers and the two classifiers.
3. The multi-label classification method according to claim 2, characterized in that the correlation between the two classes of classifiers depends on the original feature vectors and other label variables simultaneously for the learning of each of the two classes of classifiers.
4. The multi-label classification method according to claim 3, characterized in that the discriminant functions of the two classes of classifiers are convertible into:
wherein the weight matrixDeviation ofMatrix arrayMark up correlation vector
5. The multi-label classification method according to claim 4, characterized in that the features are selected as terms constraining a weight matrix WwiRow i of W.
6. The multi-label classification method according to claim 1, characterized in that the objective function expression of the semi-supervised multi-label learning model is:
s.t.cqq=1,q=1,2,...,Q
wherein n and Q respectively represent the number of samples and the number of marks,a marking matrix representing the correspondence of marked samples, the marking matrixFrom n tolMarking matrix corresponding to marked multi-marked samplesMark matrix corresponding to unmarked multi-marked sampleComposition of,The initial state is an all-zero matrix, Fi·Representing the marker vector, p, corresponding to the ith sampleiThe importance of the ith training sample is defined,in order to mark the correlation matrix, the correlation matrix is marked,in order to avoid over-fitting the regularization term,wirow i of W, λ and η are balance parameters,in order to be a multi-label data matrix,tr (-) denotes the trace of the matrix,representing a vector with elements all 1.
7. The multi-label classification method according to claim 1, characterized in that the preset algorithm is an alternating iterative solution algorithm.
8. A multi-label classification system, comprising:
the problem conversion module is used for converting the multi-label classification problem into two types of classification problems, and each two types of classifiers corresponds to one label in the multi-label data set;
the model building module is used for building a semi-supervised multi-label learning model according to the preset feature selection and the correlation between the two types of classifiers;
the model solving module is used for solving the semi-supervised multi-label learning model by adopting a preset algorithm so as to obtain model parameters and a label correlation matrix of the semi-supervised multi-label classification;
and the mark prediction module predicts a mark set to which unknown marks belong according to the model parameters of the semi-supervised multi-mark classification and the mark correlation matrix.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-7 when executing the program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910081373.9A CN109947938A (en) | 2019-01-28 | 2019-01-28 | Multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910081373.9A CN109947938A (en) | 2019-01-28 | 2019-01-28 | Multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109947938A true CN109947938A (en) | 2019-06-28 |
Family
ID=67006598
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910081373.9A Pending CN109947938A (en) | 2019-01-28 | 2019-01-28 | Multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109947938A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110781295A (en) * | 2019-09-09 | 2020-02-11 | 河南师范大学 | Multi-label data feature selection method and device |
CN113051452A (en) * | 2021-04-12 | 2021-06-29 | 清华大学 | Operation and maintenance data feature selection method and device |
CN117893839A (en) * | 2024-03-15 | 2024-04-16 | 华东交通大学 | Multi-label classification method and system based on graph attention mechanism |
CN118427581A (en) * | 2024-07-05 | 2024-08-02 | 华东交通大学 | Weak supervision mark distribution feature selection method and system based on mark correlation |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107256411A (en) * | 2017-05-27 | 2017-10-17 | 南京师范大学 | The multiple labeling data classification method of feature selecting and mark correlation combination learning |
CN107330448A (en) * | 2017-06-09 | 2017-11-07 | 南京师范大学 | A kind of combination learning method based on mark covariance and multiple labeling classification |
-
2019
- 2019-01-28 CN CN201910081373.9A patent/CN109947938A/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107256411A (en) * | 2017-05-27 | 2017-10-17 | 南京师范大学 | The multiple labeling data classification method of feature selecting and mark correlation combination learning |
CN107330448A (en) * | 2017-06-09 | 2017-11-07 | 南京师范大学 | A kind of combination learning method based on mark covariance and multiple labeling classification |
Non-Patent Citations (2)
Title |
---|
何志芬: "结合标记相关性的多标记分类算法研究", 《中国博士学位论文全文数据库 I140-35 信息科技辑(月刊)》 * |
蔡亚萍: "结合标记相关性的多标记特征选择及分类算法研究", 《中国优秀硕士学位全文数据库 I140-259 信息科技辑》 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110781295A (en) * | 2019-09-09 | 2020-02-11 | 河南师范大学 | Multi-label data feature selection method and device |
CN110781295B (en) * | 2019-09-09 | 2023-04-07 | 河南师范大学 | Multi-label data feature selection method and device |
CN113051452A (en) * | 2021-04-12 | 2021-06-29 | 清华大学 | Operation and maintenance data feature selection method and device |
CN113051452B (en) * | 2021-04-12 | 2022-04-26 | 清华大学 | Operation and maintenance data feature selection method and device |
CN117893839A (en) * | 2024-03-15 | 2024-04-16 | 华东交通大学 | Multi-label classification method and system based on graph attention mechanism |
CN117893839B (en) * | 2024-03-15 | 2024-06-07 | 华东交通大学 | Multi-label classification method and system based on graph attention mechanism |
CN118427581A (en) * | 2024-07-05 | 2024-08-02 | 华东交通大学 | Weak supervision mark distribution feature selection method and system based on mark correlation |
CN118427581B (en) * | 2024-07-05 | 2024-10-18 | 华东交通大学 | Weak supervision mark distribution feature selection method and system based on mark correlation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Bharadiya | Convolutional neural networks for image classification | |
McNeely-White et al. | Inception and ResNet features are (almost) equivalent | |
CN109947938A (en) | Multiple labeling classification method, system, readable storage medium storing program for executing and computer equipment | |
Xu et al. | Learning low-rank label correlations for multi-label classification with missing labels | |
Lian et al. | Max-margin dictionary learning for multiclass image categorization | |
CN102314614B (en) | Image semantics classification method based on class-shared multiple kernel learning (MKL) | |
Waheed et al. | Deep learning algorithms-based object detection and localization revisited | |
CN103605984B (en) | Indoor scene sorting technique based on hypergraph study | |
Cheng et al. | Joint label-specific features and label correlation for multi-label learning with missing label | |
CN107943856A (en) | A kind of file classification method and system based on expansion marker samples | |
CN103745233B (en) | The hyperspectral image classification method migrated based on spatial information | |
Batra et al. | Learning class-specific affinities for image labelling | |
CN113128478B (en) | Model training method, pedestrian analysis method, device, equipment and storage medium | |
Boi et al. | A support vector machines network for traffic sign recognition | |
CN107330448A (en) | A kind of combination learning method based on mark covariance and multiple labeling classification | |
Taalimi et al. | Multimodal dictionary learning and joint sparse representation for hep-2 cell classification | |
CN114676777A (en) | Self-supervision learning fine-grained image classification method based on twin network | |
CN114255371A (en) | Small sample image classification method based on component supervision network | |
Shahriyar et al. | An approach for multi label image classification using single label convolutional neural network | |
CN117893839B (en) | Multi-label classification method and system based on graph attention mechanism | |
CN115439715A (en) | Semi-supervised few-sample image classification learning method and system based on anti-label learning | |
Wang et al. | Object detection with deep learning for underwater environment | |
CN117671704B (en) | Handwriting digital recognition method, handwriting digital recognition device and computer storage medium | |
Nie et al. | Multi-label image recognition with attentive transformer-localizer module | |
CN113408606B (en) | Semi-supervised small sample image classification method based on graph collaborative training |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20190628 |
|
RJ01 | Rejection of invention patent application after publication |