CN108921226A - Zero-shot classification method based on low-rank representation and manifold regularization - Google Patents
Zero-shot classification method based on low-rank representation and manifold regularization
- Publication number
- CN108921226A CN108921226A CN201810759171.0A CN201810759171A CN108921226A CN 108921226 A CN108921226 A CN 108921226A CN 201810759171 A CN201810759171 A CN 201810759171A CN 108921226 A CN108921226 A CN 108921226A
- Authority
- CN
- China
- Prior art keywords
- sample
- data set
- class data
- invisible
- low
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/14—Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Computational Biology (AREA)
- Pure & Applied Mathematics (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Mathematical Optimization (AREA)
- Mathematical Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computational Mathematics (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Algebra (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a zero-shot classification method based on low-rank representation and manifold regularization, comprising: computing the mapping between the visual features and semantic features of the samples in the seen-class dataset; computing the semantic representations of the samples in the unseen-class dataset; introducing a sparsity constraint combined with Laplacian regularization to compute the low-rank representation of the unseen-class samples; computing the weight matrix and the Laplacian matrix; introducing manifold regularization to remove noise from the semantic representations of the unseen-class dataset; and predicting the labels of the unseen-class samples to complete the classification. The method effectively overcomes the limitation of conventional classification methods, whose accuracy is low when samples are scarce or sample label information is missing; it obtains more accurate semantic representations on the unseen-class dataset, strengthens the description of data features, and can effectively improve the accuracy of zero-shot classification.
Description
Technical field
The present invention relates to the technical field of sample classification, and in particular to a zero-shot classification method based on low-rank representation and manifold regularization.
Background
In large-scale classification problems, the lack of sufficient training samples, or the loss of label information for many samples, limits the achievable classification accuracy to a certain extent. Zero-shot classification is an effective solution proposed for this problem.
The prior art usually assumes that sample data lie in low-dimensional subspaces with low-rank structure. Existing methods, based on the assumption that the data distribution is approximated by a union of low-dimensional subspaces, focus on finding a low-rank representation of the data. They handle outliers with the l_1/l_2 norm and, under certain technical conditions, exactly recover the subspace structure of the samples while detecting the outliers. However, when the data are distributed on a union of nonlinear subspaces, such methods cannot accurately recover the geometric structure of the data. In practical applications, face images, for example, lie on nonlinear manifold structures.
Regarding sample denoising, the prior art usually assumes that the sample data lie strictly on a manifold; in practice, however, sample data are often noisy. In this case, some methods handle noise by penalizing the local or global structure of the manifold, but such excessive penalization usually reduces the generalization ability of the classifier. As a result, classification accuracy remains low when sufficient training samples are lacking or sample label information is missing.
Summary of the invention
The present invention provides a zero-shot classification method based on low-rank representation and manifold regularization, which solves the technical problem of low classification accuracy when sufficient training samples are lacking or sample label information is missing. The method provided by the invention comprises:
Step 1: Compute the mapping f between the visual features X_s and the semantic representations A_s of the samples in the seen-class dataset, i.e., f: X_s → A_s, where the seen-class dataset is {X_s, A_s}, with X_s ∈ R^(p×m) the visual features of the seen-class samples, p the dimension of the visual features, A_s ∈ R^(q×m) the semantic representations of the seen-class samples, q the dimension of each sample's semantic representation, c_s the number of classes of the seen-class samples, and m the total number of seen-class samples;
Step 2: Use the mapping f to compute the semantic representations A_u of the samples in the unseen-class dataset, where the unseen-class dataset is X_u ∈ R^(p×n), the visual features of the unseen-class samples, the seen and unseen class sets being disjoint, c_u is the number of classes of the unseen-class samples, n is the total number of unseen-class samples, and A_u = f(X_u) ∈ R^(q×n) is the computed semantic representation of X_u;
Step 3: Compute the Laplacian-regularized non-negative sparse low-rank representation Z of the samples in the unseen-class dataset;
Step 4: Compute the weight matrix W and the Laplacian matrix L from the low-rank representation Z;
Step 5: Introduce manifold regularization to remove the noise from the semantic representations of the unseen-class dataset;
Step 6: Use the denoised semantic representations of the unseen-class dataset to predict the labels of the unseen-class samples, thereby completing the classification.
Preferably, the Laplacian-regularized non-negative sparse low-rank representation Z of the samples in the unseen-class dataset in Step 3 is obtained by solving:
min_{Z,E} ||Z||_* + α||E||_1 + β·tr(Z L Z^T)
s.t. X_u = X_u Z + E
Z ≥ 0
||Z||_0 ≤ T
where E is the noise, α is the first preset tuning parameter, β is the second preset tuning parameter, ||·||_* denotes the nuclear norm, ||·||_1 denotes the l_1 norm, tr(·) denotes the trace function, the constraint Z ≥ 0 guarantees the non-negativity of the matrix Z, and ||Z||_0 ≤ T guarantees the sparsity of the matrix Z.
Preferably, the manifold regularization introduced in Step 5 removes the noise from the semantic representations of the unseen-class dataset via:
Â_u = A_u (I + λL)^(-1)
where I is the identity matrix, λ is the third preset tuning parameter, and Â_u is the denoised semantic representation of the unseen-class dataset.
From the above, the present invention has the following advantages: when samples are scarce or sample label information is missing, low-rank representation and manifold regularization yield more accurate semantic representations on the unseen-class dataset, strengthening the description of data features and effectively improving the accuracy of zero-shot classification; this solves the problem of low classification accuracy when sufficient training samples are lacking or sample label information is missing.
Description of the drawings
Fig. 1 is a schematic flow diagram of a zero-shot classification method based on low-rank representation and manifold regularization provided by an embodiment of the present invention.
Fig. 2 is a schematic diagram of partial classification results of the zero-shot classification method based on low-rank representation and manifold regularization provided by this embodiment.
Detailed description
The Attribute Pascal and Yahoo (aPY) dataset contains 32 classes, of which 20 are seen classes used for training and 12 are unseen classes used for testing. Each sample has 64 attributes. This embodiment uses the aPY dataset to illustrate the method proposed by the present invention. To make the objectives, features, and advantages of the invention clearer and easier to understand, the method in the embodiments of the present invention is described clearly and completely below with reference to the drawings. Obviously, the embodiments described below are only a part of the embodiments of the invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, an embodiment of the zero-shot classification method based on low-rank representation and manifold regularization provided by the present invention comprises:
Step 1: Compute the mapping f between the visual features X_s and the semantic representations A_s of the samples in the seen-class dataset, i.e., f: X_s → A_s, where the seen-class dataset is {X_s, A_s}, with X_s ∈ R^(p×m) the visual features of the seen-class samples, p the dimension of the visual features, A_s ∈ R^(q×m) the semantic representations of the seen-class samples, q the dimension of each sample's semantic representation, c_s the number of classes of the seen-class samples, and m the total number of seen-class samples;
Step 2: Use the mapping f to compute the semantic representations A_u of the samples in the unseen-class dataset, where the unseen-class dataset is X_u ∈ R^(p×n), the visual features of the unseen-class samples, the seen and unseen class sets being disjoint, c_u is the number of classes of the unseen-class samples, n is the total number of unseen-class samples, and A_u = f(X_u) ∈ R^(q×n) is the computed semantic representation of X_u;
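Steps 1 and 2 can be sketched as follows. The patent does not fix the form of the mapping f in this text, so a regularized linear (ridge-regression) map, a common choice in zero-shot learning, is assumed here; all dimensions and data are random placeholders.

```python
import numpy as np

# Hypothetical sketch of Steps 1-2: learn a linear map f: X_s -> A_s on the
# seen-class data via ridge regression, then apply it to the unseen-class
# visual features X_u to obtain their semantic representations A_u.

rng = np.random.default_rng(0)
p, q, m, n = 64, 10, 200, 50           # visual dim, semantic dim, #seen, #unseen

X_s = rng.standard_normal((p, m))      # seen-class visual features (p x m)
A_s = rng.standard_normal((q, m))      # seen-class semantic representations (q x m)
X_u = rng.standard_normal((p, n))      # unseen-class visual features (p x n)

def fit_mapping(X, A, gamma=1e-2):
    """Ridge-regression map F with A ≈ F X, i.e. F = A X^T (X X^T + gamma I)^-1."""
    pdim = X.shape[0]
    return A @ X.T @ np.linalg.inv(X @ X.T + gamma * np.eye(pdim))

F = fit_mapping(X_s, A_s)              # Step 1: learn f
A_u = F @ X_u                          # Step 2: semantic representations (q x n)
print(A_u.shape)                       # (10, 50)
```

The ridge parameter `gamma` and the data above are illustrative; any regression or embedding model that maps visual features to the semantic space could play the role of f.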
Step 3: Compute the Laplacian-regularized non-negative sparse low-rank representation Z of the samples in the unseen-class dataset.
It should be noted that the sparsity constraint is introduced to better capture the local structure of the data. The Laplacian-regularized non-negative sparse low-rank representation Z of the unseen-class samples is obtained by solving:
min_{Z,E} ||Z||_* + α||E||_1 + β·tr(Z L Z^T)  (1)
s.t. X_u = X_u Z + E
Z ≥ 0
||Z||_0 ≤ T
where E is the noise, α is the first preset tuning parameter, β is the second preset tuning parameter, ||·||_* denotes the nuclear norm, ||·||_1 denotes the l_1 norm, tr(·) denotes the trace function, the constraint Z ≥ 0 guarantees the non-negativity of the matrix Z, and ||Z||_0 ≤ T guarantees the sparsity of the matrix Z.
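A minimal solver sketch for the low-rank representation of Step 3, using inexact ALM (augmented Lagrange multipliers) with singular-value thresholding. For brevity this sketch keeps only the nuclear-norm term, the l_1 noise term, and the non-negativity projection; the Laplacian trace term and the hard sparsity bound on ||Z||_0 are omitted, so it is an illustrative simplification rather than the patented algorithm.

```python
import numpy as np

# Simplified inexact-ALM sketch for
#   min ||Z||_* + alpha*||E||_1   s.t.  X = X Z + E,  Z >= 0
# with an auxiliary variable J = Z handling the nuclear-norm term.

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau*||.||_*."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0)) @ Vt

def shrink(M, tau):
    """Soft thresholding: proximal operator of tau*||.||_1."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0)

def lrr(X, alpha=0.5, iters=200, mu=1e-2, rho=1.2, mu_max=1e6):
    n = X.shape[1]
    Z = np.zeros((n, n)); J = Z.copy(); E = np.zeros_like(X)
    Y1 = np.zeros_like(X); Y2 = np.zeros((n, n))   # Lagrange multipliers
    XtX = X.T @ X
    for _ in range(iters):
        J = svt(Z + Y2 / mu, 1.0 / mu)             # nuclear-norm step
        Z = np.linalg.solve(np.eye(n) + XtX,
                            X.T @ (X - E) + J + (X.T @ Y1 - Y2) / mu)
        Z = np.maximum(Z, 0)                        # project onto Z >= 0
        E = shrink(X - X @ Z + Y1 / mu, alpha / mu) # l1 noise step
        Y1 += mu * (X - X @ Z - E)                  # dual updates
        Y2 += mu * (Z - J)
        mu = min(rho * mu, mu_max)
    return Z, E

rng = np.random.default_rng(0)
X_u = rng.standard_normal((20, 30))                 # placeholder unseen features
Z, E = lrr(X_u)
print(Z.shape)                                      # (30, 30)
```

The update for Z follows from setting the gradient of the augmented Lagrangian to zero; the hyperparameters (`alpha`, `mu`, `rho`) are illustrative defaults, not values from the patent.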
Step 4: Compute the weight matrix W and the Laplacian matrix L from the low-rank representation Z.
It should be noted that the weight matrix is computed as:
W = (|Z| + |Z^T|) / 2  (2)
and the Laplacian matrix as:
L = D - W  (3)
where D is the n × n degree matrix, i.e., the diagonal matrix containing the elements {d_1, d_2, d_3, ..., d_n}, whose k-th diagonal element d_k is the sum of the weights of all edges connected to the k-th vertex in the undirected weighted graph.
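Step 4 can be sketched as follows. The weight formula does not survive in this text, so the symmetrization W = (|Z| + |Z^T|)/2, a common way to turn a representation matrix into undirected edge weights, is assumed here; the Laplacian L = D - W follows the text directly.

```python
import numpy as np

# Sketch of Step 4: build a symmetric non-negative weight matrix from the
# low-rank representation Z (assumed form W = (|Z| + |Z^T|)/2), then form the
# graph Laplacian L = D - W, where D is the diagonal degree matrix whose k-th
# entry is the total weight of edges incident to vertex k.

def graph_laplacian(Z):
    W = 0.5 * (np.abs(Z) + np.abs(Z).T)   # symmetric, non-negative weights
    D = np.diag(W.sum(axis=1))            # degree matrix
    return W, D - W

Z = np.array([[0.0, 0.8, 0.1],
              [0.8, 0.0, 0.2],
              [0.1, 0.2, 0.0]])
W, L = graph_laplacian(Z)
print(L.sum(axis=1))   # rows of a Laplacian sum to zero: [0. 0. 0.]
```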
Step 5: Introduce manifold regularization to remove the noise from the semantic representations of the unseen-class dataset.
It should be noted that with manifold regularization the noise in the semantic representations of the unseen-class dataset is removed via:
Â_u = A_u (I + λL)^(-1)  (4)
where I is the identity matrix, λ is the third preset tuning parameter, and Â_u is the denoised semantic representation of the unseen-class dataset.
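Assuming the denoising objective min_Â ||Â - A_u||_F^2 + λ·tr(Â L Â^T), a standard graph-smoothing formulation consistent with the identity matrix I and parameter λ described above, setting the gradient to zero gives the closed form sketched below.

```python
import numpy as np

# Sketch of Step 5: manifold-regularized denoising with the assumed closed
# form A_hat = A_u (I + lam*L)^-1, derived from
#   min_A ||A - A_u||_F^2 + lam * tr(A L A^T).

def manifold_denoise(A_u, L, lam=0.1):
    n = L.shape[0]
    return A_u @ np.linalg.inv(np.eye(n) + lam * L)

# Tiny example: a 3-node chain graph; smoothing pulls the noisy middle value
# toward its neighbours.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W
A_u = np.array([[1.0, 0.0, 1.0]])        # q = 1, n = 3
A_hat = manifold_denoise(A_u, L, lam=0.5)
print(A_hat)                             # middle entry increases, ends decrease
```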
Step 6: Use the denoised semantic representations of the unseen-class dataset to predict the labels of the unseen-class samples and complete the classification; the label of the i-th unseen-class sample is predicted as:
l_i = argmin_j ||â_i - a_j||_2  (5)
where â_i is the i-th column of Â_u and a_j is the semantic prototype of the j-th unseen class.
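Step 6 can be sketched as nearest-prototype assignment in semantic space, a standard zero-shot prediction rule assumed here; the prototype matrix and the Euclidean metric are illustrative choices, since the prediction formula does not survive in this text.

```python
import numpy as np

# Sketch of Step 6: assign each unseen-class sample to the class whose semantic
# prototype is nearest (in Euclidean distance) to the sample's denoised
# semantic representation.

def predict(A_hat, prototypes):
    # A_hat:      q x n denoised semantic representations
    # prototypes: q x c_u attribute vectors of the unseen classes
    d = np.linalg.norm(A_hat[:, :, None] - prototypes[:, None, :], axis=0)  # n x c_u
    return d.argmin(axis=1)

protos = np.array([[1.0, 0.0],                    # two class prototypes, q = 2
                   [0.0, 1.0]])
A_hat = np.array([[0.9, 0.1, 0.2],                # three samples to classify
                  [0.0, 1.1, 0.8]])
print(predict(A_hat, protos))                     # [0 1 1]
```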
Referring to Fig. 2, Fig. 2 is a schematic diagram of partial classification results of the zero-shot classification method based on low-rank representation and manifold regularization provided by this embodiment. Samples shown in the same row are assigned to the same class; samples marked with the error symbol × are misclassified, and the remaining samples are correctly classified.
In this embodiment, the aPY dataset was selected; the experimental platform was MATLAB R2017a, the operating system Windows 10 Education, the processor an Intel(R) Core(TM) i7-6700K CPU @ 4.00 GHz, and the memory 32.0 GB.
The zero-shot classification method based on low-rank representation and manifold regularization of this embodiment effectively overcomes the limitation of conventional classification methods, whose accuracy is low when samples are scarce or sample label information is missing. It obtains more accurate semantic representations on the unseen-class dataset, strengthens the description of data features, effectively improves the accuracy of zero-shot classification, and thus solves the technical problem of low classification accuracy when sufficient training samples are lacking or sample label information is missing.
The above embodiments are intended only to illustrate the method of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the methods described therein may still be modified, or some of their technical features replaced by equivalents, without such modifications or replacements departing in essence from the spirit and scope of the method of the invention.
Claims (3)
1. A zero-shot classification method based on low-rank representation and manifold regularization, characterized by comprising the following steps:
Step 1: Compute the mapping f between the visual features X_s and the semantic representations A_s of the samples in the seen-class dataset, i.e., f: X_s → A_s, where the seen-class dataset is {X_s, A_s}, with X_s ∈ R^(p×m) the visual features of the seen-class samples, p the dimension of the visual features, A_s ∈ R^(q×m) the semantic representations of the seen-class samples, q the dimension of each sample's semantic representation, c_s the number of classes of the seen-class samples, and m the total number of seen-class samples;
Step 2: Use the mapping f to compute the semantic representations A_u of the samples in the unseen-class dataset, where the unseen-class dataset is X_u ∈ R^(p×n), the visual features of the unseen-class samples, the seen and unseen class sets being disjoint, c_u is the number of classes of the unseen-class samples, n is the total number of unseen-class samples, and A_u = f(X_u) ∈ R^(q×n) is the computed semantic representation of X_u;
Step 3: Compute the Laplacian-regularized non-negative sparse low-rank representation Z of the samples in the unseen-class dataset;
Step 4: Compute the weight matrix W and the Laplacian matrix L from the low-rank representation Z;
Step 5: Introduce manifold regularization to remove the noise from the semantic representations of the unseen-class dataset;
Step 6: Use the denoised semantic representations of the unseen-class dataset to predict the labels of the unseen-class samples, thereby completing the classification.
2. The zero-shot classification method based on low-rank representation and manifold regularization according to claim 1, characterized in that the Laplacian-regularized non-negative sparse low-rank representation Z of the samples in the unseen-class dataset in Step 3 is obtained by solving:
min_{Z,E} ||Z||_* + α||E||_1 + β·tr(Z L Z^T)
s.t. X_u = X_u Z + E
Z ≥ 0
||Z||_0 ≤ T
where E is the noise, α is the first preset tuning parameter, β is the second preset tuning parameter, ||·||_* denotes the nuclear norm, ||·||_1 denotes the l_1 norm, tr(·) denotes the trace function, the constraint Z ≥ 0 guarantees the non-negativity of the matrix Z, and ||Z||_0 ≤ T guarantees the sparsity of the matrix Z.
3. The zero-shot classification method according to claim 1, characterized in that the manifold regularization introduced in Step 5 removes the noise from the semantic representations of the unseen-class dataset via:
Â_u = A_u (I + λL)^(-1)
where I is the identity matrix, λ is the third preset tuning parameter, and Â_u is the denoised semantic representation of the unseen-class dataset.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810759171.0A CN108921226B (en) | 2018-07-11 | 2018-07-11 | Zero sample image classification method based on low-rank representation and manifold regularization |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810759171.0A CN108921226B (en) | 2018-07-11 | 2018-07-11 | Zero sample image classification method based on low-rank representation and manifold regularization |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108921226A true CN108921226A (en) | 2018-11-30 |
CN108921226B CN108921226B (en) | 2020-05-19 |
Family
ID=64411280
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810759171.0A Active CN108921226B (en) | 2018-07-11 | 2018-07-11 | Zero sample image classification method based on low-rank representation and manifold regularization |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108921226B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110427967A (en) * | 2019-06-27 | 2019-11-08 | 中国矿业大学 | The zero sample image classification method based on embedded feature selecting semanteme self-encoding encoder |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104866871A (en) * | 2015-06-02 | 2015-08-26 | 西安电子科技大学 | Projection structure sparse coding-based hyperspectral image classification method |
CN105740912A (en) * | 2016-02-03 | 2016-07-06 | 苏州大学 | Nuclear norm regularization based low-rank image characteristic extraction identification method and system |
US20160307072A1 (en) * | 2015-04-17 | 2016-10-20 | Nec Laboratories America, Inc. | Fine-grained Image Classification by Exploring Bipartite-Graph Labels |
CN106485271A (en) * | 2016-09-30 | 2017-03-08 | 天津大学 | A kind of zero sample classification method based on multi-modal dictionary learning |
CN107491788A (en) * | 2017-08-21 | 2017-12-19 | 天津大学 | A kind of zero sample classification method based on dictionary learning |
Non-Patent Citations (2)
Title |
---|
C. H. Lampert et al.: "Attribute-based classification for zero-shot visual object categorization", TPAMI *
Wei Jie et al.: "Fine-grained image classification based on low-dimensional embedding of visual features", Journal of Computer-Aided Design & Computer Graphics *
Also Published As
Publication number | Publication date |
---|---|
CN108921226B (en) | 2020-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113822494B (en) | Risk prediction method, device, equipment and storage medium | |
CN111581046A (en) | Data anomaly detection method and device, electronic equipment and storage medium | |
CN109388712A (en) | A kind of trade classification method and terminal device based on machine learning | |
CN108229341A (en) | Sorting technique and device, electronic equipment, computer storage media, program | |
CN109816032A (en) | Zero sample classification method and apparatus of unbiased mapping based on production confrontation network | |
CN103177265B (en) | High-definition image classification method based on kernel function Yu sparse coding | |
CN116629275B (en) | Intelligent decision support system and method based on big data | |
CN112925908A (en) | Attention-based text classification method and system for graph Attention network | |
CN105956570B (en) | Smiling face's recognition methods based on lip feature and deep learning | |
CN111475622A (en) | Text classification method, device, terminal and storage medium | |
CN111597348A (en) | User image drawing method, device, computer equipment and storage medium | |
CN108228684A (en) | Training method, device, electronic equipment and the computer storage media of Clustering Model | |
CN109726918A (en) | The personal credit for fighting network and semi-supervised learning based on production determines method | |
CN110457471A (en) | File classification method and device based on A-BiLSTM neural network | |
CN108805054A (en) | A kind of facial image sorting technique, system, equipment and computer storage media | |
CN110110035A (en) | Data processing method and device and computer readable storage medium | |
CN111160959A (en) | User click conversion estimation method and device | |
CN107220656A (en) | A kind of multiple labeling data classification method based on self-adaptive features dimensionality reduction | |
CN115545103A (en) | Abnormal data identification method, label identification method and abnormal data identification device | |
CN108647714A (en) | Acquisition methods, terminal device and the medium of negative label weight | |
CN110019820A (en) | Main suit and present illness history symptom Timing Coincidence Detection method in a kind of case history | |
CN110399813A (en) | A kind of age recognition methods, device, electronic equipment and storage medium | |
CN104573726B (en) | Facial image recognition method based on the quartering and each ingredient reconstructed error optimum combination | |
CN108921226A (en) | A kind of zero sample classification method based on low-rank representation and manifold regularization | |
CN106557783B (en) | A kind of automatic extracting system and method for caricature dominant role |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||