CN106096661B - Zero-shot image classification method based on relative-attribute random forest - Google Patents

Zero-shot image classification method based on relative-attribute random forest

Info

Publication number
CN106096661B
CN106096661B · CN201610465880.9A · CN201610465880A
Authority
CN
China
Prior art keywords
attribute
image
class
classification
known class
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610465880.9A
Other languages
Chinese (zh)
Other versions
CN106096661A (en)
Inventor
乔雪
彭晨
段贺
刘久云
胡岩峰
刘振
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Research Institute Institute Of Electronics Chinese Academy Of Sciences
Original Assignee
Suzhou Research Institute Institute Of Electronics Chinese Academy Of Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Priority to CN201610465880.9A
Publication of CN106096661A
Application granted
Publication of CN106096661B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques
    • G06F 18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00: Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10: Character recognition
    • G06V 30/19: Recognition using electronic means
    • G06V 30/192: Recognition using electronic means using simultaneous comparisons or correlations of the image signals with a plurality of references
    • G06V 30/194: References adjustable by an adaptive method, e.g. learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Probability & Statistics with Applications (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The present invention proposes a zero-shot image classification method based on a relative-attribute random forest. Attribute ranking score models are built for the images of unknown classes from the relative relationships between image classes and image attributes; the attribute ranking score models of all images are then used as training samples to train a random forest classifier; finally, the label of a test image is predicted from its attribute ranking score and the trained random forest classifier. The method achieves zero-shot image classification and offers a high classification recognition rate and strong model stability.

Description

Zero-shot image classification method based on relative-attribute random forest
Technical field
The invention belongs to the field of pattern recognition, and in particular relates to a zero-shot image classification method based on a relative-attribute random forest.
Background art
1.1 Zero-shot image classification
Zero-shot image classification is one of the research hotspots in pattern recognition. Unlike traditional image classification, in zero-shot classification the images to be classified and recognised at test time have not taken part in training the classifier model. As shown in Fig. 1, the labelled training images cover the three categories "Lion", "Athletic shoes" and "Polar bear" (known classes), while the test images contain the category "Stiletto" (an unknown class); because the "Stiletto" class does not participate in training the classifier, the classifier cannot predict its label directly. This mismatch between the distributions of the training data and the test data makes zero-shot image classification a very difficult learning task, yet the problem arises widely in computer vision, image classification, face recognition, speech recognition and related fields. To achieve knowledge transfer from known classes to unknown classes in zero-shot image classification, the classification model needs to build a bridge from low-level features to class labels by means of semantic attributes.
1.2 Relative attribute learning
Semantic attributes (referred to simply as attributes) are characteristics that can be manually annotated and observed in an image (for example, "has wings" or "dark hair"). Attributes are broadly divided into binary attributes and relative attributes, where a relative attribute expresses how strongly an image exhibits a certain attribute compared with other images. Since humans tend to appreciate and understand things by comparison, relative attributes can express semantic attribute information more accurately. For example, in Fig. 2, image (a) clearly shows the attribute "young" while image (c) does not; image (b), however, cannot simply be described as "young" or "not young", but can be described accurately by comparison: "(b) is more young than (c) and less young than (a)". Relative attributes therefore express semantic information more precisely, have stronger image description and interaction capabilities, and can effectively narrow the semantic gap between low-level image features and high-level semantic features.
Besides describing the strength of an attribute, relative attributes can also be used to provide supervised human feedback to a classifier and to interactively refine image retrieval results, thereby improving active learning. One line of work mixes binary and relative attributes to build a spoken attribute classifier, so that attributes describe images in a more natural way; another combines deep convolutional networks with the relative attribute learning framework to increase the accuracy of attribute ranking. In recent studies, relative attributes have been used for problems such as text description and zero-shot learning, typically as follows: first, a Gaussian distribution model is built for every known class; then a Gaussian distribution model for each unknown class is built from known classes selected manually; finally, the label of a test image is predicted by maximum likelihood estimation. This approach has several drawbacks: (1) assuming that all known-class and unknown-class images follow Gaussian distributions is not reasonable; (2) because known classes are selected manually and with supervision during modelling, the model is affected by subjective human factors and its accuracy suffers; (3) maximum likelihood estimation introduces considerable error, which also degrades the accuracy of image classification.
Summary of the invention
The technical problem solved by the invention is to provide a zero-shot image classification method based on a relative-attribute random forest. Attribute ranking score models are built for the images of unknown classes from the relative relationships between image classes and image attributes; the attribute ranking score models of all images serve as training samples for a random forest classifier; finally, the label of a test image is predicted from its attribute ranking score and the trained random forest classifier. The method achieves zero-shot image classification with a high classification recognition rate and strong model stability.
The technical solution of the invention for achieving the above aim is as follows:
A zero-shot image classification method based on a relative-attribute random forest, comprising the following steps:
Step 1: given the low-level features and class labels of the known-class images {x_1, x_2, ..., x_S; y_1, y_2, ..., y_S}, the low-level features of the unknown-class images {z_1, z_2, ..., z_U}, the ordered attribute pair sets {O_1, ..., O_M} and similar attribute pair sets {S_1, ..., S_M} of the known-class images, the number of random trees T and the sampling percentage η, establish the optimization function, where S, U, M and T are positive integers and η ∈ (0, 1);
Step 2: solve the optimization function using the known-class features and labels {x_1, ..., x_S; y_1, ..., y_S}, the ordered attribute pairs {O_1, ..., O_M} and the similar attribute pairs {S_1, ..., S_M}, obtaining M attribute ranking functions r_m(x_i) = w_m^T x_i, where w_m is a projection vector and w_m^T its transpose, i = 1, 2, ..., S, m = 1, 2, ..., M;
Step 3: build the attribute ranking score models r_1^(s), ..., r_S^(s) of the known-class images and r_1^(u), ..., r_U^(u) of the unknown-class images and form the training sample set Ω, positioning all images in the attribute space, where r_1^(s), ..., r_S^(s) are the attribute ranking scores corresponding to the low-level features x_1, ..., x_S of the known-class images and r_1^(u), ..., r_U^(u) are the attribute ranking scores corresponding to the low-level features z_1, ..., z_U of the unknown-class images;
Step 4: draw T Bootstrap random samples of percentage η from the training set Ω, obtaining the sample sets Ω_t = BootstrapSampling(Ω), t = 1, 2, ..., T;
Step 5: generate a random tree classifier:
Step 5-1: if all samples in Ω_t share the same class, return the current node as a leaf node and label it with that class; otherwise go to step 5-2;
Step 5-2: randomly select a parameter space subset Γ_sub(Ω_t) ⊂ Γ(Ω_t), where Γ(Ω_t) is the full parameter space; for each candidate θ_j in Γ_sub(Ω_t) compute the information gain IG(θ_j; Ω_t) and take the optimal weak-classifier parameter θ* = argmax_{θ_j} IG(θ_j; Ω_t), j = 1, 2, ..., |Γ_sub|, where θ_j denotes the j-th candidate in Γ_sub(Ω_t);
Step 5-3: initialise the data sets of the left and right child nodes to empty sets;
Step 5-4: evaluate the weak classifier h(r_i, θ*) with the optimal parameter θ*; if h(r_i, θ*) = 1, add (r_i, y_i) to the left child's data set, Ω_left = Ω_left ∪ {(r_i, y_i)}; if h(r_i, θ*) = 0, add (r_i, y_i) to the right child's data set, Ω_right = Ω_right ∪ {(r_i, y_i)}, where r_i is an attribute ranking score and y_i a class label;
Step 5-5: take the data sets Ω_left and Ω_right as the child nodes of the node and repeat steps 5-1 to 5-4 on each of them, obtaining the t-th random tree classifier;
Step 6: repeat steps 4 and 5 to obtain the zero-shot image classifier based on the relative-attribute random forest, TreeRoot_1, ..., TreeRoot_T;
Step 7: compute the attribute ranking score r^(u) of the test image with the attribute ranking functions r_m;
Step 8: feed r^(u) into the classifiers TreeRoot_1, ..., TreeRoot_T, obtain the probability that r^(u) belongs to class c, and compute and output the class label of the test image. An end-to-end sketch of these eight steps is given below.
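To make the eight steps easier to follow, here is a minimal end-to-end sketch in Python. It is not the patent's implementation: the function name zero_shot_pipeline, the argument names, and the use of scikit-learn's RandomForestClassifier as a stand-in for the hand-built forest of steps 4 to 6 are all assumptions for illustration only; the matrix W of projection vectors from step 2 and the unknown-class scores from step 3 are taken as given.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def zero_shot_pipeline(X_known, y_known, unknown_scores, X_test, W, T=50, eta=0.8):
    """End-to-end sketch of steps 1-8 (illustrative names, not the patent's API).

    W              : M x d matrix of learned projection vectors w_m (step 2)
    unknown_scores : list of (r_u, label) pairs built for the unknown classes (step 3)
    """
    # Step 3: position all images in the M-dimensional attribute space.
    R_known = X_known @ W.T
    R_unknown = np.array([r for r, _ in unknown_scores])
    y_unknown = np.array([c for _, c in unknown_scores])
    R_train = np.vstack([R_known, R_unknown])
    y_train = np.concatenate([y_known, y_unknown])

    # Steps 4-6: T bootstrap samples of fraction eta, one random tree each
    # (scikit-learn forest used here purely as a stand-in).
    forest = RandomForestClassifier(n_estimators=T, bootstrap=True, max_samples=eta)
    forest.fit(R_train, y_train)

    # Steps 7-8: rank the test images with the same projections and predict labels.
    R_test = X_test @ W.T
    return forest.predict(R_test)
```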
Further, in the zero-shot image classification method based on a relative-attribute random forest of the invention, the optimization function in step 1 is as follows:
where ξ_ij is the non-negative slack variable for the ordered attribute pairs {O_1, ..., O_M}, γ_ij is the non-negative slack variable for the similar attribute pairs {S_1, ..., S_M}, and the parameter C balances maximising the margin against satisfying the pairwise attribute relations.
Further, in the zero-shot image classification method of the invention, the optimization function in step 2 is solved by maximising the ranking margin 1/||w_m|| and minimising the non-negative slack variables ξ_ij and γ_ij, thereby obtaining the optimal projection vector.
Further, in the zero-shot image classification method based on a relative-attribute random forest of the invention, building the attribute ranking score model of the known-class images in step 3 comprises the following steps:
Step 3-1: take the relations between the known classes as the relations between the image attributes of the images belonging to those classes;
Step 3-2: compute the attribute ranking scores of all known-class images, each dimension of r_i^(s) representing the ranking score of the corresponding attribute for that image;
Step 3-3: assemble the attribute ranking scores of all known-class images into the attribute ranking score model r_1^(s), ..., r_S^(s).
Further, in step 3 an unknown-class image establishes relative relations with the known classes through the relations between attributes, and the attribute ranking score model of the unknown-class image is built accordingly, distinguishing the following three cases:
(1) If the m-th attribute of the unknown class lies between the m-th attributes of two known classes, and these are the two known classes nearest to the unknown class in the relative-attribute ranking, then the m-th attribute ranking score of the image model is:
where i = 1, 2, ..., I, k = 1, 2, ..., K, j = 1, 2, ..., IK, I and K are the numbers of images of the two known classes, and the corresponding r terms are the m-th attribute ranking scores of those known classes;
(2) If the unknown class lies at the boundary and its m-th attribute is stronger than that of the known classes, then the m-th attribute ranking score of the image model is:
where i = 1, 2, ..., I, j = 1, 2, ..., I, d_m denotes the mean difference between the attribute ranking scores of the training-class images, and μ_m denotes the mean of the m-th attribute;
(3) If the unknown class lies at the boundary and its m-th attribute is weaker than that of the known classes, then the m-th attribute ranking score of the image model is:
where k = 1, 2, ..., K, j = 1, 2, ..., K.
Further, in the zero-shot image classification method based on a relative-attribute random forest of the invention, the probability that r^(u) belongs to class c in step 8 is:
p(c | r^(u)) = (1/T) Σ_{t=1}^{T} p_t(c | r^(u)),
where T is the number of random trees in the forest and p_t(c | r^(u)) is the class distribution at the leaf node reached in the t-th tree.
Further, the class label of the test image in step 8 is the class with the highest probability, c* = argmax_c p(c | r^(u)).
Compared with the prior art, the above technical scheme of the invention has the following technical effects:
1. The zero-shot image classification method based on a relative-attribute random forest builds an attribute ranking score model for each image individually, so that the models taking part in classifier training are more reasonable and accurate;
2. The method selects known classes automatically to build models for the unknown classes, avoiding the subjective influence introduced by selecting known classes manually;
3. The method uses a random forest classifier to reduce classification error, improving the accuracy of zero-shot image classification;
4. The method has a high classification recognition rate and strong model stability.
Description of the drawings
Fig. 1 is a schematic diagram of zero-shot image classification;
Fig. 2 is a schematic diagram of relative attributes;
Fig. 3 is a structural block diagram of the zero-shot image classification method based on a relative-attribute random forest;
Fig. 4 is a flowchart of learning the attribute ranking functions;
Fig. 5 is a schematic diagram of ordered attribute pairs and similar attribute pairs;
Fig. 6 is a schematic diagram of an attribute ranking function;
Fig. 7 is a flowchart of building the attribute ranking (AR) model;
Fig. 8 is a flowchart of training the random forest classifier;
Fig. 9 is a flowchart of predicting the class label of a test image.
Detailed description of the embodiments
Embodiments of the invention are described in detail below; examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements with the same or similar functions. The embodiments described with reference to the drawings are exemplary, serve only to explain the invention, and are not to be construed as limiting the claims.
The zero-shot image classification method based on a relative-attribute random forest, as shown in Fig. 3, comprises the following steps:
Step 1: as in part (1) of Fig. 4, given the low-level features and class labels of the known-class images {x_1, x_2, ..., x_S; y_1, y_2, ..., y_S}, the low-level features of the unknown-class images {z_1, z_2, ..., z_U}, the ordered attribute pair sets {O_1, ..., O_M} and similar attribute pair sets {S_1, ..., S_M} of the known-class images, the number of random trees T and the sampling percentage η, where S, U, M and T are positive integers and η ∈ (0, 1), establish the following optimization function:
where ξ_ij is the non-negative slack variable for the ordered attribute pairs {O_1, ..., O_M}, γ_ij is the non-negative slack variable for the similar attribute pairs {S_1, ..., S_M}, and the parameter C balances maximising the margin against satisfying the pairwise attribute relations.
The underlying principle is as follows:
Given a training image set I = {i}, each image is represented by a feature vector x_i ∈ R^d. Given an attribute set with M attributes, for each attribute a_m a set of ordered attribute pairs O_m = {(i, j)} and a set of similar attribute pairs S_m = {(i, j)} are provided, where (i, j) ∈ O_m indicates that image i exhibits more of the attribute than image j, and (i, j) ∈ S_m indicates that images i and j exhibit the attribute to a similar degree. Fig. 5 illustrates ordered and similar attribute pairs using the attribute "smiling" as an example: in an ordered pair the attribute strengths of the two images differ and there is a definite stronger/weaker relation, whereas in a similar pair the attribute strengths are close and no such relation exists. The purpose of learning the attribute ranking functions is to obtain M ranking functions:
such that, for m = 1, ..., M, the following constraints are satisfied as far as possible:
where w_m is a projection vector. Learning an attribute ranking function therefore amounts to finding, in the low-level feature space, the optimal projection direction along which the projections of all images are ordered correctly. To solve this problem, non-negative slack variables ξ_ij and γ_ij are introduced, giving the following optimization function:
where ξ_ij is the non-negative slack variable for the ordered pairs O_m = {(i, j)}, γ_ij is the non-negative slack variable for the similar pairs S_m = {(i, j)}, and the parameter C balances maximising the margin against satisfying the pairwise attribute relations.
Step 2: solve the optimization function using the known-class features and labels {x_1, ..., x_S; y_1, ..., y_S}, the ordered attribute pairs {O_1, ..., O_M} and the similar attribute pairs {S_1, ..., S_M}; maximising the ranking margin 1/||w_m|| and minimising the non-negative slack variables ξ_ij and γ_ij yields the optimal projection vectors and hence the M attribute ranking functions r_m(x_i) = w_m^T x_i, where w_m is a projection vector and w_m^T its transpose, i = 1, 2, ..., S, m = 1, 2, ..., M. Establishing an attribute ranking function in fact means learning a function that ranks the training images correctly; as in part (2) of Fig. 4, the ranking margin constraint requires the distance between the two nearest images in the whole ranking to be as large as possible. As shown in Fig. 6, the purpose of ranking is to order the data points (denoted 1 to 6) so that the margin between the two nearest points in the queue (2 and 3) is maximal; the ranking function can then better express the relative relations of attribute strength. A rank-SVM-style sketch of this learning step is given below.
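The sketch below is a minimal illustration of step 2 under the assumption that the optimization follows the standard relative-attributes formulation (ordered pairs enforced with a margin of 1 via a hinge penalty, similar pairs penalised quadratically). The subgradient-descent solver, the learning-rate and epoch settings, and the function names learn_ranking_function and attribute_ranking_scores are illustrative choices, not part of the patent.

```python
import numpy as np

def learn_ranking_function(X, ordered_pairs, similar_pairs, C=1.0, lr=1e-3, epochs=200):
    """Learn one attribute ranking function r_m(x) = w^T x from ordered and
    similar attribute pairs (a rank-SVM-style sketch of step 2)."""
    d = X.shape[1]
    w = np.zeros(d)
    for _ in range(epochs):
        grad = w.copy()                        # gradient of the 0.5*||w||^2 margin term
        for i, j in ordered_pairs:             # (i, j): image i has more of the attribute than j
            if w @ (X[i] - X[j]) < 1.0:        # hinge penalty on violated ordered pairs
                grad -= C * (X[i] - X[j])
        for i, j in similar_pairs:             # (i, j): images i and j are similar in this attribute
            diff = w @ (X[i] - X[j])
            grad += C * diff * (X[i] - X[j])   # quadratic penalty pulls projections together
        w -= lr * grad
    return w

def attribute_ranking_scores(X, W):
    """Stack the M projection vectors into W (M x d) and score every image:
    row i of the result is the M-dimensional attribute ranking score r_i."""
    return X @ W.T
```

In the full method, learn_ranking_function would be called once per attribute a_m, and attribute_ranking_scores positions every image in the M-dimensional attribute space used in step 3.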
Step 3: build the attribute ranking score models r_1^(s), ..., r_S^(s) of the known-class images and r_1^(u), ..., r_U^(u) of the unknown-class images and form the training sample set Ω, positioning all images in the attribute space, where r_1^(s), ..., r_S^(s) are the attribute ranking scores corresponding to the low-level features x_1, ..., x_S of the known-class images and r_1^(u), ..., r_U^(u) are the attribute ranking scores corresponding to the low-level features z_1, ..., z_U of the unknown-class images, as shown in part (3) of Fig. 7.
Suppose S classes of images are known classes and U classes are unknown classes. By the definition of zero-shot image classification, the images of the S known classes can participate directly in training the classifier, whereas the images of the U unknown classes cannot; they appear only in the test phase. Therefore, when building the attribute ranking score models of the images, the known-class images and the unknown-class images are treated differently. The invention builds the attribute ranking score models of the images as follows:
The attribute ranking score model of the known-class images is built first, as in part (1) of Fig. 7, through the following steps:
Step 3-1: take the relations between the known classes as the relations between the image attributes of the images belonging to those classes, i.e. for any attribute a_m the class-level relation between known classes is transferred to their images;
Step 3-2: compute the attribute ranking scores of all known-class images; each dimension of a score vector is the ranking score of the corresponding attribute for that image;
Step 3-3: assemble the attribute ranking scores of all known-class images into the attribute ranking score model r_1^(s), ..., r_S^(s).
A known-class image originally represented by the d-dimensional feature vector x_i is thus represented by the M-dimensional attribute ranking score r_i, i.e. x_i ∈ R^d → r_i ∈ R^M, where each dimension of r_i is the ranking score of the corresponding attribute for that image. A small sketch of this mapping follows.
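A minimal sketch of step 3 for the known classes, reusing the hypothetical projection matrix W from the sketch above; the function name and argument names are illustrative, not the patent's.

```python
import numpy as np

def known_class_attribute_model(X_known, y_known, W):
    """Map each known-class image's d-dimensional feature x_i to its M-dimensional
    attribute ranking score r_i = W x_i and pair it with its class label,
    giving the known-class part of the training set Omega."""
    R = X_known @ W.T                      # one row per image, one column per attribute
    return [(r, y) for r, y in zip(R, y_known)]
```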
Since the unknown classes cannot participate directly in the training of the classifier, the unknown-class images cannot be modelled in the same way. Instead, an unknown-class image establishes relative relations with the known classes through the relations between attributes; for example, "bear" (unknown class) has more fur than "giraffe" (known class) but less than "rabbit" (known class). Specifically, as in part (2) of Fig. 7, for an attribute a_m the unknown class can be described relative to the known classes in the following three cases:
(1) If the m-th attribute of the unknown class lies between the m-th attributes of two known classes, and these are the two known classes nearest to the unknown class in the relative-attribute ranking, then the m-th attribute ranking score of the image model is:
where i = 1, 2, ..., I, k = 1, 2, ..., K, j = 1, 2, ..., IK, I and K are the numbers of images of the two known classes, and the corresponding r terms are the m-th attribute ranking scores of those known classes;
(2) If the unknown class lies at the boundary and its m-th attribute is stronger than that of the known classes, then the m-th attribute ranking score of the image model is:
where i = 1, 2, ..., I, j = 1, 2, ..., I, d_m denotes the mean difference between the attribute ranking scores of the training-class images, and μ_m denotes the mean of the m-th attribute;
(3) If the unknown class lies at the boundary and its m-th attribute is weaker than that of the known classes, then the m-th attribute ranking score of the image model is:
where k = 1, 2, ..., K, j = 1, 2, ..., K.
The invention automatically selects suitable known classes to build the attribute ranking score model of an unknown class using the following strategy: preferentially select two known classes that bracket the unknown class and are its two nearest neighbours in the relative-attribute ranking; if the unknown class lies at the boundary and no such bracketing known classes exist, select the known class ranked highest or lowest in the relative-attribute ordering to model the unknown class. A hedged sketch of this selection strategy follows.
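The exact formulas of the three cases are not reproduced in the text above, so the sketch below only illustrates the case analysis and the automatic selection strategy. The midpoint interpolation for the bracketed case and the mean inter-class gap d_m used as a boundary offset are assumptions for illustration, not the patent's equations; all names are hypothetical.

```python
import numpy as np

def unknown_class_attribute_score(unknown_rank, class_scores, class_order):
    """Sketch of the unknown-class m-th attribute ranking score.

    class_scores : dict mapping known class name -> mean m-th attribute score
    class_order  : known class names sorted by increasing strength of attribute m
    unknown_rank : position of the unknown class in that ordering
                   (class-level relative-attribute knowledge supplied externally)
    """
    scores = np.array([class_scores[c] for c in class_order])
    d_m = np.mean(np.diff(scores))             # mean gap between neighbouring known classes (assumed)
    if 0 < unknown_rank < len(class_order):    # case (1): bracketed by its two nearest known classes
        lo, hi = scores[unknown_rank - 1], scores[unknown_rank]
        return 0.5 * (lo + hi)                 # assumed midpoint interpolation
    if unknown_rank >= len(class_order):       # case (2): stronger than every known class
        return scores[-1] + d_m
    return scores[0] - d_m                     # case (3): weaker than every known class
```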
Step 4: draw T Bootstrap random samples of percentage η from the training set Ω, obtaining the sample sets Ω_t = BootstrapSampling(Ω), t = 1, 2, ..., T.
Step 5: generate the random tree classifiers, as shown in Fig. 8. In the random forest classifier, each node of each tree can be regarded as a weak classifier which, for the training samples reaching that node (including both the known-class samples and the unknown-class samples), computes a split criterion h(r | θ) ∈ {0, 1}, where r ∈ R^M denotes a training sample and θ = {φ, ψ} are the parameters of the weak classifier, φ(·) being a selection function and ψ a parameter matrix.
Step 5-1: if all samples in Ω_t share the same class, return the current node as a leaf node and label it with that class; otherwise go to step 5-2;
Step 5-2: randomly select a parameter space subset Γ_sub(Ω_t) ⊂ Γ(Ω_t), where Γ(Ω_t) is the full parameter space; for each node Γ_sub is drawn at random from Γ, which is what makes the node-splitting process random. For each candidate θ_j in Γ_sub(Ω_t) compute the information gain IG(θ_j; Ω_t). The information gain measures the drop in impurity of the training samples after the split and can be defined as:
IG(θ; Ω) = H(Ω) - (|Ω_left(θ)| / |Ω|) H(Ω_left(θ)) - (|Ω_right(θ)| / |Ω|) H(Ω_right(θ)),
where N_c is the number of classes of the training samples, p(c | Ω) is the proportion of class c in the training set Ω, Ω = {(r_i, y_i)} is the set of all training samples falling into the node, y_i is the label of the i-th sample, Ω_left(θ) and Ω_right(θ) are the sample sets falling to the left and right child nodes under parameter θ, |Ω| is the number of elements in the set Ω, and H(Ω) is the impurity of the sample set falling into a node, described by information entropy, H(Ω) = -Σ_c p(c | Ω) log p(c | Ω).
The optimal parameter of the weak classifier is then θ* = argmax_{θ_j} IG(θ_j; Ω_t), j = 1, 2, ..., |Γ_sub|, where θ_j denotes the j-th candidate in Γ_sub(Ω_t).
It follows that the "optimal" parameter θ* of each node should make the drop in impurity after splitting as large as possible. An entropy-based sketch of this split selection follows.
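The following sketch illustrates the impurity and information-gain computation of step 5-2. Because the φ/ψ parameterisation of the weak classifier is not spelled out in the text, the candidate splits below are simple axis-aligned thresholds on the attribute ranking scores; that choice, the function names, and the assumption that R and y are NumPy arrays with integer-coded labels are all illustrative.

```python
import numpy as np

def entropy(labels):
    """Impurity H(Omega) of the samples at a node, using information entropy."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, go_left):
    """Drop in impurity when a boolean split (go_left) divides the node's samples."""
    left, right = labels[go_left], labels[~go_left]
    if len(left) == 0 or len(right) == 0:
        return 0.0
    n = len(labels)
    return entropy(labels) - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)

def best_axis_aligned_split(R, y, n_candidates=10, rng=np.random):
    """Pick theta* from a random subset of candidate splits (here axis-aligned
    thresholds on the attribute ranking scores R, an assumed weak-classifier form)."""
    best = (None, -1.0)
    for _ in range(n_candidates):
        dim = rng.randint(R.shape[1])                 # random attribute dimension
        thr = rng.choice(R[:, dim])                   # random threshold from the data
        gain = information_gain(y, R[:, dim] <= thr)
        best = max(best, ((dim, thr), gain), key=lambda t: t[1])
    return best   # ((dimension, threshold), information gain)
```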
Step 5-3: initialise the data sets of the left and right child nodes to empty sets;
Step 5-4: evaluate the weak classifier h(r_i, θ*) with the optimal parameter θ*; if h(r_i, θ*) = 1, add (r_i, y_i) to the left child's data set, Ω_left = Ω_left ∪ {(r_i, y_i)}; if h(r_i, θ*) = 0, add (r_i, y_i) to the right child's data set, Ω_right = Ω_right ∪ {(r_i, y_i)}, where r_i is an attribute ranking score and y_i a class label;
Step 5-5: take the data sets Ω_left and Ω_right as the child nodes of the node and repeat steps 5-1 to 5-4 on each of them, obtaining the t-th random tree classifier. At each leaf node, the class distribution is estimated from the histogram of the class labels of the training samples that reach that leaf. This recursive training process continues until no further split can achieve a larger information gain.
Step 6: repeat steps 4 and 5 to obtain the zero-shot image classifier based on the relative-attribute random forest, TreeRoot_1, ..., TreeRoot_T. A sketch of the tree-growing and forest-training loop follows.
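A sketch of steps 4 to 6 under the same assumptions as above: integer-coded class labels, NumPy arrays R (attribute ranking scores) and y (labels), axis-aligned weak classifiers, and reuse of the hypothetical best_axis_aligned_split helper. The Node class and all names are illustrative.

```python
import numpy as np

class Node:
    """A node of one random tree: either a split (dim, thr) or a leaf distribution."""
    def __init__(self, split=None, left=None, right=None, class_dist=None):
        self.split, self.left, self.right, self.class_dist = split, left, right, class_dist

def grow_tree(R, y, n_classes, n_candidates=10, rng=np.random):
    """Steps 5-1 to 5-5: split recursively until no split improves information gain."""
    if len(np.unique(y)) == 1:
        return Node(class_dist=np.bincount(y, minlength=n_classes) / len(y))
    split, gain = best_axis_aligned_split(R, y, n_candidates, rng)
    if split is None or gain <= 0.0:          # no useful split: make a leaf node
        return Node(class_dist=np.bincount(y, minlength=n_classes) / len(y))
    dim, thr = split
    go_left = R[:, dim] <= thr
    return Node(split=(dim, thr),
                left=grow_tree(R[go_left], y[go_left], n_classes, n_candidates, rng),
                right=grow_tree(R[~go_left], y[~go_left], n_classes, n_candidates, rng))

def train_forest(R, y, n_classes, T=50, eta=0.8, rng=np.random):
    """Steps 4 and 6: T bootstrap samples of fraction eta, one random tree each."""
    forest = []
    for _ in range(T):
        idx = rng.choice(len(R), size=int(eta * len(R)), replace=True)
        forest.append(grow_tree(R[idx], y[idx], n_classes))
    return forest
```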
Step 7: as in Fig. 9, compute the attribute ranking score r^(u) of the test image with the attribute ranking functions r_m; all test images here belong to unknown classes and do not take part in training the random forest classifier.
Step 8: as in Fig. 9, feed r^(u) into the classifiers TreeRoot_1, ..., TreeRoot_T; in each random tree the sample is routed iteratively to the left or right branch until a leaf node of that tree is reached, and the class distribution at that leaf is the classification result given by that tree. Averaging the class distributions of the leaves reached in all trees gives the probability that r^(u) belongs to class c: p(c | r^(u)) = (1/T) Σ_{t=1}^{T} p_t(c | r^(u)), where T is the number of random trees in the forest and p_t(c | r^(u)) is the class distribution at the leaf node reached in the t-th tree. The class label of the test image is then computed and output as the class with the highest probability. A sketch of this prediction step follows.
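A minimal sketch of step 8 using the hypothetical Node structure and forest from the previous sketch; names are illustrative.

```python
import numpy as np

def tree_predict(node, r):
    """Route one attribute ranking score r down a tree to its leaf class distribution."""
    while node.class_dist is None:
        dim, thr = node.split
        node = node.left if r[dim] <= thr else node.right
    return node.class_dist

def forest_predict(forest, r_u):
    """Step 8: average the leaf class distributions of all T trees and take the argmax."""
    p = np.mean([tree_predict(tree, r_u) for tree in forest], axis=0)
    return int(np.argmax(p)), p   # predicted class label and p(c | r^(u))
```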
The above are only some embodiments of the invention. It should be noted that a person of ordinary skill in this technical field may make several improvements without departing from the principle of the invention, and such improvements shall also be regarded as falling within the protection scope of the invention.

Claims (7)

1. A zero-shot image classification method based on a relative-attribute random forest, characterised by comprising the following steps:
Step 1: given the low-level features and class labels of the known-class images {x_1, x_2, ..., x_S; y_1, y_2, ..., y_S}, the low-level features of the unknown-class images {z_1, z_2, ..., z_U}, the ordered attribute pair sets {O_1, ..., O_M} and similar attribute pair sets {S_1, ..., S_M} of the known-class images, the number of random trees T and the sampling percentage η, establish the optimization function, where S, U, M and T are positive integers and η ∈ (0, 1);
Step 2: solve the optimization function using the known-class features and labels {x_1, ..., x_S; y_1, ..., y_S}, the ordered attribute pairs {O_1, ..., O_M} and the similar attribute pairs {S_1, ..., S_M}, obtaining M attribute ranking functions r_m(x_i) = w_m^T x_i, where w_m is a projection vector and w_m^T its transpose, i = 1, 2, ..., S, m = 1, 2, ..., M;
Step 3: build the attribute ranking score models r_1^(s), ..., r_S^(s) of the known-class images and r_1^(u), ..., r_U^(u) of the unknown-class images and form the training sample set Ω, positioning all images in the attribute space, where r_1^(s), ..., r_S^(s) are the attribute ranking scores corresponding to the low-level features x_1, ..., x_S of the known-class images and r_1^(u), ..., r_U^(u) are the attribute ranking scores corresponding to the low-level features z_1, ..., z_U of the unknown-class images;
Step 4: draw T Bootstrap random samples of percentage η from the training set Ω, obtaining the sample sets Ω_t = BootstrapSampling(Ω), t = 1, 2, ..., T;
Step 5: generate a random tree classifier:
Step 5-1: if all samples in Ω_t share the same class, return the current node as a leaf node and label it with that class; otherwise go to step 5-2;
Step 5-2: randomly select a parameter space subset Γ_sub(Ω_t) ⊂ Γ(Ω_t), where Γ(Ω_t) is the full parameter space; for each candidate θ_j in Γ_sub(Ω_t) compute the information gain IG(θ_j; Ω_t) and take the optimal weak-classifier parameter θ* = argmax_{θ_j} IG(θ_j; Ω_t), j = 1, 2, ..., |Γ_sub|, where θ_j denotes the j-th candidate in Γ_sub(Ω_t);
Step 5-3: initialise the data sets of the left and right child nodes to empty sets;
Step 5-4: evaluate the weak classifier h(r_i, θ*) with the optimal parameter θ*; if h(r_i, θ*) = 1, add (r_i, y_i) to the left child's data set, Ω_left = Ω_left ∪ {(r_i, y_i)}; if h(r_i, θ*) = 0, add (r_i, y_i) to the right child's data set, Ω_right = Ω_right ∪ {(r_i, y_i)}, where r_i is an attribute ranking score and y_i a class label;
Step 5-5: take the data sets Ω_left and Ω_right as the child nodes of the node and repeat steps 5-1 to 5-4 on each of them, obtaining the t-th random tree classifier;
Step 6: repeat steps 4 and 5 to obtain the zero-shot image classifier based on the relative-attribute random forest, TreeRoot_1, ..., TreeRoot_T;
Step 7: compute the attribute ranking score r^(u) of the test image with the attribute ranking functions r_m;
Step 8: feed r^(u) into the classifiers TreeRoot_1, ..., TreeRoot_T, obtain the probability that r^(u) belongs to class c, and compute and output the class label of the test image.
2. The zero-shot image classification method based on a relative-attribute random forest according to claim 1, characterised in that the optimization function in step 1 is as follows:
where ξ_ij is the non-negative slack variable for the ordered attribute pairs {O_1, ..., O_M}, γ_ij is the non-negative slack variable for the similar attribute pairs {S_1, ..., S_M}, and the parameter C balances maximising the margin against satisfying the pairwise attribute relations.
3. The zero-shot image classification method based on a relative-attribute random forest according to claim 1, characterised in that in step 2 the optimization function is solved by maximising the ranking margin 1/||w_m|| and minimising the non-negative slack variables ξ_ij and γ_ij, thereby obtaining the optimal projection vector.
4. The zero-shot image classification method based on a relative-attribute random forest according to claim 1, characterised in that building the attribute ranking score model of the known-class images in step 3 comprises the following steps:
Step 3-1: take the relations between the known classes as the relations between the image attributes of the images belonging to those classes;
Step 3-2: compute the attribute ranking scores of all known-class images, each dimension of r_i^(s) representing the ranking score of the corresponding attribute for that image;
Step 3-3: assemble the attribute ranking scores of all known-class images into the attribute ranking score model r_1^(s), ..., r_S^(s).
5. The zero-shot image classification method based on a relative-attribute random forest according to claim 1, characterised in that in step 3 an unknown-class image establishes relative relations with the known classes through the relations between attributes, and the attribute ranking score model of the unknown-class image is built accordingly, distinguishing the following three cases:
(1) if the m-th attribute of the unknown class lies between the m-th attributes of two known classes, and these are the two known classes nearest to the unknown class in the relative-attribute ranking, the m-th attribute ranking score of the image model is:
where i = 1, 2, ..., I, k = 1, 2, ..., K, j = 1, 2, ..., IK, I and K are the numbers of images of the two known classes, and the corresponding r terms are the m-th attribute ranking scores of those known classes;
(2) if the unknown class lies at the boundary and its m-th attribute is stronger than that of the known classes, the m-th attribute ranking score of the image model is:
where i = 1, 2, ..., I, j = 1, 2, ..., I, d_m denotes the mean difference between the attribute ranking scores of the training-class images, and μ_m denotes the mean of the m-th attribute;
(3) if the unknown class lies at the boundary and its m-th attribute is weaker than that of the known classes, the m-th attribute ranking score of the image model is:
where k = 1, 2, ..., K, j = 1, 2, ..., K.
6. The zero-shot image classification method based on a relative-attribute random forest according to claim 1, characterised in that in step 8 the probability that r^(u) belongs to class c is p(c | r^(u)) = (1/T) Σ_{t=1}^{T} p_t(c | r^(u)), where T is the number of random trees in the forest and p_t(c | r^(u)) is the class distribution at the leaf node reached in the t-th tree.
7. The zero-shot image classification method based on a relative-attribute random forest according to claim 6, characterised in that the class label of the test image in step 8 is the class with the highest probability, c* = argmax_c p(c | r^(u)).
CN201610465880.9A 2016-06-24 2016-06-24 Zero-shot image classification method based on relative-attribute random forest Active CN106096661B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610465880.9A CN106096661B (en) 2016-06-24 2016-06-24 Zero-shot image classification method based on relative-attribute random forest

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610465880.9A CN106096661B (en) 2016-06-24 2016-06-24 Zero-shot image classification method based on relative-attribute random forest

Publications (2)

Publication Number Publication Date
CN106096661A CN106096661A (en) 2016-11-09
CN106096661B true CN106096661B (en) 2019-03-01

Family

ID=57252659

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610465880.9A Active CN106096661B (en) 2016-06-24 2016-06-24 Zero-shot image classification method based on relative-attribute random forest

Country Status (1)

Country Link
CN (1) CN106096661B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3593284A4 (en) * 2017-03-06 2021-03-10 Nokia Technologies Oy A transductive and/or adaptive max margin zero-shot learning method and system
CN107563444A (en) * 2017-09-05 2018-01-09 浙江大学 A kind of zero sample image sorting technique and system
CN111079468B (en) * 2018-10-18 2024-05-07 珠海格力电器股份有限公司 Method and device for identifying object by robot
CN109886289A (en) * 2019-01-08 2019-06-14 深圳禾思众成科技有限公司 A kind of deep learning method, equipment and computer readable storage medium
CN110704662A (en) * 2019-10-17 2020-01-17 广东工业大学 Image classification method and system
CN111126049B (en) * 2019-12-14 2023-11-24 中国科学院深圳先进技术研究院 Object relation prediction method, device, terminal equipment and readable storage medium
CN111612047B (en) * 2020-04-29 2023-06-02 杭州电子科技大学 Zero sample image recognition method based on attribute feature vector and reversible generation model
CN111783531B (en) * 2020-05-27 2024-03-19 福建亿华源能源管理有限公司 Water turbine set fault diagnosis method based on SDAE-IELM
CN112257765B (en) * 2020-10-16 2022-09-23 济南大学 Zero sample image classification method and system based on unknown similarity class set
CN112990161A (en) * 2021-05-17 2021-06-18 江苏数兑科技有限公司 Electronic certificate identification method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101923650A (en) * 2010-08-27 2010-12-22 北京大学 Random forest classification method and classifiers based on comparison mode
CN103473231A (en) * 2012-06-06 2013-12-25 深圳先进技术研究院 Classifier building method and system
CN105512679A (en) * 2015-12-02 2016-04-20 天津大学 Zero sample classification method based on extreme learning machine

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101923650A (en) * 2010-08-27 2010-12-22 北京大学 Random forest classification method and classifiers based on comparison mode
CN103473231A (en) * 2012-06-06 2013-12-25 深圳先进技术研究院 Classifier building method and system
CN105512679A (en) * 2015-12-02 2016-04-20 天津大学 Zero sample classification method based on extreme learning machine

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Attribute relation learning for zero-shot classification; Mingxia Liu et al.; Neurocomputing; 2014-04-03; pp. 514-521 *
Zero-Shot Recognition with Unreliable Attributes; Dinesh Jayaraman et al.; Proceedings of Advances in Neural Information Processing Systems; 2014-11-30; pp. 3464-3472 *

Also Published As

Publication number Publication date
CN106096661A (en) 2016-11-09

Similar Documents

Publication Publication Date Title
CN106096661B (en) Zero-shot image classification method based on relative-attribute random forest
CN106897738B (en) A kind of pedestrian detection method based on semi-supervised learning
CN102314614B (en) Image semantics classification method based on class-shared multiple kernel learning (MKL)
CN108875816A (en) Merge the Active Learning samples selection strategy of Reliability Code and diversity criterion
CN104156734B (en) A kind of complete autonomous on-line study method based on random fern grader
CN108229550B (en) Cloud picture classification method based on multi-granularity cascade forest network
CN110569886A (en) Image classification method for bidirectional channel attention element learning
CN106779087A (en) A kind of general-purpose machinery learning data analysis platform
CN108399428A (en) A kind of triple loss function design method based on mark than criterion
CN112926405A (en) Method, system, equipment and storage medium for detecting wearing of safety helmet
CN104657718A (en) Face recognition method based on face image feature extreme learning machine
CN112149721B (en) Target detection method for reducing labeling requirements based on active learning
CN110688888B (en) Pedestrian attribute identification method and system based on deep learning
CN104966105A (en) Robust machine error retrieving method and system
CN114998220B (en) Tongue image detection and positioning method based on improved Tiny-YOLO v4 natural environment
CN110132263A (en) A kind of method for recognising star map based on expression study
CN112633382A (en) Mutual-neighbor-based few-sample image classification method and system
CN110110845B (en) Learning method based on parallel multi-level width neural network
CN104268552B (en) One kind is based on the polygonal fine classification sorting technique of part
WO2022062419A1 (en) Target re-identification method and system based on non-supervised pyramid similarity learning
CN108573274A (en) A kind of selective clustering ensemble method based on data stability
CN108595558A (en) A kind of image labeling method of data balancing strategy and multiple features fusion
CN112115829B (en) Expression recognition method based on classifier selective integration
CN107392254A (en) A kind of semantic segmentation method by combining the embedded structural map picture from pixel
CN109656808A (en) A kind of Software Defects Predict Methods based on hybrid active learning strategies

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant