CN107038456A - An image classification method based on L1-norm probabilistic linear discriminant analysis - Google Patents

An image classification method based on L1-norm probabilistic linear discriminant analysis

Info

Publication number
CN107038456A
Authority
CN
China
Prior art keywords
image
dimensionality reduction
class
distribution
norms
Prior art date
Legal status
Pending
Application number
CN201710178667.4A
Other languages
Chinese (zh)
Inventor
丁文鹏
胡向杰
孙艳丰
胡永利
Current Assignee
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date
Filing date
Publication date
Application filed by Beijing University of Technology
Priority to CN201710178667.4A
Publication of CN107038456A
Current legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213: Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2132: Feature extraction based on discrimination criteria, e.g. discriminant analysis
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415: Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate

Abstract

The present invention discloses an image classification method based on L1-norm probabilistic linear discriminant analysis, which solves the problem of outliers present in images. Unlike traditional PLDA, the present invention describes the noise with a Laplacian distribution, whose probability density function is based on the L1 norm and therefore does not amplify outliers. By introducing latent variables, the parameters of the model and the dimensionality-reduction matrices are solved with variational expectation-maximization. The matrix obtained after dimensionality reduction is treated as the sample's features; because this model describes the error with the L1 norm, the solved dimensionality-reduction matrices lie closer to the principal directions and can improve image classification performance.

Description

An image classification method based on L1-norm probabilistic linear discriminant analysis
Technical field
The invention belongs to the field of machine learning technology, and in particular relates to an image classification method based on L1-norm probabilistic linear discriminant analysis; it is especially suitable for classifying images that contain outliers.
Background art
In image processing, an image is often vectorized into high-dimensional data. However, high-dimensional data are frequently distributed on a low-dimensional subspace or manifold, so finding the mapping from high-dimensional data into a low-dimensional space has become a major issue for image classification. Over recent decades, data dimensionality-reduction algorithms have been studied in depth. Linear discriminant analysis (LDA) is a dimensionality-reduction method widely used in image classification: it maps high-dimensional data to a low-dimensional space with a projection matrix such that, after the mapping, the ratio of between-class scatter to within-class scatter is maximized. LDA is an algebraic method; algebraic solution methods rely only on the raw data, assume no parameters, and typically lack flexibility in expressing confidence in the model.
To overcome the defects of the algebraic approach, the probabilistic linear discriminant analysis (PLDA) method was proposed in 2007. In this model, the image data are represented as one-dimensional vectors, and the noise is assumed to follow a Gaussian distribution with zero mean and identity covariance matrix. PLDA applies probability theory to dimensionality reduction and solves for the dimensionality-reduction matrices. Under the assumption of Gaussian noise, the algorithm obtains good classification results.
When processing real images, however, the distribution of the noise is often complicated and does not necessarily follow a Gaussian distribution; when outliers are present in an image, the classification accuracy of the above methods suffers. This is because both LDA and PLDA are based on the L2 norm, and L2-norm methods have an obvious defect: the L2 norm greatly amplifies the outliers in an image, so when the image is projected to the low-dimensional space, the projection matrix solved by the algorithm deviates from the true principal direction.
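As a worked illustration of this defect: a single occluded pixel with residual magnitude 10 contributes 10² = 100 to an L2 objective but only 10 to an L1 objective, while a typical residual of magnitude 1 contributes 1 to both; under the L2 norm, one outlier can therefore outweigh dozens of clean pixels and drag the fitted projection toward itself.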
Summary of the invention
The technical problem to be solved by the present invention is to provide an image classification method based on L1-norm probabilistic linear discriminant analysis that can solve the problem of outliers present in images. Unlike traditional PLDA, this method describes the noise with a Laplacian distribution, whose probability density function is based on the L1 norm and therefore does not amplify outliers. By introducing latent variables, the parameters of the model and the dimensionality-reduction matrices are solved with variational expectation-maximization. The matrix obtained after dimensionality reduction is treated as the sample's features; because this model describes the error with the L1 norm, the solved dimensionality-reduction matrices lie closer to the principal directions and can improve image classification performance.
To achieve the above object, the present invention adopts the following technical scheme:
An image classification method based on L1-norm probabilistic linear discriminant analysis, comprising the following steps:
Step 1: Establish the probabilistic linear discriminant analysis (PLDA) model for the acquired images
Let X = {x_ij} be a set of independent, identically distributed images with I classes and J_i image samples in class i, where J_1 + ... + J_i + ... + J_I = N. Each sample is a column vector in R^D, D being the number of rows of the vectorized image. The PLDA model is expressed as:
x_ij = μ + F h_i + G w_ij + ε_ij    (1)
Wherein the vector x_ij denotes the j-th image of the i-th class in the sample set; μ (μ ∈ R^D) is the mean of the image set X; ε_ij is the error term; F (F ∈ R^(D×d)) and G (G ∈ R^(D×d)) are the between-class and within-class dimensionality-reduction matrices respectively, and d ≤ D is the number of columns after dimensionality reduction; h_i (h_i ∈ R^d) and w_ij (w_ij ∈ R^d) are the latent-variable cores of x_ij, i.e., coefficient vectors, R^d being the space of image feature vectors after dimensionality reduction. h_i is the identity variable: it is shared by the i-th class and represents the commonality of images of the same class. w_ij is the within-class variable and represents the individuality of each image within its class;
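The following NumPy sketch illustrates the generative process of equation (1). The dimensions, class counts, and the use of unit-scale Laplacian noise are illustrative assumptions for this sketch, not values fixed by the invention.

```python
# A minimal sketch of the PLDA generative model in equation (1):
# x_ij = mu + F h_i + G w_ij + eps_ij.
import numpy as np

rng = np.random.default_rng(0)
D, d, I, J = 64, 8, 5, 10          # ambient dim, latent dim, classes, samples per class

mu = rng.normal(size=D)            # global mean of the image set
F = rng.normal(size=(D, d))        # between-class dimensionality-reduction matrix
G = rng.normal(size=(D, d))        # within-class dimensionality-reduction matrix

X = np.empty((I, J, D))
for i in range(I):
    h_i = rng.normal(size=d)       # identity variable, shared by all of class i
    for j in range(J):
        w_ij = rng.normal(size=d)  # within-class variable, differs per sample
        eps = rng.laplace(scale=1.0, size=D)  # L1-PLDA assumes Laplacian noise
        X[i, j] = mu + F @ h_i + G @ w_ij + eps
```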
Step 2: Establish the L1-PLDA model
Under the condition that outliers are present, the samples are reduced in dimension; after the samples are centered, the PLDA model can be expressed as:
x_ij = B y_ij + ε_ij    (2)
Wherein the between-class and within-class dimensionality-reduction matrices are merged, i.e., B = [F, G] (B ∈ R^(D×2d)); the coefficients of the dimensionality-reduction matrices are likewise merged and denoted y_ij = [h_i; w_ij], representing the latent variable of the j-th sample of the i-th class. The error term ε_ij obeys an L1-norm Laplacian distribution and has the same dimension as the sample data (R^D). Assuming each element of the noise vector follows an independent, identically distributed Laplacian distribution, its probability density function is:
p(ε_ij | σ) = (1/(2σ))^D exp(−‖ε_ij‖₁ / σ)    (3)
Wherein ‖ε_ij‖₁ is the sum of the absolute values of the elements of the error vector and σ is the scale parameter of the error term ε_ij.
For the noise term, the parameter σ is given a prior distribution: assume it obeys a gamma distribution and let ρ = 1/σ²; then ρ is distributed as:
p(ρ) = Gamma(ρ | a_ρ, b_ρ)    (4)
Wherein a_ρ is the shape parameter and b_ρ is the scale parameter of ρ;
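A small sketch of why the Laplacian density in equation (3) tolerates outliers: its log-density penalizes the L1 norm of the error, so a single large element grows the penalty linearly rather than quadratically. The vector size and scale below are illustrative.

```python
# Compare the log-density drop caused by one outlying pixel under the
# Laplacian (L1) and Gaussian (L2) noise models.
import numpy as np

def laplace_logpdf(eps, sigma):
    D = eps.size
    return -D * np.log(2 * sigma) - np.abs(eps).sum() / sigma

def gauss_logpdf(eps, sigma):
    D = eps.size
    return -0.5 * D * np.log(2 * np.pi * sigma**2) - (eps**2).sum() / (2 * sigma**2)

clean = np.zeros(100)
dirty = clean.copy()
dirty[0] = 10.0                     # one outlying pixel
for f in (laplace_logpdf, gauss_logpdf):
    print(f.__name__, f(clean, 1.0) - f(dirty, 1.0))
# Laplacian penalty grows by 10; Gaussian penalty grows by 50.
```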
Step 3: Solve the L1-PLDA model by variational expectation-maximization
The variables in model (2) are estimated with the maximum-likelihood function to solve for the dimensionality-reduction matrices {F, G}.
Denote y = {y_11, ..., y_1J1; ...; y_i1, ..., y_iJi; ...; y_I1, ..., y_IJI} as the set of all latent variables in the model and Θ = {B, a_ρ, b_ρ} as the set of model parameters.
The distribution in equation (3) can be expanded into the form of an infinite sum (mixture) of Gaussian distributions, i.e.:
p(ε_ij | σ) = ∫ N(ε_ij | 0, β) η(β) dβ    (5)
Wherein β is the latent scale of the Gaussian components (it also serves as the outlier-detection operator for the image), and η(β) is the mixing density that supplies the summation coefficients.
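The expansion in equation (5) can be checked numerically. The patent does not spell out η(β); the sketch below assumes the standard exponential mixing density with rate 1/(2σ²) (per Andrews and Mallows), under which the Gaussian mixture integrates exactly to the Laplacian of equation (3).

```python
# Numerical check that a Laplacian is an infinite Gaussian scale mixture.
import numpy as np
from scipy import integrate

sigma = 1.5
lam = 1.0 / (2 * sigma**2)          # rate of the assumed exponential mixing density

def mixture_pdf(x):
    integrand = lambda b: (np.exp(-x**2 / (2 * b)) / np.sqrt(2 * np.pi * b)
                           * lam * np.exp(-lam * b))
    val, _ = integrate.quad(integrand, 0, np.inf)
    return val

def laplace_pdf(x):
    return np.exp(-abs(x) / sigma) / (2 * sigma)

for x in (0.3, 1.0, 4.0):
    print(x, mixture_pdf(x), laplace_pdf(x))   # the two columns agree
```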
Under the assumptions of equations (3)-(5), for a given image set X, the objective function of the model can be obtained by integrating over the latent variables y, i.e., the marginal likelihood of the parameters Θ:
The lower bound Ω(Q(y, ρ, β), Θ) of the model's objective function is maximized with the EM algorithm: the E-step updates the posterior distributions of y, ρ and β, and the M-step updates the matrix {B}. The E-step and M-step alternate, and the iteration stops when the increase of the objective function levels off; the value of β is output, and the dimensionality-reduction matrices are extracted from the matrix {B};
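For intuition about what the E/M alternation buys, here is a minimal one-dimensional analogue under the scale-mixture view: the E-step expectation E[1/β | residual] is proportional to 1/|residual|, so the M-step becomes a weighted mean, and the iterates move toward the median, the robust L1 solution. This sketch is an illustrative analogue, not the invention's full updates for {B}, y, ρ and β.

```python
# EM/IRLS for a 1-D location model with Laplacian noise: the weighted-mean
# iterates converge to the median, which ignores the planted outliers.
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(5.0, 1.0, 95), np.full(5, 100.0)])  # 5 outliers

mu = x.mean()                       # start from the (outlier-biased) mean
for _ in range(50):
    r = np.abs(x - mu)
    w = 1.0 / np.maximum(r, 1e-8)   # E-step: weights ~ E[1/beta | residual]
    mu = (w * x).sum() / w.sum()    # M-step: weighted mean

print(mu, np.median(x), x.mean())   # mu ends near the median, far from the mean
```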
Step 4: Image classification
Based on the dimensionality-reduction matrices F and G, images are classified with a similarity measure: the similarity of two images is computed, and when this value is below a given threshold, the two images are ruled to belong to the same class.
Preferably, for a model whose likelihood function involves latent variables, the expectation-maximization (EM) algorithm is generally used; the lower bound of the model's objective function is:
Wherein Q(y, ρ, β) is an approximating distribution of the posterior of the latent variables y and the parameters ρ and β, and Ω(Q(y, ρ, β), Θ) is the lower bound of the objective function.
Preferably, the similarity of two images is computed as follows:
Given images x_1 and x_2, the probability distributions p(x_1) and p(x_2) of the two images are:
p(x_1) = G'(0, BB^T + β)
p(x_2) = G'(0, BB^T + β)
Wherein G' denotes the Gaussian distribution function (written G' to distinguish it from the within-class matrix G).
The joint probability distribution p(x_1, x_2) of x_1 and x_2 is computed as:
p(x_1, x_2) = G'(0, AA^T + β')
Wherein A stacks the dimensionality-reduction matrices of the image pair so that the identity variable is shared while the within-class variables are separate (in the standard PLDA formulation, A = [F G 0; F 0 G]), and β' is the corresponding noise covariance of the stacked vector.
β and the matrix A are assembled from the dimensionality-reduction matrices F and G computed by the EM algorithm, and the similarity R of the two images is computed; in the standard PLDA formulation this is the likelihood ratio R = p(x_1, x_2) / (p(x_1) p(x_2)).
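A hedged sketch of this similarity score: it builds the marginal covariance BB^T + βI and the joint covariance from the stacked A, then scores a pair by the log likelihood ratio. The isotropic noise βI and the block structure of A are our reading of the standard PLDA formulation (Prince, 2007), since the patent gives the definitions of A and β' only in figures.

```python
# PLDA-style pair scoring: log R = log p(x1,x2) - log p(x1) - log p(x2).
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
D, d = 20, 4
F, G = rng.normal(size=(D, d)), rng.normal(size=(D, d))
beta = 0.5                                   # isotropic noise variance (assumed)

B = np.hstack([F, G])                        # B = [F, G]
Sigma1 = B @ B.T + beta * np.eye(D)          # covariance of a single image
Z = np.zeros((D, d))
A = np.block([[F, G, Z], [F, Z, G]])         # shared h, separate w1, w2
Sigma2 = A @ A.T + beta * np.eye(2 * D)      # joint covariance, same identity

def log_similarity(x1, x2):
    joint = multivariate_normal.logpdf(np.concatenate([x1, x2]),
                                       mean=np.zeros(2 * D), cov=Sigma2)
    marg = (multivariate_normal.logpdf(x1, mean=np.zeros(D), cov=Sigma1)
            + multivariate_normal.logpdf(x2, mean=np.zeros(D), cov=Sigma1))
    return joint - marg                      # log R; threshold to decide the class

h = rng.normal(size=d)
same = [F @ h + G @ rng.normal(size=d) for _ in range(2)]
diff = F @ rng.normal(size=d) + G @ rng.normal(size=d)
print(log_similarity(*same), log_similarity(same[0], diff))
# The same-identity pair typically scores higher than the mixed pair.
```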
In the present invention, the solution algorithm mainly addresses the sensitivity to outliers in images. On the basis of PLDA, the present invention replaces the Gaussian distribution with a Laplacian distribution as the assumed noise distribution. Because the L1 norm is insensitive to outliers, the projection directions found for the images are closer to the principal directions.
Brief description of the drawings
Fig. 1 is the flow chart of the image classification method based on L1-norm probabilistic linear discriminant analysis;
Fig. 2 is a schematic diagram of outlier detection on a face database in the embodiment;
Fig. 3 is a schematic diagram of the classification results on the Feret database.
Detailed description of the embodiments
As shown in Fig. 1, an embodiment of the present invention provides an image classification method based on L1-norm probabilistic linear discriminant analysis, comprising the following steps:
Step 1: Establish the probabilistic linear discriminant analysis (PLDA) model for the acquired images
Let X = {x_ij} be a set of independent, identically distributed images with I classes and J_i image samples in class i, where J_1 + ... + J_i + ... + J_I = N. Each sample is a column vector in R^D, D being the number of rows of the vectorized image. The PLDA model is expressed as:
x_ij = μ + F h_i + G w_ij + ε_ij    (1)
Wherein the vector x_ij denotes the j-th image of the i-th class in the sample set; μ (μ ∈ R^D) is the mean of the image set X (in actual operation the data are preprocessed by subtracting the mean, so that the data are centered); F (F ∈ R^(D×d)) and G (G ∈ R^(D×d)) are the between-class and within-class dimensionality-reduction matrices respectively, and d ≤ D is the number of columns after dimensionality reduction; h_i (h_i ∈ R^d) and w_ij (w_ij ∈ R^d) are the latent-variable cores of x_ij, i.e., coefficient vectors, R^d being the space of image feature vectors after dimensionality reduction; ε_ij is the error term. In the PLDA model, the following prior assumptions are made for the latent variables:
h_i ~ N(h_i | 0, I_d)
w_ij ~ N(w_ij | 0, I_d)
Wherein h_i is the identity variable: for the i-th class it remains constant and represents the commonality of images of the same class, so each class of images has exactly one identity variable. w_ij is the within-class variable: it represents the individuality of each image within the class and changes from sample to sample within the i-th class. I_d is the d × d identity matrix.
Step 2: Establish the L1-PLDA model
The present invention recognizes that the Gaussian distribution of PLDA is based on the L2 norm, whose quadratic term exaggerates the effect of outliers, so the principal direction PLDA finds for such data is unreasonable. In the L1-PLDA model proposed by the present invention, the first-order term is not swayed by outliers, so the principal direction found is more reasonable.
The present invention considers that, under the condition that outliers are present, the samples are reduced in dimension; after the samples are centered, the model can be expressed as:
x_ij = B y_ij + ε_ij    (2)
Wherein the between-class and within-class dimensionality-reduction matrices are merged, i.e., B = [F, G] (B ∈ R^(D×2d)); the coefficients of the dimensionality-reduction matrices can likewise be merged and denoted y_ij = [h_i; w_ij], representing the latent variable of the j-th sample of the i-th class. The error term ε_ij obeys an L1-norm Laplacian distribution and has the same dimension as the sample data (R^D). Assuming each element of the noise vector follows an independent, identically distributed Laplacian distribution, its probability density function is:
p(ε_ij | σ) = (1/(2σ))^D exp(−‖ε_ij‖₁ / σ)    (3)
Wherein ‖ε_ij‖₁ is the sum of the absolute values of the elements of the error vector and σ is the scale parameter of the error term ε_ij.
For the noise term, the parameter σ is given a prior distribution: assume it obeys a gamma distribution and, for convenience of calculation, let ρ = 1/σ²; then ρ is distributed as:
p(ρ) = Gamma(ρ | a_ρ, b_ρ)    (4)
Here a_ρ is the shape parameter and b_ρ is the scale parameter of ρ.
Step 3: Solve the L1-PLDA model by variational expectation-maximization
The variables in model (2) are estimated with the maximum-likelihood function; once the dimensionality-reduction matrices {F, G} are solved, the samples can be classified.
Denote y = {y_11, ..., y_1J1; ...; y_i1, ..., y_iJi; ...; y_I1, ..., y_IJI} as the set of all latent variables in the model and Θ = {B, a_ρ, b_ρ} as the set of model parameters.
The distribution in equation (3) can be expanded into the form of an infinite sum (mixture) of Gaussian distributions, i.e.:
p(ε_ij | σ) = ∫ N(ε_ij | 0, β) η(β) dβ    (5)
Wherein β is the latent scale of the Gaussian components (it also serves as the outlier-detection operator for the image), and η(β) is the mixing density that supplies the summation coefficients.
Under the assumptions of equations (3)-(5), for a given image set X, the objective function of the model can be obtained by integrating over the latent variables y, i.e., the marginal likelihood of the parameters Θ:
For this model, in which the likelihood function must be solved in the presence of latent variables, the expectation-maximization (EM) algorithm is generally used. The EM algorithm finds a lower bound of the likelihood function through Jensen's inequality and then maximizes the lower-bound function to approach the actual value of the likelihood. The lower bound of the objective function is:
Wherein the second row uses Jensen's inequality; Q(y, ρ, β) is an approximating distribution of the posterior of the latent variables y and the parameters ρ and β; Ω(Q(y, ρ, β), Θ) is the lower bound of the objective function.
The function Ω(Q(y, ρ, β), Θ) is maximized with the EM algorithm: the E-step updates the posterior distributions of y, ρ and β, and the M-step updates the matrix {B}. The E-step and M-step alternate, and the iteration stops when the increase of the objective function levels off; the value of β is output, and the dimensionality-reduction matrices F, G are extracted from the matrix {B}. Through β, when outliers exist in the data, the outliers of the image are detected.
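As a sketch of this last point: if β is read as a per-pixel scale, its inflated entries flag the pixels the Laplacian model treats as outliers, which is how the occlusion in Fig. 2 can be localized. The β values below are synthetic stand-ins for the EM output.

```python
# Thresholding a per-pixel beta map to localize an occlusion block.
import numpy as np

side = 64
beta = np.abs(np.random.default_rng(3).normal(0.1, 0.02, size=side * side))
beta[1000:1200] = 5.0                        # pretend EM inflated these pixels

mask = (beta > beta.mean() + 3 * beta.std()).reshape(side, side)
ys, xs = np.nonzero(mask)
print("occlusion box:", ys.min(), ys.max(), xs.min(), xs.max())
```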
Step 4: Image classification
Images are classified with a similarity measure: the similarity of two images is computed, and when this value is below a given threshold, the two images are ruled to belong to the same class. The similarity of two images is computed as follows:
Given images x_1 and x_2, the probability distributions p(x_1) and p(x_2) of the two images are:
p(x_1) = G'(0, BB^T + β)
p(x_2) = G'(0, BB^T + β)
Wherein G' denotes the Gaussian distribution function (written G' to distinguish it from the within-class matrix G).
The joint probability distribution p(x_1, x_2) of x_1 and x_2 is computed as:
p(x_1, x_2) = G'(0, AA^T + β')
Wherein A stacks the dimensionality-reduction matrices of the image pair so that the identity variable is shared while the within-class variables are separate (in the standard PLDA formulation, A = [F G 0; F 0 G]), and β' is the corresponding noise covariance of the stacked vector.
β and the matrix A are assembled from the dimensionality-reduction matrices F and G computed by the EM algorithm, and the similarity R of the two images is computed; in the standard PLDA formulation this is the likelihood ratio R = p(x_1, x_2) / (p(x_1) p(x_2)).
Embodiment 1
The experimental results of the image classification method of the present invention are as follows:
1. L1-PLDA outlier-detection results
For face databases, taking the Yale database as an example, the detection effect of β is illustrated. The experimental results in Fig. 2 show that when one occlusion block is present on a face, the image displaying β accurately finds the position of the occlusion block.
2. L1-PLDA classification results
The present invention applies the L1-PLDA algorithm to classification on face databases and achieves a clear effect, illustrated here with the Yale and Feret databases.
1) The Yale database has 15 classes with 11 pictures per person. In the experiments, 8 pictures are used for training and 3 for testing, with 2 pictures per class in the training set containing occlusion blocks; the images are uniformly resized to 64 × 64 grayscale. The method of the present invention (L1-PLDA) is compared with three other methods, namely PLDA, Fisher linear discriminant analysis (LDA), and L1-norm-based linear discriminant analysis (LDA-L1); the present method has a clear advantage in classification, as shown in Table 1.
Table 1: Yale database classification results
2) 50 classes are randomly selected from the Feret database, with 7 pictures per person. In the experiments, 5 pictures are used for training and 2 for testing, with 1 picture per class in the training set containing an occlusion block; the images are uniformly resized to 50 × 50 grayscale. The method of the present invention (L1-PLDA) is compared with PLDA, LDA and LDA-L1; the present method has a clear advantage in classification, as shown in Fig. 3.

Claims (3)

1. An image classification method based on L1-norm probabilistic linear discriminant analysis, characterized in that it comprises the following steps:
Step 1: Establish the probabilistic linear discriminant analysis (PLDA) model for the acquired images
Let X = {x_ij} be a set of independent, identically distributed images with I classes and J_i image samples in class i, where J_1 + ... + J_i + ... + J_I = N. Each sample is a column vector in R^D, D being the number of rows after image vectorization. The PLDA model is expressed as:
x_ij = μ + F h_i + G w_ij + ε_ij    (1)
Wherein the vector x_ij denotes the j-th image of the i-th class in the sample set; μ (μ ∈ R^D) is the mean of the image set X; ε_ij is the error term; F (F ∈ R^(D×d)) and G (G ∈ R^(D×d)) are the between-class and within-class dimensionality-reduction matrices respectively, and d ≤ D is the number of columns after dimensionality reduction; h_i (h_i ∈ R^d) and w_ij (w_ij ∈ R^d) are the latent-variable cores of x_ij, i.e., coefficient vectors, R^d being the space of image feature vectors after dimensionality reduction; h_i is the identity variable, which is shared by the i-th class and represents the commonality of images of the same class, and w_ij is the within-class variable, which represents the individuality of each image within its class;
Step 2: Establish the L1-PLDA model
Under the condition that outliers are present, the samples are reduced in dimension; after the samples are centered, the PLDA model can be expressed as:
x_ij = B y_ij + ε_ij    (2)
Wherein the between-class and within-class dimensionality-reduction matrices are merged, i.e., B = [F, G] (B ∈ R^(D×2d)); the coefficients of the dimensionality-reduction matrices are merged and denoted y_ij = [h_i; w_ij], representing the latent variable of the j-th sample of the i-th class; the error term ε_ij obeys an L1-norm Laplacian distribution and has the same dimension as the sample data (R^D); assuming each element of the noise vector follows an independent, identically distributed Laplacian distribution, its probability density function is:
p(ε_ij | σ) = (1/(2σ))^D exp(−‖ε_ij‖₁ / σ)    (3)
Wherein ‖ε_ij‖₁ is the sum of the absolute values of the elements of the error vector and σ is the scale parameter of the error term ε_ij;
For the noise term, the parameter σ is given a prior distribution: assume it obeys a gamma distribution and let ρ = 1/σ²; then ρ is distributed as:
p(ρ) = Gamma(ρ | a_ρ, b_ρ)    (4)
Wherein a_ρ is the shape parameter and b_ρ is the scale parameter of ρ;
Step 3: Solve the L1-PLDA model by variational expectation-maximization
The variables in model (2) are estimated with the maximum-likelihood function to solve for the dimensionality-reduction matrices {F, G};
Denote y = {y_11, ..., y_1J1; ...; y_i1, ..., y_iJi; ...; y_I1, ..., y_IJI} as the set of all latent variables in the model and Θ = {B, a_ρ, b_ρ} as the set of model parameters;
The distribution in equation (3) can be expanded into the form of an infinite sum (mixture) of Gaussian distributions, i.e.:
p(ε_ij | σ) = ∫ N(ε_ij | 0, β) η(β) dβ    (5)
Wherein β is the latent scale of the Gaussian components (it also serves as the outlier-detection operator for the image), and η(β) is the mixing density that supplies the summation coefficients;
Under the assumptions of equations (3)-(5), for a given image set X, the objective function of the model can be obtained by integrating over the latent variables y with respect to the parameters Θ:
The lower bound Ω(Q(y, ρ, β), Θ) of the model's objective function is maximized with the EM algorithm: the E-step updates the posterior distributions of y, ρ and β, and the M-step updates the matrix {B}; the E-step and M-step alternate, and the iteration stops when the increase of the objective function levels off; the value of β is output, and the dimensionality-reduction matrices are extracted from the matrix {B};
Step 4: Image classification
Based on the dimensionality-reduction matrices F and G, images are classified with a similarity measure: the similarity of two images is computed, and when this value is below a given threshold, the two images are ruled to belong to the same class.
2. The image classification method based on L1-norm probabilistic linear discriminant analysis as claimed in claim 1, characterized in that, for a model whose likelihood function involves latent variables, the expectation-maximization (EM) algorithm is generally used; the lower bound of the model's objective function is:
Wherein Q(y, ρ, β) is an approximating distribution of the posterior of the latent variables y and the parameters ρ and β, and Ω(Q(y, ρ, β), Θ) is the lower bound of the objective function.
3. The image classification method based on L1-norm probabilistic linear discriminant analysis as claimed in claim 1, characterized in that the similarity of two images is computed as follows:
Given images x_1 and x_2, the probability distributions p(x_1) and p(x_2) of the two images are:
p(x_1) = G'(0, BB^T + β)
p(x_2) = G'(0, BB^T + β)
Wherein G' denotes the Gaussian distribution function.
The joint probability distribution p(x_1, x_2) of x_1 and x_2 is computed as:
p(x_1, x_2) = G'(0, AA^T + β')
Wherein A stacks the dimensionality-reduction matrices of the image pair and β' is the corresponding noise covariance of the stacked vector;
β and the matrix A are assembled from the dimensionality-reduction matrices F and G computed by the EM algorithm, and the similarity R of the two images is computed.
CN201710178667.4A 2017-03-23 2017-03-23 An image classification method based on L1-norm probabilistic linear discriminant analysis Pending CN107038456A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710178667.4A CN107038456A (en) 2017-03-23 2017-03-23 An image classification method based on L1-norm probabilistic linear discriminant analysis


Publications (1)

Publication Number Publication Date
CN107038456A true CN107038456A (en) 2017-08-11

Family

ID=59534171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710178667.4A Pending CN107038456A (en) An image classification method based on L1-norm probabilistic linear discriminant analysis

Country Status (1)

Country Link
CN (1) CN107038456A (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101315663A (en) * 2008-06-25 2008-12-03 中国人民解放军国防科学技术大学 Natural scene image classification method based on regional latent semantic features
CN104700117A (en) * 2015-03-16 2015-06-10 北京工业大学 Two-dimensional probabilistic principal component analysis method
CN105095856A (en) * 2015-06-26 2015-11-25 上海交通大学 Method for recognizing occluded faces based on a mask layer
CN105389343A (en) * 2015-10-23 2016-03-09 北京工业大学 Vectorized dimensionality reduction method
CN105469101A (en) * 2015-12-31 2016-04-06 北京工业大学 Mixed two-dimensional probabilistic principal component analysis method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SIMON J.D. PRINCE et al.: "Probabilistic Linear Discriminant Analysis for Inferences About Identity", 2007 IEEE 11th International Conference on Computer Vision *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107704887A (en) * 2017-10-20 2018-02-16 北京工业大学 An image recognition method based on F-norm locality preserving projections
CN107704887B (en) * 2017-10-20 2021-04-02 北京工业大学 Image recognition method based on F-norm locality preserving projections
CN108492404A (en) * 2018-02-07 2018-09-04 上海灵至科技有限公司 A face-lock unlocking method and device with additional expression recognition
CN108492404B (en) * 2018-02-07 2020-10-09 上海灵至科技有限公司 Face-lock unlocking method and device with additional expression recognition function
CN110097117A (en) * 2019-04-30 2019-08-06 哈尔滨工程大学 Data classification method based on linear discriminant analysis and multivariate adaptive splines
CN110097117B (en) * 2019-04-30 2023-12-12 哈尔滨工程大学 Data classification method based on linear discriminant analysis and multivariate adaptive splines
CN110147792A (en) * 2019-05-22 2019-08-20 齐鲁工业大学 High-speed detection system and method for drug-packaging characters based on memory optimization
CN110147792B (en) * 2019-05-22 2021-05-28 齐鲁工业大学 High-speed detection system and method for drug-packaging characters based on memory optimization

Similar Documents

Publication Publication Date Title
CN109801337A A 6D pose estimation method based on instance segmentation network and iterative optimization
CN104636721B A palmprint recognition method based on fusion of contour and edge-texture features
CN105139004B Facial expression recognition method based on video sequences
CN107506702A Multi-angle-based face recognition model training and testing system and method
CN107038456A An image classification method based on L1-norm probabilistic linear discriminant analysis
CN105809651B Image saliency detection method based on edge dissimilarity comparison
CN109740603A Vehicle character recognition method based on CNN convolutional neural networks
CN105740842A Unsupervised face recognition method based on a fast density clustering algorithm
CN109615014A A data classification system and method based on KL-divergence optimization
CN103984953A Cityscape image semantic segmentation method based on multi-feature fusion and boosting decision forests
CN107808129A A facial multi-landmark localization method based on a single convolutional neural network
Aditya et al. Batik classification using neural network with gray level co-occurrence matrix and statistical color feature extraction
CN104834941A Offline handwriting recognition method using sparse autoencoders based on computer input
CN105224937A Fine-grained semantic-color pedestrian re-identification method based on human part position constraints
CN104077605A Pedestrian search and recognition method based on color topological structure
CN108108760A A fast face recognition method
CN106874879A Handwritten digit recognition method based on multi-feature fusion and deep-network feature extraction
CN108268865A License plate recognition method and system for natural scenes based on cascaded convolutional networks
Obaidullah et al. A system for handwritten script identification from Indian documents
CN109583493A A credit card detection and digit recognition method based on deep learning
CN105809113A Three-dimensional face identification method and data processing apparatus using the same
CN107886066A A pedestrian detection method based on improved HOG-SSLBP
Pratikakis et al. Partial 3D object retrieval combining local shape descriptors with global Fisher vectors
CN105069403B A 3D human ear recognition method based on block statistical features and dictionary-learning sparse-representation classification
CN106548195A An object detection method based on improved HOG-ULBP feature operators

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20170811