CN110555386A - Face recognition identity authentication method based on dynamic Bayes - Google Patents
Info
- Publication number
- CN110555386A CN110555386A CN201910710960.XA CN201910710960A CN110555386A CN 110555386 A CN110555386 A CN 110555386A CN 201910710960 A CN201910710960 A CN 201910710960A CN 110555386 A CN110555386 A CN 110555386A
- Authority
- CN
- China
- Prior art keywords
- face
- matrix
- training
- steps
- following
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
- G06F18/2135—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G06F18/24155—Bayesian classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
Abstract
A face recognition identity authentication method based on dynamic Bayes comprises the following steps: first, a camera captures and tracks a target face in real time, and histogram equalization is used to preprocess the captured face image; next, the preprocessed face image data are extracted and reduced in dimension with Principal Component Analysis (PCA); finally, the detection target is classified with a dynamic Bayesian classification algorithm and the final recognition result is output. The method can rapidly extract face feature values and reduce the influence of illumination on the face image. The PCA algorithm reduces the dimension of the high-dimensional data and removes noise. In addition, the traditional naive Bayesian classification algorithm assumes that attributes are independent, which lowers classification accuracy in complex network structures; the dynamic Bayesian classification algorithm adopted here determines the optimal Bayesian network structure through crossover and mutation in an iterative process, significantly improving overall recognition accuracy.
Description
Technical Field
The invention relates to the technical field of biometric recognition, and in particular to a face recognition identity authentication method based on dynamic Bayes.
Background
Face recognition is a biometric technology that performs identity recognition from facial feature information, distinguishing individuals by their biological characteristics (typically human). It has several significant advantages: it is contactless and interactive, enjoys high user acceptance, is intuitive and consistent with how people recognize one another by appearance, adapts well, is hard to counterfeit, is secure, and is easy to popularize. Face recognition is achieved by capturing an image or video stream containing a face with a camera or video camera, automatically detecting and tracking the face in the image, and then analyzing, matching and recognizing the detected face; these stages are generally called face detection and face recognition.
From the research results at home and abroad, the difficulty of face recognition mainly focuses on the aspects of face image capture and tracking, face image preprocessing, face image feature extraction, face image matching and recognition and the like:
In the aspect of face image capture and tracking, changes in illumination, pose, clothing and the like can cause abnormal capture or loss of tracking.
In the aspect of face image preprocessing, due to the 3D structure of the face, the shadow cast by illumination can strengthen or weaken the original face features. Especially at night, the face shadow caused by insufficient light causes a sharp drop in the recognition rate, making it difficult for the system to meet practical requirements.
In the aspect of face image feature extraction, common face recognition systems struggle to overcome the face dimensionality disaster and to extract key facial feature points, giving low recognition efficiency and high time consumption.
In the aspect of face image matching and recognition, the algorithms adopted by typical face recognition systems are inefficient and their recognition rates are not ideal.
Face recognition is a classification problem. Compared with other classification methods, Bayesian classification theoretically achieves the lowest error rate; in practice, however, attributes are not necessarily independent, and if the assumed prior model is chosen poorly, recognition accuracy drops. Training a Bayesian classifier is a process of determining the dependency relationships among attributes, carried out over continuous iterations, and the finally obtained Bayesian network structure directly affects classification accuracy.
Disclosure of Invention
The invention aims to overcome the above technical defects and shortcomings by providing a face recognition identity authentication method based on dynamic Bayes. The invention applies the dynamic Bayesian algorithm to the field of face recognition, with a strong theoretical basis and technical feasibility. The dynamic Bayesian classification algorithm determines the optimal Bayesian network structure through crossover and mutation in the iterative process, and can remarkably improve overall recognition accuracy.
The technical scheme adopted by the invention is as follows:
A face recognition identity authentication method based on dynamic Bayes comprises the following steps:
Step 1, a camera captures and tracks a target face in real time and performs graying processing on the image to obtain a grayscale image sequence;
Step 2, carrying out histogram equalization processing on the gray face image;
Step 3, extracting pixel point data of the face image after the equalization processing, adopting PCA (principal component analysis) to reduce dimensions, extracting a face image characteristic value, and constructing a face characteristic database;
Step 4, training and identifying by using a dynamic Bayesian classification algorithm;
Step 5, outputting the recognition result.
The process of capturing and tracking the target face in the step 1 is divided into 2 methods:
The first method: a Haar classifier detects the face region in real time. During detection, a sub-window continuously shifts and slides across the picture window to be detected; at each position the features of that region are computed and screened with the trained classifiers, and once the region passes the screening of all strong classifiers, it is judged to be a face. Specifically, the method comprises the following steps:
Step 1.1, calculating Haar features. A Haar feature is presented as a pair of black and white rectangles in the detection window and is defined as the difference between the sum of the white-rectangle pixels and the sum of the black-rectangle pixels;
Step 1.2, accelerating the Haar feature computation. Haar features are computed quickly with an integral image, in which I(x, y) stores the sum of all pixels above and to the left of coordinate (x, y):
sum = I(M4) - I(M2) - I(M3) + I(M1)
where M1, M2, M3, M4 denote the top-left, top-right, bottom-left and bottom-right corner points of the detection window, as shown in fig. 1.
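A sketch of the integral-image computation (illustrative, not from the patent; it assumes M1–M4 are the top-left, top-right, bottom-left and bottom-right corners and uses the standard four-corner combination):

```python
import numpy as np

def integral_image(img):
    # Pad with a zero row/column so corner lookups need no bounds checks:
    # ii[y, x] = sum of img[0:y, 0:x]
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, y0, x0, y1, x1):
    # Sum over img[y0:y1, x0:x1] from four corner lookups:
    # I(M4) - I(M2) - I(M3) + I(M1)
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

def haar_two_rect(ii, y0, x0, h, w):
    # Two-rectangle Haar feature: white (left half) minus black (right half).
    white = rect_sum(ii, y0, x0, y0 + h, x0 + w // 2)
    black = rect_sum(ii, y0, x0 + w // 2, y0 + h, x0 + w)
    return white - black
```

With the integral image precomputed, any rectangle sum costs four lookups regardless of rectangle size, which is what makes sliding-window Haar evaluation fast.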
Step 1.3, training a strong classifier for distinguishing faces from non-faces with the AdaBoost algorithm. Each AdaBoost iteration is divided into the following steps:
Step 1.3.1, initializing the weight distribution of the training data. With N training samples, each sample starts with weight 1/N.
Step 1.3.2, training a weak classifier. During training, if a sample point is classified correctly, its weight is decreased in the next training set; conversely, if a sample point is misclassified, its weight is increased.
Step 1.3.3, combining the weak classifiers into a strong classifier. After each weak classifier is trained, the weights of weak classifiers with small classification error rates are increased so that they play a decisive role in the final classification function.
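A minimal sketch of the AdaBoost loop in steps 1.3.1–1.3.3 (illustrative only; simple threshold stumps stand in for the Haar-feature weak classifiers of the patent):

```python
import numpy as np

def adaboost_train(X, y, n_rounds=10):
    # X: (n, d) features; y: labels in {-1, +1}.
    n = len(y)
    w = np.full(n, 1.0 / n)                    # step 1.3.1: uniform weights
    ensemble = []
    for _ in range(n_rounds):
        best = None
        # step 1.3.2: pick the threshold stump with the lowest weighted error
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-10)
        # accurate stumps get a large vote in the final classifier
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] >= thr, 1, -1)
        # misclassified samples get larger weight for the next round
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def adaboost_predict(ensemble, X):
    # step 1.3.3: strong classifier = sign of the weighted vote
    score = sum(a * s * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, s in ensemble)
    return np.sign(score)
```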
The second method comprises the following steps: the human face area is fixed, and the monitoring target is required to move the face to a specified area.
The histogram equalization process of step 2 comprises 3 sub-steps:
Step 2.1, quantizing the captured face picture into pixel data and counting the probability of each gray level of the histogram:
P_r(r_i) = n_i / n,  i = 0, 1, ..., k-1
where k is the number of gray levels in the image, n_i is the number of pixels with gray value r_i, n is the total number of pixels, and P_r(r_i) is the probability of gray value r_i;
Step 2.2, calculating the cumulative distribution function of each gray level:
s_k = T(r_k) = Σ_{j=0}^{k} n_j / n
where k is the index of the current gray level, n is the total number of pixels in the image, n_j is the number of pixels at gray level j, and L is the total number of gray levels in the image;
Step 2.3, performing mapping conversion according to the cumulative distribution function and calculating the new pixel values.
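Steps 2.1–2.3 can be sketched as follows (illustrative, assuming 8-bit grayscale input):

```python
import numpy as np

def equalize_histogram(gray, levels=256):
    # Step 2.1: probability of each gray level
    hist = np.bincount(gray.ravel(), minlength=levels)
    p = hist / gray.size
    # Step 2.2: cumulative distribution function s_k
    cdf = np.cumsum(p)
    # Step 2.3: map each level r_k to the new value round((L-1) * s_k)
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[gray]
```

Because the mapping stretches heavily-populated gray ranges apart, it lessens the effect of uneven illumination before feature extraction.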
Step 3, extracting the processed face image pixel data and reducing its dimension with PCA (principal component analysis), comprises 4 sub-steps:
Step 3.1, quantizing the preprocessed face image into pixel point data and storing the pixel point data in a two-dimensional array;
Step 3.2, solving the covariance matrix of the two-dimensional array as follows: first centralizing the sample matrix, i.e. subtracting from each dimension its mean (each column of the two-dimensional array is one dimension); then multiplying the centered sample matrix's transpose by the matrix itself and dividing by (number of samples - 1);
Step 3.3, calculating the eigenvalues and eigenvectors of the covariance matrix A with the QR decomposition method, as follows:
Step 3.3.1, letting A^(1) = A = (a_ij) be an m × m real matrix, taking x_1 = (a_11, a_21, ..., a_m1)^T, the first column of A^(1), and recording a_1 = sgn(x_11)·‖x_1‖_2, construct the Householder matrix H_1(a_1), hereinafter abbreviated H_1:
H_1 = I - ρ_1^(-1) u_1 u_1^T
where I is the m × m identity matrix, u_1 = x_1 + a_1 e_1, e_1 = (1, 0, ..., 0)^T, and ρ_1 = ½ u_1^T u_1 = a_1(a_1 + x_11).
Step 3.3.2, calculating A^(2) = H_1 A^(1).
Step 3.3.3, for i = 2, ..., m-1, taking x_i as the part of the i-th column of A^(i) on and below the diagonal, recording
a_i = sgn(x_ii)·‖x_i‖_2, and constructing the Householder matrix H_i(a_i), hereinafter abbreviated H_i. A^(3), A^(4), ..., A^(m) are calculated in turn according to
A^(i+1) = H_i A^(i)
Step 3.3.4, A^(1) is thus reduced to the upper triangular matrix A^(m) = R.
Because each H_i is its own inverse, letting Q = H_1 H_2 ⋯ H_{m-1} gives A = QR, where Q is an orthogonal matrix and R is an upper triangular matrix.
Step 3.3.5, iterating the QR step (replacing A by RQ and re-factoring) until the matrix converges to upper triangular form; its diagonal entries are the eigenvalues of the covariance matrix A, and the accumulated orthogonal factors give the eigenvectors.
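The Householder construction of steps 3.3.1–3.3.5 might be sketched as below (an illustrative reading of the patent's procedure; `np.linalg.norm` computes ‖·‖₂, and the eigenvalues are obtained by repeating the QR step, since a single factorization does not by itself diagonalize A):

```python
import numpy as np

def householder_qr(A):
    # A = QR via successive Householder reflections H_i = I - u u^T / rho.
    m = A.shape[0]
    R = A.astype(float).copy()
    Q = np.eye(m)
    for i in range(m - 1):
        x = R[i:, i].copy()
        a = np.linalg.norm(x)
        if x[0] < 0:
            a = -a                      # a_i = sgn(x_ii) * ||x_i||_2
        u = x
        u[0] += a                       # u_i = x_i + a_i e_1
        rho = 0.5 * (u @ u)             # rho_i = a_i (a_i + x_ii)
        if rho == 0.0:
            continue
        H = np.eye(m)
        H[i:, i:] -= np.outer(u, u) / rho
        R = H @ R                       # A^(i+1) = H_i A^(i)
        Q = Q @ H                       # accumulate Q = H_1 H_2 ... H_{m-1}
    return Q, R

def qr_eigen(A, iters=500):
    # QR algorithm: repeating A <- R Q drives a symmetric matrix toward
    # diagonal form; the diagonal converges to the eigenvalues and the
    # accumulated Q factors to the eigenvectors.
    A = A.astype(float).copy()
    V = np.eye(A.shape[0])
    for _ in range(iters):
        Q, R = householder_qr(A)
        A = R @ Q
        V = V @ Q
    return np.diag(A), V
```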
Step 3.4, after the eigenvalues and eigenvectors of the covariance matrix are obtained, arranging the eigenvalues from largest to smallest and selecting the eigenvectors corresponding to the first k eigenvalues to form a new two-dimensional array, i.e. the face dimension-reduction mapping; multiplying the face image data matrix extracted from a detection target by this mapping yields the feature values of the detection target.
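Putting steps 3.1–3.4 together, a compact PCA sketch (illustrative; `np.linalg.eigh` stands in for the QR-decomposition routine of step 3.3):

```python
import numpy as np

def pca_fit(X, k):
    # X: (n_samples, n_dims) matrix of flattened face images (step 3.1).
    mean = X.mean(axis=0)
    Xc = X - mean                          # step 3.2: center each dimension
    C = (Xc.T @ Xc) / (X.shape[0] - 1)     # covariance matrix
    # step 3.3: eigen-decomposition of the symmetric covariance matrix
    vals, vecs = np.linalg.eigh(C)
    # step 3.4: keep the eigenvectors of the k largest eigenvalues
    order = np.argsort(vals)[::-1][:k]
    W = vecs[:, order]                     # (n_dims, k) projection matrix
    return mean, W

def pca_transform(X, mean, W):
    # Multiply face data by the dimension-reduction mapping W.
    return (X - mean) @ W
```

The columns of `W` are orthonormal, so the projection preserves as much variance as any k-dimensional linear map can.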
Step 4, the dynamic Bayesian classification algorithm, consists of 4 sub-steps:
Step 4.1, the multivariate non-independent joint conditional probability of the Bayesian network structure is solved as
P(x_1, x_2, ..., x_j) = ∏_{i=1}^{j} P(x_i | x_parent(i))
where x_parent(i) denotes the parent (predecessor) nodes of node i, j is the number of nodes of the Bayesian network, and x_i represents the i-th node;
Step 4.2, assuming there are q classes, letting α represent the set of training sample classes; for a sample of unknown class, the maximum a posteriori probability (MAP) identification of the target is
MAP = max{ P(α_c | x_1, x_2, ..., x_j) : α_c ∈ α }
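Illustratively, the MAP rule of step 4.2 can be sketched with a naive (parent-free) structure; `priors` and `cond_prob` are hypothetical lookup tables for this sketch, not part of the patent:

```python
import math

def map_classify(x, priors, cond_prob):
    # MAP = argmax over classes alpha_c of
    #   P(alpha_c) * prod_i P(x_i | alpha_c)
    # computed in log-space to avoid underflow; unseen attribute values
    # get a tiny probability floor instead of zero.
    best, best_score = None, -math.inf
    for alpha, prior in priors.items():
        score = math.log(prior)
        for i, xi in enumerate(x):
            score += math.log(cond_prob[alpha][i].get(xi, 1e-9))
        if score > best_score:
            best, best_score = alpha, score
    return best
```

With a learned network structure, the per-attribute factor would condition on the parent values of x_i rather than on the class alone.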
Step 4.3, initializing j Bayesian network nodes and converting the network into an adjacency matrix. The Bayesian network structure can be represented by a directed graph whose nodes V represent random variables and whose directed edges E represent conditional dependencies among the random variables; if the adjacency matrix B represents the directed graph G(V, E), then b_uv = 1 when there is a directed edge from node u to node v, and b_uv = 0 otherwise.
Step 4.4, while the current iteration count is within the user-defined number of iterations, generating a random number between 0 and 1: if it is less than p, executing crossover; if it is greater than p, executing mutation; then sorting the obtained matrices by classification accuracy and discarding the three lowest-accuracy matrices. Once the iteration limit is exceeded, exiting the algorithm flow with the optimal Bayesian network structure obtained. The crossover and mutation methods are:
Crossover: selecting the 2 matrices with the highest classification accuracy and interchanging the elements at the corresponding positions of their left halves;
Mutation: randomly generating a new Bayesian network structure.
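The crossover/mutation search of step 4.4 might be sketched as below (illustrative; the fitness function, the population size of 3, and the random seed are assumptions, and `crossover` exchanges left-half columns as the text describes):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_structure(j):
    # A random acyclic structure as a strictly upper-triangular
    # adjacency matrix (step 4.3: B[u, v] = 1 iff there is an edge u -> v).
    return np.triu(rng.integers(0, 2, size=(j, j)), k=1)

def crossover(B1, B2):
    # Exchange the elements of the left halves of the two best matrices.
    j = B1.shape[0]
    C1, C2 = B1.copy(), B2.copy()
    C1[:, : j // 2] = B2[:, : j // 2]
    C2[:, : j // 2] = B1[:, : j // 2]
    return C1, C2

def evolve(population, fitness, p=0.8, iters=50):
    # Each iteration: draw r in [0, 1); crossover if r < p, else mutate;
    # then keep only the best structures (the worst are discarded).
    j = population[0].shape[0]
    for _ in range(iters):
        population.sort(key=fitness, reverse=True)
        if rng.random() < p:
            population.extend(crossover(population[0], population[1]))
        else:
            population.append(random_structure(j))
        population.sort(key=fitness, reverse=True)
        population = population[:3]       # assumed population size of 3
    return max(population, key=fitness)
```

In the patent, `fitness` would be the classification accuracy achieved by the Bayesian network encoded by each adjacency matrix.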
Step 5, outputting the recognition result;
The method comprises the following steps: classifying the target with the dynamic Bayesian algorithm, calculating the similarity probability for each enrolled person, and taking the person with the highest probability as the final recognition result.
Advantages and beneficial effects of the invention
The invention can capture and track face images under varying illumination, pose and clothing, uses histogram equalization to reduce the influence of illumination intensity on the face image, avoids the dimensionality disaster by extracting key face feature data with PCA, and determines the optimal Bayesian network structure with a dynamic Bayesian classification algorithm through crossover and mutation in the iterative process, thereby improving classification performance.
Drawings
FIG. 1 is a general flow of a face recognition method according to the present invention;
FIG. 2 is a general flow of a dynamic Bayesian classification method based on genetic algorithm;
fig. 3 is a schematic diagram of the face features in the face feature database.
Detailed Description
The invention is described in detail below with reference to the figures and examples.
As shown in figure 1, the invention provides a face recognition identity authentication method based on dynamic Bayes, here used to recognize the faces of 7 people, with 30 frames captured from a video of each person. The specific implementation comprises the following steps:
Step 1, capturing and tracking the target face in real time with a notebook camera, storing the frames in a specific folder, and graying the images to obtain a grayscale image sequence;
Step 2, carrying out histogram equalization processing on the gray face image;
Step 3, extracting the pixel data of the equalized face images, reducing the dimension with PCA, and extracting face image feature values, obtaining data such as the positions of the left and right eyes, the eyebrow centers of the left and right eyes, the nose tip and the lips, as shown in FIG. 3 (taking face image No. 006 as an example: left eye (26, 43), right eye (100, 42), left eyebrow center (50, 43), right eyebrow center (121, 43), nose tip (87, 118), lips (87, 158)), and constructing the face feature database;
Step 4, training and identifying with the dynamic Bayesian classification algorithm; the specific process is shown in FIG. 2, with crossover probability 0.8, 3 initialized Bayesian network structures, and 50 iterations. Table 1 shows the classification accuracy per iteration using dynamic Bayes.
TABLE 1 classification accuracy of dynamic Bayes per iteration
Number of iterations | Identification accuracy | Number of iterations | Identification accuracy |
---|---|---|---|
1 | 88.095% | 26 | 95.238% |
2 | 88.095% | 27 | 95.238% |
3 | 88.095% | 28 | 95.238% |
4 | 91.667% | 29 | 95.238% |
5 | 94.047% | 30 | 95.238% |
6 | 94.047% | 31 | 95.238% |
7 | 94.047% | 32 | 100% |
8 | 94.047% | 33 | 100% |
9 | 94.047% | 34 | 100% |
10 | 94.047% | 35 | 100% |
11 | 94.047% | 36 | 100% |
12 | 94.047% | 37 | 100% |
13 | 94.047% | 38 | 100% |
14 | 94.047% | 39 | 100% |
15 | 94.047% | 40 | 100% |
16 | 94.047% | 41 | 100% |
17 | 94.047% | 42 | 100% |
18 | 94.047% | 43 | 100% |
19 | 94.047% | 44 | 100% |
20 | 94.047% | 45 | 100% |
21 | 94.047% | 46 | 100% |
22 | 94.047% | 47 | 100% |
23 | 94.047% | 48 | 100% |
24 | 95.238% | 49 | 100% |
25 | 95.238% | 50 | 100% |
Step 5, outputting the recognition result. Taking the recognition result of image No. 006 as an example: the similarity probability with 001 is 0%, with 002 is 0%, with 003 is 0%, with 004 is 0%, with 005 is 0%, with 006 is 100%, and with 007 is 0%; the person is finally identified as No. 006.
The dynamic Bayes method of the invention was tested repeatedly on the KDDCUP99 data set provided by the U.S. Defense Advanced Research Projects Agency and compared with the traditional Bayesian classification method. The KDDCUP99 data set contains the types DoS (denial of service), R2L (unauthorized access from a remote host), U2R (unauthorized local superuser privileged access), Probe (port monitoring or scanning) and Normal; the results are shown in Table 2.
TABLE 2 accuracy of various algorithms on KDDCUP99 data set
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.
Claims (6)
1. A face recognition identity authentication method based on dynamic Bayes is characterized by comprising the following steps:
Step 1, a camera captures and tracks a target face in real time and performs graying processing on an image to obtain a grayed image sequence;
Step 2, carrying out histogram equalization processing on the grayscale face image;
Step 3, extracting pixel point data of the face image after the equalization processing, adopting PCA (principal component analysis) to reduce dimensions, extracting a face image characteristic value, and constructing a face characteristic database;
Step 4, training and identifying by using a dynamic Bayesian classification algorithm;
Step 5, outputting the recognition result.
2. The dynamic Bayesian-based face recognition identity authentication method according to claim 1, wherein: the process of capturing and tracking the target face in step 1 is divided into two methods:
the method comprises the following steps: when detecting the human face, a sub-window continuously shifts and slides in a picture window to be detected, the characteristic of the region can be calculated when the sub-window reaches a position, then the trained classifier is used for screening the characteristic, and once the characteristic passes the screening of all strong classifiers, the region is judged to be the human face; the method specifically comprises the following steps:
1.1, calculating Haar characteristics; the Haar features are presented as two rectangles of black and white in the detection window and defined as the difference between the sum of white rectangular pixels and the sum of black rectangular pixels;
1.2, accelerating to obtain Haar characteristics; the Haar feature uses an integral graph to accelerate the calculation method:
sum = I(M4) - I(M2) - I(M3) + I(M1)
where M1, M2, M3, M4 respectively denote the top-left, top-right, bottom-left and bottom-right corner points of the detection window, and I(x, y) denotes the integral-image value at coordinate (x, y);
Step 1.3, training a strong classifier for distinguishing a face from a non-face by using an AdaBoost algorithm; each AdaBoost iteration is divided into the following steps:
Step 1.3.1, initializing weight distribution of training data; if N training samples exist, the initial value of each training sample at the beginning is 1/N;
Step 1.3.2, training a weak classifier; during training, if a sample point is classified correctly, its weight is decreased in the next training set; conversely, if a sample point is misclassified, its weight is increased;
step 1.3.3, training the weak classifier into a strong classifier; after the training process of each weak classifier is completed, the weight of the weak classifier with small classification error rate is increased, so that the weak classifier plays a decisive role in the final classification function;
the second method comprises the following steps: the human face area is fixed, and the monitoring target is required to move the face to a specified area.
3. The dynamic Bayesian-based face recognition identity authentication method according to claim 1, wherein: the histogram equalization processing described in step 2 consists of the following steps:
Step 2.1, quantizing the captured face picture into pixel data and counting the probability of each gray level of the histogram:
P_r(r_i) = n_i / n,  i = 0, 1, ..., k-1
where k is the number of gray levels in the image, n_i is the number of pixels with gray value r_i, n is the total number of pixels, and P_r(r_i) is the probability of gray value r_i;
Step 2.2, calculating the cumulative distribution function of each gray level:
s_k = T(r_k) = Σ_{j=0}^{k} n_j / n
where k is the index of the current gray level, n is the total number of pixels in the image, n_j is the number of pixels at gray level j, and L is the total number of gray levels in the image;
Step 2.3, carrying out mapping conversion according to the cumulative distribution function and calculating the new pixel values.
4. The dynamic Bayesian-based face recognition identity authentication method according to claim 1, wherein: the step 3 of extracting the processed face image pixel point data and adopting PCA to reduce the dimension comprises the following steps:
Step 3.1, quantizing the preprocessed face image into pixel data and storing it in a two-dimensional array;
Step 3.2, solving the covariance matrix of the two-dimensional array as follows: first centralizing the sample matrix, i.e. subtracting from each dimension its mean (each column of the two-dimensional array is one dimension); then multiplying the centered sample matrix's transpose by the matrix itself and dividing by (number of samples - 1);
And 3.3, calculating the eigenvalue and the eigenvector of the covariance matrix A by using a QR decomposition method, wherein the method comprises the following steps:
Step 3.3.1, letting A^(1) = A = (a_ij), i, j = 1, 2, ..., m, be an m × m real matrix, taking x_1 = (a_11, a_21, ..., a_m1)^T, the first column of A^(1), and recording a_1 = sgn(x_11)·‖x_1‖_2, construct the Householder matrix H_1(a_1), hereinafter abbreviated H_1;
H_1 = I - ρ_1^(-1) u_1 u_1^T
where I is the m × m identity matrix, u_1 = x_1 + a_1 e_1, e_1 = (1, 0, ..., 0)^T, and ρ_1 = ½ u_1^T u_1 = a_1(a_1 + x_11);
Step 3.3.2, calculating A^(2) = H_1 A^(1);
Step 3.3.3, for i = 2, ..., m-1, taking x_i as the part of the i-th column of A^(i) on and below the diagonal, recording
a_i = sgn(x_ii)·‖x_i‖_2, and constructing the Householder matrix H_i(a_i), hereinafter abbreviated H_i; A^(3), A^(4), ..., A^(m) are calculated in turn according to
A^(i+1) = H_i A^(i)
Step 3.3.4, A^(1) is thus reduced to the upper triangular matrix A^(m) = R;
because each H_i is its own inverse, letting Q = H_1 H_2 ⋯ H_{m-1} gives A = QR, where Q is an orthogonal matrix and R is an upper triangular matrix;
Step 3.3.5, iterating the QR step (replacing A by RQ and re-factoring) until the matrix converges to upper triangular form; its diagonal entries are the eigenvalues of the covariance matrix A, and the accumulated orthogonal factors give the eigenvectors;
Step 3.4, after the eigenvalues and eigenvectors of the covariance matrix are obtained, arranging the eigenvalues from largest to smallest and selecting the eigenvectors corresponding to the first k eigenvalues to form a new two-dimensional array, i.e. the face dimension-reduction mapping; multiplying the face image data matrix extracted from a detection target by this mapping yields the feature values of the detection target.
5. The dynamic Bayesian-based face recognition identity authentication method according to claim 1, wherein: the step of using dynamic Bayesian classification algorithm for identification in the step 4 comprises the following steps:
Step 4.1, the multivariate non-independent joint conditional probability of the Bayesian network structure is solved as
P(x_1, x_2, ..., x_j) = ∏_{i=1}^{j} P(x_i | x_parent(i))
where x_parent(i) denotes the parent (predecessor) nodes of node i, j is the number of nodes of the Bayesian network, and x_i represents the i-th node;
Step 4.2, assuming there are q classes, letting α represent the set of training sample classes; for a sample of unknown class, the maximum a posteriori probability (MAP) identification of the target is
MAP = max{ P(α_c | x_1, x_2, ..., x_j) : α_c ∈ α }
Step 4.3, initializing j Bayesian network nodes and converting the network into an adjacency matrix; the Bayesian network structure can be represented by a directed graph whose nodes V represent random variables and whose directed edges E represent conditional dependencies among the random variables; if the adjacency matrix B represents the directed graph G(V, E), then b_uv = 1 when there is a directed edge from node u to node v, and b_uv = 0 otherwise;
Step 4.4, while the current iteration count is within the user-defined number of iterations, generating a random number between 0 and 1: if it is less than p, executing crossover; if it is greater than p, executing mutation; then sorting the obtained matrices by classification accuracy and discarding the three lowest-accuracy matrices; once the iteration limit is exceeded, exiting the algorithm flow with the optimal Bayesian network structure obtained, wherein the crossover and mutation methods are:
Crossover: selecting the 2 matrices with the highest classification accuracy and interchanging the elements at the corresponding positions of their left halves;
Mutation: randomly generating a new Bayesian network structure.
6. The dynamic Bayesian-based face recognition identity authentication method according to claim 1, wherein the method for outputting the recognition result is: classifying the target with the dynamic Bayesian algorithm, calculating the similarity probability for each enrolled person, and taking the person with the highest probability as the final recognition result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910710960.XA CN110555386A (en) | 2019-08-02 | 2019-08-02 | Face recognition identity authentication method based on dynamic Bayes |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910710960.XA CN110555386A (en) | 2019-08-02 | 2019-08-02 | Face recognition identity authentication method based on dynamic Bayes |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110555386A true CN110555386A (en) | 2019-12-10 |
Family
ID=68736936
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910710960.XA Pending CN110555386A (en) | 2019-08-02 | 2019-08-02 | Face recognition identity authentication method based on dynamic Bayes |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110555386A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060088207A1 (en) * | 2004-10-22 | 2006-04-27 | Henry Schneiderman | Object recognizer and detector for two-dimensional images using bayesian network based classifier |
CN104636729A (en) * | 2015-02-10 | 2015-05-20 | 浙江工业大学 | Three-dimensional face recognition method based on Bayesian multivariate distribution characteristic extraction |
CN106228142A (en) * | 2016-07-29 | 2016-12-14 | 西安电子科技大学 | Face verification method based on convolutional neural networks and Bayesian decision |
CN108304788A (en) * | 2018-01-18 | 2018-07-20 | 陕西炬云信息科技有限公司 | Face identification method based on deep neural network |
Non-Patent Citations (3)
Title |
---|
Liu Meili et al.: "MATLAB Language and Applications", 30 April 2012, National Defense Industry Press * |
Xu Hongshui: "Research on a Bayesian Network Classification Model Based on Intelligent Optimization", China Master's Theses Full-text Database, Information Science and Technology Series * |
Guo Baolong et al.: "Introduction to Digital Image Processing Systems Engineering", 31 July 2012, Xidian University Press * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111881843A (en) * | 2020-07-30 | 2020-11-03 | 河南天迈科技有限公司 | Taxi passenger carrying number counting method based on face detection |
CN111881843B (en) * | 2020-07-30 | 2023-12-29 | 河南天迈科技有限公司 | Face detection-based taxi passenger carrying number counting method |
CN112102366A (en) * | 2020-09-24 | 2020-12-18 | 湘潭大学 | Improved algorithm for tracking unmanned aerial vehicle based on dynamic target |
CN112102366B (en) * | 2020-09-24 | 2024-04-02 | 湘潭大学 | Unmanned aerial vehicle tracking improvement algorithm based on dynamic target |
CN113205619A (en) * | 2021-03-15 | 2021-08-03 | 广州朗国电子科技有限公司 | Door lock face recognition method, equipment and medium based on wireless network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Wu et al. | Face detection in color images using AdaBoost algorithm based on skin color information | |
JP4543423B2 (en) | Method and apparatus for automatic object recognition and collation | |
CN103605972A (en) | Non-restricted environment face verification method based on block depth neural network | |
CN110555386A (en) | Face recognition identity authentication method based on dynamic Bayes | |
CN108108760A | A fast face recognition method | |
Sisodia et al. | ISVM for face recognition | |
CN112395901A (en) | Improved face detection, positioning and recognition method in complex environment | |
Ramalingam et al. | Robust face recognition using enhanced local binary pattern | |
Kumar et al. | Survey on handwritten digit recognition using machine learning | |
CN110222660B (en) | Signature authentication method and system based on dynamic and static feature fusion | |
CN110287973B (en) | Image feature extraction method based on low-rank robust linear discriminant analysis | |
Sasankar et al. | A study for face recognition using techniques PCA and KNN | |
Mahdi et al. | 3D facial matching by spiral convolutional metric learning and a biometric fusion-net of demographic properties | |
Zhao et al. | Adaptive sampling and learning for unsupervised outlier detection | |
Zhang et al. | Person re-identification with multi-features based on evolutionary algorithm | |
CN109241886B (en) | Face recognition method and system based on OLBP and PCA | |
Mousa Pasandi | Face, Age and Gender Recognition Using Local Descriptors | |
Loderer et al. | Optimization of LBP parameters | |
Fritz et al. | Rapid object recognition from discriminative regions of interest | |
Zhao et al. | A head pose estimation method based on multi-feature fusion | |
Khair et al. | Face Recognition in Kindergarten Students using the Principal Component Analysis Algorithm | |
Paul et al. | Automatic adaptive facial feature extraction using CDF analysis | |
Wang et al. | Genetic eigenhand selection for handshape classification based on compact hand extraction | |
CN110443255A | Relaxed locality-preserving regression method for image feature extraction | |
Ibrahim et al. | A Framework for Fingerprint Liveness Detection Using Support Vector Machine Optimized by Genetic Algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 2019-12-10 |