CN107273842B - Selective integrated face recognition method based on CSJOGA algorithm


Info

Publication number
CN107273842B
CN107273842B (granted publication of application CN201710432432.3A)
Authority
CN
China
Prior art keywords
gen
population
algorithm
orthogonal
svm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710432432.3A
Other languages
Chinese (zh)
Other versions
CN107273842A (en)
Inventor
杨新武
王聿铭
牛文杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Technology filed Critical Beijing University of Technology
Priority to CN201710432432.3A
Publication of CN107273842A
Application granted
Publication of CN107273842B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168: Feature extraction; Face representation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification


Abstract

The invention discloses a selective integration face recognition method based on the CSJOGA algorithm. It belongs to the technical field of face recognition, and in particular relates to a novel face recognition method combining a joint-sparse-projection-improved hybrid orthogonal genetic algorithm (CSJOGA), the bagging method, selective integration technology, and a multi-classification SVM method. The method brings together minimum spanning tree clustering, joint sparse projection, and orthogonal experimental design, and proposes an orthogonal crossover operator based on joint sparse projection. Crossover is then designed with the orthogonal experimental method, a clustering local search strategy is introduced, and a hybrid orthogonal genetic algorithm based on joint sparse projection is obtained. On the premise of greatly reducing the number of integrated classifiers, reducing storage and computation overhead, and effectively improving face recognition speed, the method can further raise the accuracy of face recognition to a higher level, ensuring that the classification accuracy in practical applications reaches the ideal requirement.

Description

Selective integrated face recognition method based on CSJOGA algorithm
Technical Field
The invention belongs to the technical field of face recognition, and particularly relates to a novel face recognition method combining a joint-sparse-projection-improved hybrid orthogonal genetic algorithm (CSJOGA), the bagging method, selective integration technology, and a multi-classification SVM method.
Background
In recent decades, biometric recognition has attracted much attention in various fields, and has become one of the research hotspots in the fields of pattern recognition and machine vision. Among them, face recognition is an important component of biometric recognition.
In face recognition, face feature extraction is one of the key steps. The face feature extraction is to extract identification information which is helpful for classification from high-dimensional face data and remove useless redundant information.
In most cases, the face features are divided into global features and local features. The global feature means that each dimension of the feature vector can distinguish differences brought by different people, and emphasis is placed on the main features of the human face; the local features mean that each dimension of the feature vector can distinguish differences caused by factors such as illumination, expression, posture and the like, and reflect detail changes of the face.
A number of global feature extraction algorithms have been proposed, including classical methods such as principal component analysis, linear discriminant analysis, and sparsity-preserving projection. Principal component analysis (PCA) seeks the most discriminative features under the minimum mean-square-error criterion; linear discriminant analysis obtains feature information by maximizing between-class scatter while minimizing within-class scatter; sparsity-preserving projection obtains the main discriminative features by solving for sparse weight vectors that reconstruct the original signals. PCA does not need the class information of the training samples, whereas linear discriminant analysis and sparsity-preserving projection both require it; in each case the projection matrix is obtained by training on the training samples.
On the other hand, with the rapid development of science and technology and the growing complexity of products, solving many problems requires searching for optimal solutions in complex search spaces, where traditional optimization algorithms are often ineffective. With the spread of parallel computing and continual improvements in computer performance, machine performance is no longer the factor limiting computational science, and artificial intelligence algorithms have emerged accordingly. As an intelligent search algorithm capable of solving complex problems that are difficult for traditional optimization algorithms, the genetic algorithm has attracted increasing attention and is widely applied in many fields.
The genetic algorithm is an algorithm for simulating natural selection and population genetic law in the biological evolution process by using a random search mode, and takes a coding population as a basis, and implements genetic operation on a contemporary population through an iterative process to gradually enable the population to contain or approach an optimal solution. As a fusion product of engineering science and life science, the genetic algorithm provides a general solution of an optimization problem which is not limited by the problem, and has the characteristics of simple thought, easy realization, obvious application effect, strong robustness, easy parallel processing and the like.
Although genetic algorithms have made great progress in theoretical research, a series of problems remain in practice and application, such as slow convergence, low search precision, and premature convergence. In addition, many factors influence the effectiveness of optimization in practical applications; solving such problems with a genetic algorithm may require a very long encoding, which slows convergence. If the genetic algorithm is combined with other intelligent algorithms, and the characteristic information arising during population evolution is used to operate on the population in a targeted way, the convergence speed of the algorithm can be greatly improved.
Hybrid genetic algorithms are a hot spot in genetic algorithm research, and the basic idea is to integrate genetic algorithms with knowledge specific to the problem to be solved, thereby producing an excellent solution to the problem. In practical application, the hybrid genetic algorithm not only inherits the advantage of global search of the traditional genetic algorithm, but also can realize optimization of problems more pertinently by fusing knowledge in a specific field. The hybrid genetic algorithm provides a new idea for further improving the performance of the genetic algorithm, shows good performance in practical application, and becomes an important method for solving practical engineering problems and difficult problems.
Disclosure of Invention
The invention aims to provide a novel face recognition method, so that the recognition accuracy is effectively improved, the space overhead is effectively reduced, and the recognition speed is effectively improved.
In order to solve the above technical problems, the technical scheme of the invention is as follows: a novel face recognition method combining a joint-sparse-projection-based improved hybrid orthogonal genetic algorithm (CSJOGA), the bagging method, selective integration technology, and a multi-classification SVM method. It comprises the following stages, as shown in Fig. 1:
stage 1) extracting HOG characteristics of a face image, and performing characteristic dimension reduction by using PCA to form a characteristic face;
stage 2) generating a plurality of data sets by using a bagging method, and training on a training set by using a multi-classification SVM;
stage 3) respectively predicting each SVM model on a training set and a test set, adding label columns in a data set, and combining all model prediction result matrixes with labels;
stage 4) solving the optimal selective integrated combination by using a joint sparse projection-based improved hybrid orthogonal genetic algorithm;
and stage 5) according to the optimal selective integration combination calculated by the genetic algorithm, integrating the SVM by using a prediction result matrix on the test set, and calculating the integration classification error rate on the test set.
Compared with the prior art, the invention has the following beneficial effects. The traditional face recognition method based on bagging integration has two main defects: first, the number of integrated classifiers is large, which causes high storage cost, high computation cost, and slow recognition; second, the accuracy gain from integration is limited, so the effect in practical applications is not ideal. This patent uses a population projection matrix training method based on a joint sparse model and sparsity-preserving projection, combines it with orthogonal crossover, and proposes a subspace orthogonal crossover hybrid genetic algorithm (CSJOGA) to solve the combination problem of selective integration in face recognition. Applying this approach to selective bagging integration has two advantages: first, the number of integrated classifiers can be greatly reduced, lowering storage and computation overhead and effectively improving recognition speed; second, even with far fewer integrated classifiers, recognition accuracy can be further raised to a higher level, ensuring that the classification accuracy in practical applications meets the ideal requirement.
Drawings
Fig. 1 is a flow chart of a specific implementation method of each stage of a selective integrated face recognition method based on a CSJOGA algorithm.
FIG. 2 is a schematic of the main flow of the process.
Detailed Description
Stage 1) extracting HOG features of the face image, and performing feature dimension reduction with PCA to form eigenfaces.
Step 1, HOG characteristics are extracted from a multi-classification face data set.
Step 2, performing dimensionality reduction on the face HOG features extracted in step 1 using PCA (principal component analysis), and saving the result in csv format.
Step 3, converting the csv-format data set obtained in step 2 into a format supported by libsvm.
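To make stage 1 concrete, the sketch below computes a simplified HOG descriptor (per-cell orientation histograms, without the block normalization of full HOG) and reduces it with PCA via SVD. All parameter values here (64x64 images, 8-pixel cells, 9 orientation bins, 10 components) are illustrative assumptions, since the text does not specify them:

```python
import numpy as np

def simple_hog(img, cell=8, bins=9):
    """Simplified HOG: per-cell histograms of gradient orientations,
    weighted by gradient magnitude (block normalization omitted)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0.0, 180.0), weights=m)
            feats.append(hist)
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-12)

def pca_reduce(X, k):
    """Project the feature matrix onto its k leading principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
images = rng.random((20, 64, 64))        # 20 synthetic 64x64 "faces"
H = np.array([simple_hog(img) for img in images])
Z = pca_reduce(H, k=10)
print(H.shape, Z.shape)                  # (20, 576) (20, 10)
```

In a real pipeline the reduced features would then be written out in csv and converted to the libsvm sparse format, as in steps 2 and 3.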
Stage 2) generating a plurality of data sets using the bagging method, and training a multi-classification SVM on each training set.
Step 4, carrying out bagging on the libsvm-format training data set obtained in step 3 to generate 100 new data sets of the same size as the original data set.
Step 5, training a multi-classification SVM on each of the 100 bagging data sets generated in step 4.
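Steps 4 and 5 can be sketched as follows. A self-contained example cannot ship libsvm, so a simple nearest-centroid classifier stands in for the multi-classification SVM; the bagging itself (100 bootstrap samples, each the same size as the original training set) follows the text:

```python
import random

class NearestCentroid:
    """Dependency-free stand-in for the patent's multi-classification SVM,
    used only to keep this sketch self-contained."""
    def fit(self, X, y):
        sums, counts = {}, {}
        for xi, yi in zip(X, y):
            if yi not in sums:
                sums[yi] = [0.0] * len(xi)
                counts[yi] = 0
            sums[yi] = [a + b for a, b in zip(sums[yi], xi)]
            counts[yi] += 1
        self.centroids = {c: [v / counts[c] for v in s] for c, s in sums.items()}
        return self

    def predict(self, X):
        def d2(a, b):
            return sum((u - v) ** 2 for u, v in zip(a, b))
        return [min(self.centroids, key=lambda c: d2(x, self.centroids[c]))
                for x in X]

def bagging_train(X, y, n_models=100, seed=0):
    """Steps 4-5: draw n_models bootstrap samples, each the same size as the
    original training set, and fit one base classifier per sample."""
    rng = random.Random(seed)
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]       # sample with replacement
        models.append(NearestCentroid().fit([X[i] for i in idx],
                                            [y[i] for i in idx]))
    return models

# Toy 2-class training set with well-separated classes
X = [[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [10.0, 10.0], [10.1, 9.9], [9.9, 10.2]]
y = [0, 0, 0, 1, 1, 1]
models = bagging_train(X, y, n_models=100, seed=1)
print(len(models))  # 100
```

Because each bootstrap sample draws n indices with replacement, roughly 63% of the original samples appear in each data set on average, which is what gives the 100 classifiers their diversity.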
Stage 3) predicting with each SVM model on the training set and the test set, appending the label column of the data set, and combining all model prediction result matrices with the labels.
Step 6, predicting with each of the 100 SVM classifiers trained in step 5 on the training set and the test set, and saving the prediction results.
Step 7, combining the 100 SVM prediction results saved in step 6 with the label column of the corresponding data set into one matrix, and saving the matrix.
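The layout of the combined matrix of steps 6 and 7 (one column of predictions per classifier, plus the true-label column appended last) can be illustrated as below; ConstantModel is a hypothetical stand-in classifier used only to keep the example self-contained:

```python
def prediction_matrix(models, X, y):
    """One row per sample: the prediction of every model, then the true
    label appended as the final column (the 'label column' of step 7)."""
    per_model = [m.predict(X) for m in models]
    return [[per_model[k][i] for k in range(len(models))] + [y[i]]
            for i in range(len(X))]

class ConstantModel:
    """Hypothetical classifier that always predicts one class."""
    def __init__(self, c):
        self.c = c
    def predict(self, X):
        return [self.c] * len(X)

models = [ConstantModel(0), ConstantModel(1), ConstantModel(1)]
X = [[0.1], [0.9]]
y = [1, 0]
print(prediction_matrix(models, X, y))  # → [[0, 1, 1, 1], [0, 1, 1, 0]]
```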
Stage 4) solving the optimal selective integration combination by using a joint sparse projection-based improved hybrid orthogonal genetic algorithm.
Step 8, performing population coding.
Real number coding is adopted; the coding length N is 100 (so as to correspond one-to-one to the 100 SVM classifiers trained in step 5), and each gene takes a value in (0, 1). When the value of a given bit is less than or equal to the Threshold (typically 0.5, adjustable as needed), the SVM classifier represented by that bit is considered to participate in the selective integration; otherwise, it is considered not to participate. In this way, the encoding of each individual itself represents a selective integration combination.
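Under this encoding, decoding an individual into its selective integration combination is a one-line filter; the threshold of 0.5 follows the text:

```python
def decode(individual, threshold=0.5):
    """Indices of the SVM classifiers whose gene value is <= threshold,
    i.e. the classifiers that join the selective integration."""
    return [i for i, g in enumerate(individual) if g <= threshold]

print(decode([0.2, 0.7, 0.5, 0.91, 0.05]))  # → [0, 2, 4]
```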
Step 9, orthogonally generating the initial population.
Generate n individuals by the orthogonal population initialization method to form the initial population P0. The specific method is as follows: first, the feasible solution space is divided into S subspaces (see the algorithm in Table 1); then an orthogonal table L_M(Q^F) is constructed (see the algorithm in Table 5); according to the orthogonal table L_M(Q^F), crossover is performed on each subspace using the proposed joint-sparse-projection-based orthogonal crossover operator (see the algorithm in Table 2) to generate the initial population P0.
TABLE 1 - Subspace partitioning algorithm
(Algorithm provided as an image in the original patent document.)
Step 10, generating the population to be crossed P'_gen.
Let P_gen denote the current-generation population, where gen is the current evolution generation of the algorithm and total_gen is the total number of generations, gen ∈ {1, 2, …, total_gen}. Individuals are selected from the current population P_gen with crossover probability P_cross and placed into the population to be crossed P'_gen.
Step 11, performing the crossover operation.
For the population to be crossed P'_gen generated in step 10, the crossed population C_gen is obtained using the joint-sparse-projection-based orthogonal crossover operator (see the algorithm in Table 2).
TABLE 2 - Joint-sparse-projection-based orthogonal crossover operator
(Algorithm provided as an image in the original patent document.)
The joint-sparse-projection-based orthogonal crossover operator proposed in this patent integrates three algorithms: adaptive minimum spanning tree clustering (see the algorithm in Table 3), joint sparse projection based on a DCT basis (see the algorithm in Table 4), and construction of the orthogonal table (see the algorithm in Table 5). The algorithms are as follows:
TABLE 3 - Adaptive minimum spanning tree clustering algorithm
(Algorithm provided as an image in the original patent document.)
TABLE 4 - Joint sparse projection algorithm based on a DCT basis
(Algorithm provided as an image in the original patent document.)
TABLE 5 - Construction of the orthogonal table L_M(Q^F)
(Algorithm provided as an image in the original patent document.)
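The patent's own algorithms in Tables 1-5 are reproduced only as images. For reference, the standard construction of an orthogonal table L_M(Q^F) used by orthogonal genetic algorithms, with M = Q^J rows and F = (Q^J - 1)/(Q - 1) columns for a prime level count Q, can be sketched as follows; the patent's exact variant may differ:

```python
def orthogonal_table(Q, J):
    """Build the orthogonal array L_M(Q^F), M = Q**J, F = (Q**J - 1)//(Q - 1),
    with levels 0..Q-1 (Q prime). Basic columns enumerate the rows in base Q;
    non-basic columns are modular combinations of earlier columns."""
    M = Q ** J
    F = (M - 1) // (Q - 1)
    A = [[0] * F for _ in range(M)]
    # basic columns
    for k in range(1, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1)
        for i in range(M):
            A[i][j] = (i // Q ** (J - k)) % Q
    # non-basic columns
    for k in range(2, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1)
        for s in range(j):
            for t in range(1, Q):
                c = j + s * (Q - 1) + t
                for i in range(M):
                    A[i][c] = (A[i][s] * t + A[i][j]) % Q
    return A

A = orthogonal_table(3, 2)   # the familiar L9(3^4)
print(A[1])                  # → [0, 1, 1, 1]
```

orthogonal_table(3, 2) yields L9(3^4): every column contains each level three times, and every pair of columns contains each of the nine level pairs exactly once, which is the property the orthogonal crossover relies on.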
Step 12, executing the local search.
For the population to be crossed P'_gen generated in step 10, the clustering local search strategy (see the algorithm in Table 6) is applied, and the resulting individuals are placed into the population after local search, L_gen.
TABLE 6 - Clustering local search strategy algorithm
(Algorithm provided as an image in the original patent document.)
Step 13, performing the mutation operation.
For the population to be crossed P'_gen generated in step 10, each individual participates in the mutation operation with probability P_mutation, producing the mutated population G_gen. Let p_i = (p_i,1, p_i,2, …, p_i,N) denote the i-th individual to be mutated, where i ∈ {1, 2, …, n}. The mutation of p_i sets p_i,j = l_j + r*(u_j - l_j), where r is a fraction randomly generated in [0, 1], j is an integer randomly generated in [1, N], and l_j and u_j are the lower and upper bounds of the j-th gene.
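Read literally, the mutation formula redraws a single randomly chosen position j uniformly within its bounds. A sketch under that reading (the original text is ambiguous about whether one position or every position is redrawn):

```python
import random

def mutate(individual, lower, upper, rng=random):
    """Set p_{i,j} = l_j + r * (u_j - l_j) for one randomly chosen j."""
    child = list(individual)
    j = rng.randrange(len(child))
    child[j] = lower[j] + rng.random() * (upper[j] - lower[j])
    return child

parent = [0.5] * 5
child = mutate(parent, lower=[0.0] * 5, upper=[1.0] * 5)
print(sum(1 for a, b in zip(parent, child) if a != b) <= 1)  # True
```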
Step 14, calculating the fitness.
According to the current selective integration combination represented by the population coding of the genetic algorithm, integrate the SVMs using the prediction result matrix on the training set obtained in step 7, and compute the integrated classification error rate on the training set as the value of the fitness function f.
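Step 14 can be sketched as follows. The prediction matrix has the layout of step 7 (one column per classifier, true label last); plurality voting is an assumption here, since the text does not spell out the combination rule:

```python
from collections import Counter

def ensemble_error_rate(pred_matrix, selected):
    """Fitness f: error rate of the selective ensemble on the data set.
    pred_matrix rows: [pred of SVM 0, ..., pred of SVM m-1, true label];
    'selected' holds the indices of the classifiers in the combination."""
    errors = 0
    for row in pred_matrix:
        vote = Counter(row[i] for i in selected).most_common(1)[0][0]
        if vote != row[-1]:
            errors += 1
    return errors / len(pred_matrix)

pm = [[1, 1, 0, 1],   # ensemble of columns {0,1,2} votes 1, label 1: correct
      [0, 0, 1, 1]]   # votes 0, label 1: wrong
print(ensemble_error_rate(pm, [0, 1, 2]))  # → 0.5
```

Counter.most_common breaks ties by insertion order; a real implementation would fix an explicit tie-breaking rule.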
Step 15, executing the selection operation.
First, the best ⌈n/2⌉ individuals of (P_gen + C_gen + L_gen + G_gen) by fitness are placed into the next-generation population P_gen+1 (where P_gen is the current-generation population, C_gen is the population after crossover in step 11, L_gen is the population after local search in step 12, G_gen is the population after mutation in step 13, and n is the population size); then ⌊n/2⌋ individuals are randomly selected from the remaining individuals of (P_gen + C_gen + L_gen + G_gen) and placed into the next-generation population P_gen+1.
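The selection of step 15 combines elitism for half the slots with random selection from the remainder for the other half; the exact counts appear only as images in the original, so the ceil/floor split below is a reconstruction consistent with a next generation of size n:

```python
import random

def select_next_generation(candidates, fitness, n, rng=random):
    """Keep the best ceil(n/2) candidates by fitness (lower error rate is
    better), then fill the remaining floor(n/2) slots by sampling uniformly
    without replacement from the rest."""
    order = sorted(range(len(candidates)), key=lambda i: fitness[i])
    k = (n + 1) // 2                      # ceil(n/2)
    chosen = order[:k] + rng.sample(order[k:], n - k)
    return [candidates[i] for i in chosen]

pool = ["a", "b", "c", "d", "e", "f"]
errs = [0.30, 0.10, 0.50, 0.20, 0.90, 0.40]
nxt = select_next_generation(pool, errs, n=4)
print(len(nxt), "b" in nxt, "d" in nxt)   # 4 True True
```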
Step 16, judging the iteration termination condition.
If one of the following three conditions is satisfied, stop iterating and output the result; otherwise, go to step 10: (1) the number of generations reaches the stopping value total_gen; (2) the best value does not change for 50 consecutive generations; (3) the global optimal solution has been found.
Stage 5) according to the optimal selective integration combination calculated by the genetic algorithm, integrating the SVMs using the prediction result matrix on the test set, and calculating the integrated classification error rate on the test set.
Step 17, according to the optimal selective integration combination obtained after iteration stops in step 16, select the columns participating in the selective integration from the SVM prediction result matrix predicted and saved for the test set in step 7, perform the selective integration, and compute the classification error rate of the integrated classifier on the test set.
The physical meanings of the symbols and parameters appearing in the CSJOGA-improved genetic algorithm of the above embodiment are shown in Table 7.
TABLE 7 - CSJOGA improved genetic algorithm symbol table
(Table provided as an image in the original patent document.)
In order to test the performance of the improved orthogonal crossover operator provided by the invention, 8 high-dimensional functions were selected as a test set and compared against other algorithms. For all functions the dimension is N = 30. Functions f1-f5 are multimodal functions, used to verify the global search capability of the algorithm.
The test function settings are shown in table 8:
TABLE 8 - Eight high-dimensional test function information
(Table provided as an image in the original patent document.)
The experiment was performed in Matlab 2012a with the following parameter settings: real number coding, population size n = 200, crossover probability P_cross = 0.6, mutation probability P_mutation = 0.1, and stopping generation total_gen = 120; each test function was run independently 10 times to test the stability of the algorithm's performance. CSJOGA was compared with the HSOGA, OGA/Q, and LEA algorithms in terms of the mean best value (M-best) and the standard deviation (St.dev). The results of the experiment are shown in Table 9.
TABLE 9 - Comparison of the results of the four algorithms (OGA/Q, LEA, HSOGA, CSJOGA)
(Table provided as an image in the original patent document.)
On functions f1 and f4-f5, the solution found by the CSJOGA algorithm approximates the global optimum of the function; on functions f2-f3 and f6-f8, CSJOGA successfully finds the global optimum.
Compared with the OGA/Q algorithm, CSJOGA is superior in both M-best and St.dev on functions f1, f3, and f4-f5; on functions f2 and f6-f8, both CSJOGA and OGA/Q successfully find the global optimum.
Compared with the LEA algorithm, CSJOGA is clearly superior in both M-best and St.dev on functions f1-f2 and f4-f8; on function f3, both CSJOGA and LEA successfully find the optimal solution.
Compared with the HSOGA algorithm, CSJOGA is clearly superior in both M-best and St.dev on functions f1 and f4-f5; on functions f2-f3 and f6-f8, both CSJOGA and HSOGA successfully find the global optimum of the function.
From the above experimental results, the CSJOGA algorithm proposed by the present invention shows good performance.
In addition to the above performance test of the improved orthogonal operator, an experiment was designed to test the improvement of the invention (the selective integrated face recognition method based on the CSJOGA algorithm) over the traditional method (face recognition based on bagging integration), mainly in terms of error rate and the number of integrated classifiers, on three data sets (ar, yale, and yaleb, each comprising a training set and a test set).
The experimental parameters were as follows:
in the bagging selective integration, the bagging times are 100 times, so that the total number of trained classifiers is 100.
In the CSJOGA improved genetic algorithm, the population size n is set to 200 individuals, the number of iterations per round total_gen is 120 generations, the total number of rounds is 6, the crossover rate P_cross is 0.6, the mutation rate P_mutation is 0.1, the coding length N is 100, and the coding value range is (0, 1).
The experimental conditions were as follows:
1) Programming environment: Matlab R2012a (7.14.0.739), 64-bit (Win64)
2) Processor: Intel i7-4790 3.60 GHz
3) Installed memory: 8.00 GB (7.89 GB available)
4) Operating system: Windows 7 64-bit Ultimate Edition
The results of the experiment are shown in tables 10 and 11:
TABLE 10 - Comparison of experimental results, in terms of error rate, of the present invention and the traditional face recognition method
(Table provided as an image in the original patent document.)
TABLE 11 comparison of experimental results of the present invention and conventional face recognition methods in terms of number of classifiers integrated
The experimental conclusion is as follows:
compared with the traditional face recognition method, the invention can further reduce the error rate of face recognition while greatly reducing the number of integrated classifiers, thereby reducing the storage and calculation expenses, effectively improving the recognition speed and the recognition accuracy and ensuring that the classification precision in practical application can meet the ideal requirement.

Claims (1)

1. A selective integrated face recognition method based on the CSJOGA algorithm, characterized in that it comprises:
stage 1) extracting HOG characteristics of a face image, and performing characteristic dimension reduction by using PCA to form a characteristic face;
step 1, extracting HOG characteristics from a multi-classification face data set;
step 2, performing dimensionality reduction on the HOG features of the human face extracted in the step 1 by using PCA, and storing the HOG features into a csv format;
step 3, carrying out format conversion on the csv format data set obtained in the step 2 to change the csv format data set into a format supported by libsvm;
stage 2) generating a plurality of data sets by using a bagging method, and respectively training on the training sets by using a multi-classification SVM;
step 4, bagging is carried out on the training data set in the libsvm format obtained in the step 3, and 100 new data sets with the same size as the original data set are generated;
step 5, training the 100 bagging data sets generated in the step 4 by using a multi-classification SVM respectively;
stage 3) respectively predicting each SVM model on a training set and a test set, adding label columns in a data set, and combining all model prediction result matrixes with labels;
step 6, respectively predicting the 100 SVM classifiers trained in the step 5 in a training set and a test set, and storing prediction results;
step 7, combining the 100 SVM prediction results stored in the step 6 and the label columns in the corresponding data set into a matrix for storage;
stage 4) solving the optimal selective integration combination by using a joint sparse projection-based improved hybrid orthogonal genetic algorithm;
step 8, carrying out population coding;
real number coding is adopted, the coding length N is 100, so that the real number coding length N corresponds to the 100 SVM classifiers trained in the step 5 one by one, and the coding value range is (0, 1); when the value of a certain bit code is less than or equal to Threshold which is 0.5, the SVM classifier represented by the bit is considered to participate in selective integration; otherwise, the system is not considered to participate in the selective integration; thus, the encoding of each individual represents a selective integration combination in itself;
step 9, generating an initialization population in an orthogonal mode;
generating n individuals by using a method of orthogonal initialization of population to form initial population P0The specific method comprises the following steps:
firstly, the feasible solution space is divided into S subspaces, and then an orthogonal table is constructed
Figure FDA0002490468380000011
According to orthogonal tables
Figure FDA0002490468380000012
Performing intersection operation on each subspace by using the proposed orthogonal intersection operator based on the joint sparse projection to generate an initial population P0
TABLE 1 - Subspace partitioning algorithm
(Algorithm provided as an image in the original patent document.)
Step 10, generating the population to be crossed P'_gen;
Let P_gen denote the current-generation population, where gen is the current evolution generation of the algorithm and total_gen is the total number of generations, gen ∈ {1, 2, …, total_gen}; individuals are selected from the current population P_gen with crossover probability P_cross and placed into the population to be crossed P'_gen;
step 11, performing a crossover operation;
for the population P 'to be crossed generated in the step 10'genObtaining a crossed population C by using an orthogonal cross operator based on joint sparse projectiongen
TABLE 2 - Joint-sparse-projection-based orthogonal crossover operator
(Algorithm provided as an image in the original patent document.)
The proposed joint-sparse-projection-based orthogonal crossover operator integrates three algorithms: adaptive minimum spanning tree clustering, joint sparse projection based on a DCT basis, and orthogonal table construction; the algorithms are as follows:
TABLE 3 - Adaptive minimum spanning tree clustering algorithm
(Algorithm provided as an image in the original patent document.)
TABLE 4 - Joint sparse projection algorithm based on a DCT basis
(Algorithm provided as an image in the original patent document.)
TABLE 5 - Construction of the orthogonal table L_M(Q^F)
(Algorithm provided as an image in the original patent document.)
Step 12, local search is executed;
for the population P 'to be crossed generated in the step 10'genImplementing a clustering local search strategy, placing the generated individuals into a population L after local search is performedgenPerforming the following steps;
TABLE 6 - Clustering local search strategy algorithm
(Algorithm provided as an image in the original patent document.)
Step 13, performing mutation operation;
for the population P 'to be crossed generated in the step 10'genWith a mutation probability PmutationNew population G after participating in mutation operation to generate mutationgen(ii) a Let p bei=(pi,1,pi,2,...,pi,N) Represents the ith individual to be mutated, wherein, i ∈ {1, 2.., n }, then p is the ith individual to be mutatediThe specific operation for carrying out mutation is as follows: let p beiEach bit of pi,j=lj+r*(uj-lj) Wherein r is in [0,1 ]]A fraction randomly generated within the range, j being in [1, N]An integer randomly generated within the range;
step 14, calculating the fitness;
integrating SVM by using a prediction result matrix on the training set obtained in the step (7) according to the current selective integration combination represented by the genetic algorithm population code, and calculating the integration classification error rate on the training set as the value of a fitness function f;
step 15, executing selection operation;
First, the individuals with the best fitness in (P_gen + C_gen + L_gen + G_gen) are placed into the next-generation population P_(gen+1); their count is given by a formula rendered as an image (FDA0002490468380000042) in the original. Here P_gen is the current-generation population, C_gen is the population after the crossover of step 11, L_gen is the population after the local search of step 12, G_gen is the population after the mutation of step 13, and n is the population size. The remaining slots of P_(gen+1), whose count is given by a second image formula (FDA0002490468380000043), are then filled by random selection from the remaining individuals of (P_gen + C_gen + L_gen + G_gen);
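The elite count in step 15 is given only by a formula image in the original; taking it to be n//2 (an assumption made purely for illustration), the selection can be sketched as:

```python
import random

def select_next_generation(pool, fitness, n):
    """Step-15-style selection over the merged pool
    (P_gen + C_gen + L_gen + G_gen); lower fitness (error) is better.
    Assumes len(pool) >= n; the n//2 elite fraction is an assumption."""
    ranked = sorted(pool, key=fitness)
    elite = ranked[: n // 2]                      # best individuals kept
    rest = ranked[n // 2:]                        # candidates for random fill
    filler = random.sample(rest, n - len(elite))  # fill remaining slots
    return elite + filler
```

Mixing elitism with random survivors preserves the best solutions while keeping diversity in P_(gen+1).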
step 16, judging iteration termination conditions;
If any of the following three conditions is satisfied, stop the iteration and output the result; otherwise go to step 10: (1) the iteration generation count reaches the stopping condition; (2) the best value has not changed for 50 consecutive generations; (3) the global optimal solution has been found;
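The three stopping conditions of step 16 can be sketched as follows (treating the global optimum as a training-set ensemble error of 0 is an assumption):

```python
def should_stop(gen, max_gen, best_history, global_optimum=0.0, patience=50):
    """Step-16 stopping test: stop when (1) the generation count reaches
    max_gen, (2) the best value is unchanged for `patience` consecutive
    generations, or (3) the global optimum has been reached."""
    if gen >= max_gen:
        return True
    if len(best_history) > patience and len(set(best_history[-(patience + 1):])) == 1:
        return True
    return bool(best_history) and best_history[-1] == global_optimum
```

`best_history` is the per-generation record of the best fitness value, so condition (2) checks the last 51 entries for stagnation.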
Stage 5) according to the optimal selective ensemble combination computed by the genetic algorithm, combine the SVMs using the prediction-result matrix on the test set and compute the ensemble classification error rate on the test set;
Step 17, according to the optimal selective ensemble combination obtained after the iteration stops in step 16, select the columns participating in the selective ensemble from the test-set prediction-result matrices of all SVMs predicted and stored in step 7, perform the selective ensemble, and compute the classification error rate of the ensemble classifier on the test set;
The physical meanings of the symbols and parameters appearing in the CSJOGA-improved genetic algorithm of the above embodiments are given in Table 7;
TABLE 7-Symbol table of the CSJOGA-improved genetic algorithm
[Table 7 rendered as an image (FDA0002490468380000051) in the original document]
CN201710432432.3A 2017-06-09 2017-06-09 Selective integrated face recognition method based on CSJOGA algorithm Active CN107273842B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710432432.3A CN107273842B (en) 2017-06-09 2017-06-09 Selective integrated face recognition method based on CSJOGA algorithm

Publications (2)

Publication Number Publication Date
CN107273842A CN107273842A (en) 2017-10-20
CN107273842B true CN107273842B (en) 2020-07-03

Family

ID=60065989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710432432.3A Active CN107273842B (en) 2017-06-09 2017-06-09 Selective integrated face recognition method based on CSJOGA algorithm

Country Status (1)

Country Link
CN (1) CN107273842B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108509487B (en) * 2018-02-08 2022-09-30 杨睿嘉 Image retrieval method, device and storage medium based on pulse-issued cortex model
CN113505695A (en) * 2021-07-09 2021-10-15 上海工程技术大学 AEHAL characteristic-based track fastener state detection method
CN113657178A (en) * 2021-07-22 2021-11-16 浙江大华技术股份有限公司 Face recognition method, electronic device and computer-readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090161912A1 (en) * 2007-12-21 2009-06-25 Raviv Yatom method for object detection
US8478005B2 (en) * 2011-04-11 2013-07-02 King Fahd University Of Petroleum And Minerals Method of performing facial recognition using genetically modified fuzzy linear discriminant analysis
CN103246874B (en) * 2013-05-03 2017-02-15 北京工业大学 Face identification method based on JSM (joint sparsity model) and sparsity preserving projection
CN103793694B (en) * 2014-02-10 2017-02-08 天津大学 Human face recognition method based on multiple-feature space sparse classifiers
CN103793695B (en) * 2014-02-10 2017-11-28 天津大学 A kind of method of the sub- dictionary joint training of multiple feature spaces for recognition of face
CN104008375B (en) * 2014-06-04 2017-08-25 北京工业大学 The integrated face identification method of feature based fusion
CN106503648A (en) * 2016-10-20 2017-03-15 北京邮电大学 Face identification method and device based on sparse projection binary-coding



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant