CN108446640A - Feature extraction optimization method based on image character facial expression recognition - Google Patents

Feature extraction optimization method based on image character facial expression recognition

Info

Publication number
CN108446640A
Authority
CN
China
Prior art keywords
particle
image
matrix
representing
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810237230.8A
Other languages
Chinese (zh)
Inventor
陈志
刘玲
岳文静
周传
陈璐
掌静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Post and Telecommunication University
Original Assignee
Nanjing Post and Telecommunication University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Post and Telecommunication University filed Critical Nanjing Post and Telecommunication University
Priority to CN201810237230.8A priority Critical patent/CN108446640A/en
Publication of CN108446640A publication Critical patent/CN108446640A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/006Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/12Computing arrangements based on biological models using genetic models
    • G06N3/126Evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physiology (AREA)
  • Genetics & Genomics (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a feature extraction optimization method based on image character facial expression recognition. The invention first inputs a face image represented in matrix form and, after setting the projection axes, obtains the projection feature vectors. The projection feature covariance matrix of a given image is set, and its trace describes the optimal projection direction function. The optimal projection direction function is updated according to the total scatter matrix of the training sample images; the set of projection axes that maximize this function updates the projection feature vectors, which form a matrix representing the expression features. Each element in the feature matrix is then given a weight, and the optimal projection direction, i.e. the global optimal solution, is optimized by a particle swarm algorithm modified with Gaussian mutation. Finally, the global optimal solutions of the particle swarm algorithm are divided into a two-level population of leaders and followers, representing the main parts of the character's facial expression, for secondary optimization and recognition. The invention can effectively distinguish and optimize the facial expression features of image characters.

Description

Feature extraction optimization method based on image character facial expression recognition
Technical Field
The invention relates to a feature extraction optimization method based on image character facial expression recognition, which mainly uses a two-dimensional principal component analysis method based on statistical feature extraction together with an improved particle swarm optimization algorithm to optimize the solution of the image matrix, and belongs to the cross-disciplinary application field of image processing, pattern recognition and computer vision.
Background
The purpose of the feature extraction optimization method is to prevent the accuracy of facial expression recognition from being reduced by premature convergence when extracting the main features. Recognizing people's facial expressions from images has become a hot problem in current artificial intelligence applications and plays an important role in computer vision for analyzing people's emotional changes in real time. The following three methods are mainly used for extracting facial expression features:
(1) Principal component analysis: the most commonly used linear dimensionality reduction method. Its goal is to map high-dimensional data into a lower-dimensional representation through a linear projection such that the variance of the data along the projected dimensions is maximized, thereby using fewer data dimensions while preserving more of the structure of the original data points.
(2) Particle swarm optimization: a method based on population iteration, in which the particles search the solution space by following the current optimal particles. Its advantages are that no gradient information is needed and few parameters are required; in particular, its natural real-number encoding is well suited to real-valued optimization problems.
(3) Genetic algorithm: a search method based on the evolutionary ideas of natural selection, survival of the fittest and genetic inheritance. Its aim is to improve candidate solutions in competition by simulating the survival competition, genetic variation and other hereditary behaviors of organisms in the natural environment, so as to obtain the optimal solution of the problem.
Disclosure of Invention
The technical problem is as follows: the invention aims to make up for the deficiencies of feature extraction in existing image-based facial expression recognition, and provides a feature extraction optimization method based on image character facial expression recognition. The method combines an improved principal component analysis method with a particle swarm algorithm that incorporates a genetic algorithm, addresses the detailed extraction of the facial expressions of persons in images, and avoids the problems of excessive data volume and premature convergence during extraction. The method effectively extracts accurate facial expression features and is characterized by local optimization and global expression diversity.
The technical scheme is as follows: the invention relates to a feature extraction optimization method based on image character facial expression recognition, which comprises the following steps:
step 1): inputting an m × n facial image matrix A and setting Y = AX, wherein m represents the number of columns of the image matrix, n represents the number of rows of the image matrix, X represents a projection axis of the image, and Y represents the projection feature vector of the image;
step 11): setting M and {x_1, x_2, ..., x_M}, wherein M denotes the number of training sample images and {x_1, x_2, ..., x_M} denotes the set of images in the training sample; setting {Q_1, Q_2, ..., Q_N} to indicate that the training sample images come from N expressions Q, wherein the number of images from the k-th expression is M_k; calculating the mean x̄ = (1/M)Σ_{i=1}^{M} x_i of the training sample images x, and calculating S_t = (1/M)Σ_{i=1}^{M}(x_i − x̄)^T(x_i − x̄), wherein S_t denotes the total scatter matrix of the training sample images x;
step 12): setting J(X) = tr(S_x), wherein S_x refers to the covariance matrix of the projection feature vectors Y of the training sample images x, given by S_x = E[(Y − EY)(Y − EY)^T], and tr(S_x) is the trace of S_x; J(X) is the optimal projection direction function; combining this formula with the expression for S_t in step 11) updates the optimal projection direction function to J(X) = X^T S_t X; when X_opt = {X_1, X_2, ..., X_j}, the image projection feature vectors Y_j = A·X_j compose the matrix B = [Y_1, Y_2, ..., Y_j], wherein X_opt denotes the eigenvectors corresponding to the j largest eigenvalues maximizing J(X), and the matrix B represents the principal component features of the face image matrix A;
step 2): optimizing the matrix B representing the expression features, specifically comprising the following steps:
step 21): assigning each element in Y an optimization weight, the weights being expressed in matrix form as W, with W* set as the optimal solution of the matrix W; setting C_i = (c_i1, c_i2, ..., c_iD) and V_i = (v_i1, v_i2, ..., v_iD), wherein C_i denotes the position of the i-th particle, V_i denotes the velocity of the i-th particle, and D denotes the dimension of the particles in the search space; setting pbest to denote the position corresponding to the best fitness obtained by each particle during its own search, i.e. the local optimal solution, and gbest to denote the solution corresponding to the highest fitness value among all the particles' local optimal solutions, i.e. the global optimal solution;
step 22): updating the velocity and position of each particle:
v_id^(t+1) = w·v_id^t + h_1·r_1·(p_id − c_id^t) + h_2·r_2·(p_gd − c_id^t)
c_id^(t+1) = c_id^t + v_id^(t+1)
wherein t and d represent the t-th iteration and the d-th of the D dimensions respectively, the inertia weight w represents the influence of the previous velocity on the iteration, r_1 and r_2 represent random values in the range [0,1], h_1 and h_2 refer to the acceleration constants, and p_id and p_gd denote the elements of pbest and gbest in the d-th dimension;
step 23): modifying the velocity of the particles:
v_id^(t+1) = w·v_id^t + h_1·r_1·(p̄_d − c_id^t) + h_2·r_2·(φ(p_gd, h) − c_id^t)
wherein p̄_d means that, in the d-th dimension, pbest is replaced by the average of the particles' pbest values, and φ(p_gd, h) means that, in the d-th dimension, a Gaussian distribution is applied to gbest; φ(o, h) represents a Gaussian distribution with o as the mean and h as the standard deviation; x_d^max and x_d^min represent the upper and lower bounds of the decision vector in the d-th dimension;
step 24): setting the fitness function of the particles as Fit(G) = w_a·acc_g + w_f·(num_feat_f)^(−1), wherein acc_g represents the classification performance, num_feat_f represents the number of selected features, and G represents the criterion composed of acc_g and num_feat_f; the classification accuracy score represents the accuracy on each expression; w_a and w_f are two predetermined weights for the classification performance and the number of selected features respectively, with w_a = 1 − w_f;
Step 25): iteratively solving for the particles; if the iteration count has not been reached, returning to step 2) and re-evaluating the fitness of each particle;
step 3): dividing the particles into different facial expression parts, with the following steps:
step 31): generating a secondary population, setting a leader particle L to represent the best particle after multiple iterations and following particles F to represent the particles with the lowest correlation to the best particle; otherwise, returning to step 25) and re-evaluating the fitness of the particles;
step 32): dividing each particle in the secondary population into several feature sub-parts, each sub-part consisting of the partial dimensions indicating a specific facial region; repeating the operations in step 2), updating the optimal solution of each feature sub-part, replacing the particles in the secondary population with the updated optimal solutions, and setting {W_1*, W_2*, ..., W_n*} to represent the n optimized global optimal solutions W*;
Step 4): multiplying each element in the feature matrix B with each element of W* for clustering, calculating the distance between the input image and the feature matrices B of the training samples, and classifying the expression features with an artificial neural network.
Wherein,
in the step 21), the number of the particles is empirically 30, which is indicated to be beneficial to the re-optimization of the particle population.
In said step 22), the acceleration constants are h_1 = h_2 = 2.
In said step 24), w_a is empirically set to 0.9 and w_f to 0.1, at which point the fitness of the particles is optimal.
In the step 31), the leader particle L is empirically set to 1 and the following particles F to 4, which indicates the precision of the retrievable range.
In said step 32), each particle in the secondary population is empirically divided into 5 feature sub-parts, representing the 5 most recognizable parts of a person's facial expression.
Has the advantages that: the feature extraction optimization method based on image character facial expression recognition provided by the invention has the following specific beneficial effects:
(1) by improving the principal component analysis method, the invention improves the locating and analysis of the principal components in the image and the accuracy of expression recognition, and the two-dimensional image extraction method effectively reduces the amount and dimensionality of the data computation;
(2) the method assigns optimization weights to the elements of the expression image matrix whose principal components have been extracted, and, following the idea of the particle swarm algorithm, regards the solution of each optimization problem as a particle in the search space and optimizes globally by changing parameters such as the velocity and position of the particle swarm, achieving real-time performance;
(3) the method is combined with an improved genetic algorithm, which alleviates the premature convergence of data in expression recognition, enhances the diversity of the particles in the search space, and makes useful data less likely to be lost.
Drawings
Fig. 1 is a flow chart of principal component feature extraction for an expression image of a human face.
Fig. 2 is a flowchart describing the secondary clustering of the improved particle swarm algorithm.
FIG. 3 is a flow chart of a process for optimizing extracted features of a principal component by a PSO algorithm incorporating an embedded genetic algorithm.
Detailed Description
Some embodiments of the invention are described in more detail below with reference to the accompanying drawings.
In a specific implementation, fig. 1 is a flow chart of principal component feature extraction for a facial expression image. An input face image of size 30 × 20 is first set as a matrix A. X is a 20-dimensional column vector representing a projection axis; the linear transformation Y = AX projects A directly onto X to obtain a 30-dimensional column vector Y representing the projection feature, and the optimal projection axis X_opt can be determined from the scatter distribution of the feature vectors Y. The trace of the covariance matrix S_x of the projection feature vectors Y of the training samples, where the training sample images come from an existing image database used to map the given facial image, represents the optimal projection direction function J(X) = tr(S_x), with the covariance matrix formulated as S_x = E[(Y − EY)(Y − EY)^T].
M facial expression images of size 30 × 20 are used as training samples. The images come from N expressions, where the number of images from the k-th expression is M_k. Calculate the mean of all samples, x̄ = (1/M)Σ_{i=1}^{M} x_i, and the total scatter matrix S_t = (1/M)Σ_{i=1}^{M}(x_i − x̄)^T(x_i − x̄). Combining all the formulas gives J(X) = X^T S_t X, and the optimal projection axes X_opt are the eigenvectors corresponding to the largest eigenvalues of this function. If there are 20 axes X_opt satisfying the J(X) function, then X_opt = {X_1, X_2, ..., X_20}, Y_20 = A·X_20, and the principal component feature matrix of image A is B = [Y_1, Y_2, ..., Y_20].
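To make the 2DPCA step concrete, the following is a minimal NumPy sketch under the standard two-dimensional PCA formulation (mean image, image scatter matrix, eigenvectors of the largest eigenvalues). The shapes follow the 30 × 20 example above; the function name, the random stand-in data and the parameter j are illustrative, not taken from the patent.

```python
import numpy as np

def two_d_pca(images, j=20):
    """2DPCA sketch: images is an (M, m, n) array of training samples.

    Returns the j optimal projection axes X_opt as an (n, j) matrix; the
    principal component feature matrix of an image A is then B = A @ X_opt.
    """
    M = images.shape[0]
    mean = images.mean(axis=0)                      # mean image x_bar
    # total scatter matrix S_t = (1/M) * sum (x_i - x_bar)^T (x_i - x_bar)
    St = sum((a - mean).T @ (a - mean) for a in images) / M
    eigvals, eigvecs = np.linalg.eigh(St)           # S_t is symmetric
    order = np.argsort(eigvals)[::-1]               # largest eigenvalues first
    return eigvecs[:, order[:j]]

# usage on stand-in data shaped like the example: 100 images of size 30 x 20
rng = np.random.default_rng(0)
train = rng.random((100, 30, 20))
X_opt = two_d_pca(train, j=20)                      # 20 x 20 projection axes
B = train[0] @ X_opt                                # feature matrix B, 30 x 20
```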
Next, the principal component feature matrix B obtained above is optimized; fig. 2 is a flowchart specifically describing the secondary clustering of the improved particle swarm algorithm. Since the particles move in the search space, the optimal solution is found by following the global optimal solution. A particle swarm of 30 particles is therefore initialized, and each element in Y is given an optimization weight, so that the weights in matrix form are W and the optimal solution of W is W*. Each particle Z_i is a candidate solution of W, with position C_i = (c_i1, c_i2, ..., c_iD) and velocity V_i = (v_i1, v_i2, ..., v_iD) in the D-dimensional search space. pbest is the position corresponding to the best fitness obtained by each particle during its own search, i.e. the local optimal solution, and gbest is the solution corresponding to the highest fitness value among all the particles' local optimal solutions, i.e. the global optimal solution.
The velocity and position of each particle are updated as v_id^(t+1) = w·v_id^t + h_1·r_1·(p_id − c_id^t) + h_2·r_2·(p_gd − c_id^t) and c_id^(t+1) = c_id^t + v_id^(t+1), where t and d are the t-th iteration and the d-th of the D dimensions respectively, the inertia weight w is used to embed the influence of the previous velocity in the iteration, r_1 and r_2 are random values in [0,1], h_1 and h_2 are acceleration constants, typically 2, and p_id and p_gd represent the elements of pbest and gbest in the d-th dimension.
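Since the update just described matches the standard PSO formulation, one iteration can be sketched as follows; the inertia weight default w = 0.7 and all array names are our assumptions, while h_1 = h_2 = 2 follows the text.

```python
import numpy as np

def pso_step(C, V, pbest, gbest, w=0.7, h1=2.0, h2=2.0, rng=None):
    """One velocity/position update for the whole swarm.

    C, V, pbest : (S, D) arrays of positions, velocities and per-particle bests
    gbest       : (D,) array, best position found by any particle so far
    """
    rng = rng or np.random.default_rng()
    r1 = rng.random(C.shape)            # element-wise random values in [0, 1]
    r2 = rng.random(C.shape)
    V = w * V + h1 * r1 * (pbest - C) + h2 * r2 * (gbest - C)
    return C + V, V                     # new positions and velocities
```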
To better maintain the ability to search diversely in the d-th dimension, the velocity of the particles is modified: pbest is replaced by the average p̄_d of the particles' pbest values in the d-th dimension, and gbest is perturbed with a Gaussian distribution φ(p_gd, h). Here φ(o, h) is a Gaussian distribution, o represents its mean, h serves as a standard deviation that drops linearly during execution, and x_d^max and x_d^min are the upper and lower bounds of the decision vector in the d-th dimension.
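A sketch of this diversity-preserving variant, under the assumption that it keeps the shape of the standard update while substituting the averaged pbest and a Gaussian sample around gbest; the linear decay of the standard deviation is suggested by the text, but its exact schedule here is our guess.

```python
import numpy as np

def pso_step_mutated(C, V, pbest, gbest, sigma, w=0.7, h1=2.0, h2=2.0, rng=None):
    """Gaussian-mutation variant of the PSO update.

    sigma : standard deviation of the Gaussian around gbest; assumed to drop
            linearly over the run, e.g. sigma_t = sigma_0 * (1 - t / T).
    """
    rng = rng or np.random.default_rng()
    p_bar = pbest.mean(axis=0)                    # average pbest per dimension
    g_mut = rng.normal(loc=gbest, scale=sigma)    # Gaussian phi(gbest, sigma)
    r1, r2 = rng.random(C.shape), rng.random(C.shape)
    V = w * V + h1 * r1 * (p_bar - C) + h2 * r2 * (g_mut - C)
    return C + V, V
```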
The fitness of each particle is then evaluated. The criterion G is composed of the classification performance acc_g and the number of selected features num_feat_f, where the classification accuracy score represents the accuracy on each expression rather than the combined accuracy over all emotion categories; this helps avoid bias toward specific emotion categories during optimization, so that the localization of expressions is more refined. Fit(G) = w_a·acc_g + w_f·(num_feat_f)^(−1) is the fitness function of the particles. w_a and w_f are two predetermined weights for the classification performance and the number of selected features, representing their relative importance, with w_a = 1 − w_f. When w_a = 0.9 and w_f = 0.1, the fitness function is optimal. If the particle iterations have not reached the set count, the fitness of each particle is re-evaluated, and the update of the swarm's velocities, local optimal solutions and global optimal solution is performed again.
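The fitness function itself is a one-liner; a small worked example with the empirical weights w_a = 0.9, w_f = 0.1 from above (the accuracy and feature count in the usage line are made-up numbers):

```python
def fitness(acc_g, num_feat, w_a=0.9):
    """Fit(G) = w_a * acc_g + w_f / num_feat, with w_f = 1 - w_a."""
    return w_a * acc_g + (1.0 - w_a) / num_feat

print(fitness(0.85, 120))   # 0.9 * 0.85 + 0.1 / 120 = 0.7658...
```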
The particle swarm then generates a secondary population: after multiple iterations, 1 optimal particle is taken as the leader particle L of the secondary population, and the 4 following particles F with the lowest correlation to the optimal particle are selected. During the search of the leader particle L and the following particles F, pbest and gbest are continuously updated, which enhances the local exploitation and diversity of the search.
Finally, each particle in the secondary population is divided into 5 feature sub-parts, each consisting of the partial dimensions indicating a specific facial area, e.g. nose, mouth, eyes, eyebrows and cheeks, where the 5 expressive features stand out; this facilitates the localization and recognition of expressions. The update and modification operations on the particle population are repeated continuously, updating the optimal solution of the corresponding feature sub-part, and the particles in the secondary population are replaced by the optimal solution obtained each time, finally yielding the optimized global optimal solutions {W_1*, ..., W_n*}. Each element in the feature matrix B is multiplied element-wise with each W* for clustering, the distance between the given image and the training samples' feature matrices is calculated, and the expression features are classified with an artificial neural network. FIG. 3 is a flow chart of the process of optimizing the extracted principal component features by the PSO algorithm combined with the embedded genetic algorithm, and the whole method ends.
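The final weighting and classification of step 4) might be sketched as below. The element-wise multiplication by W* and the distance computation follow the description above; the nearest-distance decision is a simple stand-in for the artificial neural network classifier, whose architecture the text does not specify.

```python
import numpy as np

def classify(B_test, train_feats, train_labels, W_star):
    """Weight feature matrices element-wise by W* and pick the nearest sample.

    B_test      : feature matrix of the input image (same shape as W_star)
    train_feats : list of training feature matrices B
    train_labels: expression label of each training sample
    """
    q = B_test * W_star                              # Hadamard weighting by W*
    dists = [np.linalg.norm(q - f * W_star) for f in train_feats]
    return train_labels[int(np.argmin(dists))]       # nearest-distance class
```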

Claims (6)

1. A feature extraction optimization method based on image character facial expression recognition is characterized by comprising the following steps:
step 1): inputting an m × n facial image matrix A and setting Y = AX, wherein m represents the number of columns of the image matrix, n represents the number of rows of the image matrix, X represents a projection axis of the image, and Y represents the projection feature vector of the image;
step 11): setting M and {x_1, x_2, ..., x_M}, wherein M denotes the number of training sample images and {x_1, x_2, ..., x_M} denotes the set of images in the training sample; setting {Q_1, Q_2, ..., Q_N} to indicate that the training sample images come from N expressions Q, wherein the number of images from the k-th expression is M_k; calculating the mean x̄ = (1/M)Σ_{i=1}^{M} x_i of the training sample images x, and calculating S_t = (1/M)Σ_{i=1}^{M}(x_i − x̄)^T(x_i − x̄), wherein S_t denotes the total scatter matrix of the training sample images x;
step 12): setting J(X) = tr(S_x), wherein S_x refers to the covariance matrix of the projection feature vectors Y of the training sample images x, given by S_x = E[(Y − EY)(Y − EY)^T], and tr(S_x) is the trace of S_x; J(X) is the optimal projection direction function; combining this formula with the expression for S_t in step 11) updates the optimal projection direction function to J(X) = X^T S_t X; when X_opt = {X_1, X_2, ..., X_j}, the image projection feature vectors Y_j = A·X_j compose the matrix B = [Y_1, Y_2, ..., Y_j], wherein X_opt denotes the eigenvectors corresponding to the j largest eigenvalues maximizing J(X), and the matrix B represents the principal component features of the face image matrix A;
step 2): optimizing the matrix B representing the expression features, specifically comprising the following steps:
step 21): assigning each element in Y an optimization weight, the weights being expressed in matrix form as W, with W* set as the optimal solution of the matrix W; setting C_i = (c_i1, c_i2, ..., c_iD) and V_i = (v_i1, v_i2, ..., v_iD), wherein C_i denotes the position of the i-th particle, V_i denotes the velocity of the i-th particle, and D denotes the dimension of the particles in the search space; setting pbest to denote the position corresponding to the best fitness obtained by each particle during its own search, i.e. the local optimal solution, and gbest to denote the solution corresponding to the highest fitness value among all the particles' local optimal solutions, i.e. the global optimal solution;
step 22): updating the velocity and position of each particle:
v_id^(t+1) = w·v_id^t + h_1·r_1·(p_id − c_id^t) + h_2·r_2·(p_gd − c_id^t)
c_id^(t+1) = c_id^t + v_id^(t+1)
wherein t and d represent the t-th iteration and the d-th of the D dimensions respectively, the inertia weight w represents the influence of the previous velocity on the iteration, r_1 and r_2 represent random values in the range [0,1], h_1 and h_2 refer to the acceleration constants, and p_id and p_gd denote the elements of pbest and gbest in the d-th dimension;
step 23): modifying the velocity of the particles:
v_id^(t+1) = w·v_id^t + h_1·r_1·(p̄_d − c_id^t) + h_2·r_2·(φ(p_gd, h) − c_id^t)
wherein p̄_d means that, in the d-th dimension, pbest is replaced by the average of the particles' pbest values, and φ(p_gd, h) means that, in the d-th dimension, a Gaussian distribution is applied to gbest; φ(o, h) represents a Gaussian distribution with o as the mean and h as the standard deviation; x_d^max and x_d^min represent the upper and lower bounds of the decision vector in the d-th dimension;
step 24): setting the fitness function of the particles as Fit(G) = w_a·acc_g + w_f·(num_feat_f)^(−1), wherein acc_g represents the classification performance, num_feat_f represents the number of selected features, and G represents the criterion composed of acc_g and num_feat_f; the classification accuracy score represents the accuracy on each expression; w_a and w_f are two predetermined weights for the classification performance and the number of selected features respectively, with w_a = 1 − w_f;
Step 25): iteratively solving for the particles; if the iteration count has not been reached, returning to step 2) and re-evaluating the fitness of each particle;
step 3): dividing the particles into different facial expression parts, with the following steps:
step 31): generating a secondary population, setting a leader particle L to represent the best particle after multiple iterations and following particles F to represent the particles with the lowest correlation to the best particle; otherwise, returning to step 25) and re-evaluating the fitness of the particles;
step 32): dividing each particle in the secondary population into several feature sub-parts, each sub-part consisting of the partial dimensions indicating a specific facial region; repeating the operations in step 2), updating the optimal solution of each feature sub-part, replacing the particles in the secondary population with the updated optimal solutions, and setting {W_1*, W_2*, ..., W_n*} to represent the n optimized global optimal solutions W*;
Step 4): multiplying each element in the feature matrix B with each element of W* for clustering, calculating the distance between the input image and the feature matrices B of the training samples, and classifying the expression features with an artificial neural network.
2. The method as claimed in claim 1, wherein in the step 21), the number of particles is empirically set to 30, which is favorable for the re-optimization of the particle population.
3. The method as claimed in claim 1, wherein in the step 22), the acceleration constants are h_1 = h_2 = 2.
4. The method as claimed in claim 1, wherein in the step 24), w_a is empirically set to 0.9 and w_f to 0.1, at which point the fitness of the particles is optimal.
5. The method as claimed in claim 1, wherein in the step 31), the leader particle L is empirically set to 1 and the following particles F to 4, representing the precision of the retrievable range.
6. The method as claimed in claim 1, wherein in the step 32), each particle in the secondary population is empirically divided into 5 feature sub-parts representing the 5 most recognizable parts of a person's facial expression.
CN201810237230.8A 2018-03-21 2018-03-21 Feature extraction optimization method based on image character facial expression recognition Pending CN108446640A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810237230.8A CN108446640A (en) 2018-03-21 2018-03-21 Feature extraction optimization method based on image character facial expression recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810237230.8A CN108446640A (en) 2018-03-21 2018-03-21 Feature extraction optimization method based on image character facial expression recognition

Publications (1)

Publication Number Publication Date
CN108446640A true CN108446640A (en) 2018-08-24

Family

ID=63196440

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810237230.8A Pending CN108446640A (en) Feature extraction optimization method based on image character facial expression recognition

Country Status (1)

Country Link
CN (1) CN108446640A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169561A (en) * 2017-05-09 2017-09-15 广西师范大学 Towards the hybrid particle swarm impulsive neural networks mapping method of power consumption
CN107547457A (en) * 2017-09-15 2018-01-05 重庆大学 A kind of approach for blind channel equalization based on Modified particle swarm optimization BP neural network

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169561A (en) * 2017-05-09 2017-09-15 广西师范大学 Towards the hybrid particle swarm impulsive neural networks mapping method of power consumption
CN107547457A (en) * 2017-09-15 2018-01-05 重庆大学 A kind of approach for blind channel equalization based on Modified particle swarm optimization BP neural network

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BIN JIAO: "An Improved Cooperative Quantum Particle Swarm Optimization algorithm for", 《2010 INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTATION TECHNOLOGY AND AUTOMATION》 *
DONG LI: "Research on Improved Particle-Swarm-Optimization Algorithm based on", 《2017 29TH CHINESE CONTROL AND DECISION CONFERENCE (CCDC)》 *
吴晓军, 杨战中, 赵明: "Uniform search particle swarm algorithm" (均匀搜索粒子群算法), 《电子学报》 (Acta Electronica Sinica) *
张伟, 师奕兵, 周龙甫, 卢涛: "Wavelet neural network classifier based on an improved particle swarm algorithm" (基于改进粒子群算法的小波神经网络分类器), 《仪器仪表学报》 (Chinese Journal of Scientific Instrument) *
陈幼: "Research on facial expression recognition based on Gabor wavelets and particle swarm optimization" (基于Gabor小波与粒子群优化算法的人脸表情识别研究), 《中国优秀硕士学位论文全文数据库》 (China Masters' Theses Full-text Database) *

Similar Documents

Publication Publication Date Title
CN110414377B (en) Remote sensing image scene classification method based on scale attention network
CN109190524B (en) Human body action recognition method based on generation of confrontation network
CN105701502B (en) Automatic image annotation method based on Monte Carlo data equalization
CN114841257B (en) Small sample target detection method based on self-supervision comparison constraint
CN110443257B (en) Significance detection method based on active learning
CN107992850B (en) Outdoor scene three-dimensional color point cloud classification method
CN108121781B (en) Related feedback image retrieval method based on efficient sample selection and parameter optimization
CN103914705B (en) Hyperspectral image classification and wave band selection method based on multi-target immune cloning
CN111833322B (en) Garbage multi-target detection method based on improved YOLOv3
CN106845528A (en) A kind of image classification algorithms based on K means Yu deep learning
CN109635140B (en) Image retrieval method based on deep learning and density peak clustering
CN113032613B (en) Three-dimensional model retrieval method based on interactive attention convolution neural network
CN110188668B (en) Small sample video action classification method
CN106844620B (en) View-based feature matching three-dimensional model retrieval method
CN104408760A (en) Binocular-vision-based high-precision virtual assembling system algorithm
CN113435108B (en) Battlefield target grouping method based on improved whale optimization algorithm
CN103065158A (en) Action identification method of independent subspace analysis (ISA) model based on relative gradient
CN104318271B (en) Image classification method based on adaptability coding and geometrical smooth convergence
CN108596186B (en) Three-dimensional model retrieval method
CN107220597B (en) Key frame selection method based on local features and bag-of-words model human body action recognition process
CN109948662B (en) Face image depth clustering method based on K-means and MMD
CN114462466A (en) Deep learning-oriented data depolarization method
Ghosal et al. A comparative study among clustering techniques for leaf segmentation in rosette plants
CN107045520A (en) A kind of vehicle image search method that words tree is weighted based on positional information
CN115273645B (en) Map making method for automatically clustering indoor surface elements

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180824