CN116702018A - GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device - Google Patents


Info

Publication number
CN116702018A
CN116702018A (application CN202310420313.1A)
Authority
CN
China
Prior art keywords
pdpl
dictionary
model
electroencephalogram emotion
electroencephalogram
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310420313.1A
Other languages
Chinese (zh)
Other versions
CN116702018B (en)
Inventor
苏吉普
常洪丽
胡静
宋铁成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202310420313.1A priority Critical patent/CN116702018B/en
Publication of CN116702018A publication Critical patent/CN116702018A/en
Application granted granted Critical
Publication of CN116702018B publication Critical patent/CN116702018B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/16 - Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/12 - Computing arrangements based on biological models using genetic models
    • G06N3/126 - Evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Evolutionary Computation (AREA)
  • Pure & Applied Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computational Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Algebra (AREA)
  • Physiology (AREA)
  • Genetics & Genomics (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a cross-test electroencephalogram (EEG) emotion recognition method and device based on a GA-PDPL algorithm. The method comprises the following steps: inputting the EEG emotion data to be identified into a pre-trained GA-PDPL model, and calculating the residual between the EEG emotion data and each EEG emotion class by using the EEG emotion data together with the comprehensive dictionary and the analysis dictionary output by the GA-PDPL model; the GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to the DPL model and introducing a coding coefficient matrix, is trained with EEG emotion data from a plurality of subjects, and has its parameters optimized with a genetic algorithm during training; and taking the EEG emotion class corresponding to the minimum residual value among the residual values as the emotion class of the EEG emotion data to be identified. The method provided by the invention achieves a high recognition speed and high accuracy.

Description

GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device
Technical Field
The invention relates to the technical field of electroencephalogram (EEG) signals, and in particular to a GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device.
Background
The electroencephalogram (EEG) changes with emotion, so emotion recognition can be performed on the basis of EEG signals.
Because EEG signals can vary greatly between individuals, one of the biggest challenges in EEG emotion recognition is developing models that generalize to new, unseen subjects. Existing emotion recognition models suffer from low recognition speed and accuracy, which remains to be solved.
Disclosure of Invention
The invention provides a GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method with high recognition speed and high accuracy.
The embodiment of the first aspect of the invention provides a GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method, which comprises the following steps: acquiring electroencephalogram emotion data to be identified; inputting the electroencephalogram emotion data to be identified into a pre-trained GA-PDPL model, and calculating the residual between the electroencephalogram emotion data and each electroencephalogram emotion class by using the electroencephalogram emotion data together with the comprehensive dictionary and the analysis dictionary output by the GA-PDPL model, so as to obtain a plurality of residual values, wherein the GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to the DPL model and introducing a coding coefficient matrix, is trained with electroencephalogram emotion samples from a plurality of subjects and the corresponding electroencephalogram emotion class labels, and has its parameters optimized with a genetic algorithm during training; and taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified.
Optionally, in one embodiment of the present invention, before the electroencephalogram emotion data to be identified are input into the pre-trained GA-PDPL model, the method further includes:
adding dictionary pairs in the DPL model to build the PDPL model to obtain:
wherein D is the comprehensive dictionary, P is the analysis dictionary, K is the number of electroencephalogram emotion classes, F_k is the electroencephalogram emotion sample matrix of emotion class k, λ is a scalar constant, F̄_k is the complement of F_k in the whole training set F, and d_i is the i-th atom of D;
introducing a coding coefficient matrix A to relax the PDPL model to obtain:
wherein A is a coding coefficient matrix;
training the GA-PDPL model with electroencephalogram emotion samples from a plurality of subjects and the corresponding electroencephalogram emotion class labels, updating the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model so that the PDPL model is minimized, and optimizing a plurality of empirical parameters of the GA-PDPL model with a genetic algorithm during the updating process.
Optionally, in one embodiment of the present invention, training the GA-PDPL model with electroencephalogram emotion samples from multiple subjects and the corresponding electroencephalogram emotion class labels, and updating the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model so that the PDPL model is minimized, includes:
1) The comprehensive dictionary D and the analysis dictionary P are fixed, and the coding coefficient matrix A is updated:
where τ is a scalar constant;
a closed-form solution of the coding coefficient matrix A is obtained:
wherein I is an identity matrix;
2) Fixing the coding coefficient matrix A, updating the comprehensive dictionary D and the analysis dictionary P:
a closed-form solution of the analysis dictionary P is obtained:
wherein γ is a scalar constant;
introducing a variable S to optimize the comprehensive dictionary D:
the optimal solution of the comprehensive dictionary D is obtained through an ADMM algorithm:
and carrying out multi-round optimization training on the PDPL model through steps 1) and 2), updating the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model so as to minimize the PDPL model.
Optionally, in one embodiment of the present invention, optimizing the parameters of the constructed GA-PDPL model with a genetic algorithm during training includes:
initializing: generating an initial solution population having randomly assigned parameter values;
evaluation: evaluating the fitness of each solution in the population by using the projective dictionary pair learning algorithm and a fitness function;
selecting: selecting a subset of solutions to be used as next-generation parents according to fitness;
crossover: creating new solutions by recombining the parameters of the selected parents;
mutation: introducing random variations into the parameters of certain solutions to explore new areas of the search space;
replacement: selecting the best solutions from the previous generation and the new generation to form the next generation;
terminating: terminating the algorithm when a stop condition is satisfied;
outputting: returning the optimal solution found by the genetic algorithm, which corresponds to the optimal empirical parameter values of the GA-PDPL model.
Optionally, in an embodiment of the present invention, the decision formula for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified is:
wherein f_t is the electroencephalogram emotion data to be identified, D_i is the comprehensive sub-dictionary of class i, and P_i is the analysis sub-dictionary of class i.
An embodiment of the second aspect of the present invention provides a GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition device, including: an acquisition module, used for acquiring electroencephalogram emotion data to be identified; a recognition module, used for inputting the electroencephalogram emotion data to be identified into a pre-trained GA-PDPL model and calculating the residual between the electroencephalogram emotion data and each electroencephalogram emotion class by using the electroencephalogram emotion data together with the comprehensive dictionary and the analysis dictionary output by the GA-PDPL model, so as to obtain a plurality of residual values, wherein the GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to the DPL model and introducing a coding coefficient matrix, is trained with electroencephalogram emotion samples from a plurality of subjects and the corresponding electroencephalogram emotion class labels, and has its parameters optimized with a genetic algorithm during training; and an output module, used for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified.
Optionally, in one embodiment of the present invention, the apparatus further includes: the construction module is used for adding dictionary pairs in the DPL model to construct the PDPL model, so as to obtain:
wherein D is the comprehensive dictionary, P is the analysis dictionary, K is the number of electroencephalogram emotion classes, F_k is the electroencephalogram emotion sample matrix of emotion class k, λ is a scalar constant, F̄_k is the complement of F_k in the whole training set F, and d_i is the i-th atom of D;
introducing a coding coefficient matrix A to relax the PDPL model to obtain:
wherein A is a coding coefficient matrix;
1) The comprehensive dictionary D and the analysis dictionary P are fixed, and the coding coefficient matrix A is updated:
where τ is a scalar constant;
a closed-form solution of the coding coefficient matrix A is obtained:
wherein I is an identity matrix;
2) Fixing the coding coefficient matrix A, updating the comprehensive dictionary D and the analysis dictionary P:
a closed-form solution of the analysis dictionary P is obtained:
wherein γ is a scalar constant;
introducing a variable S to optimize the comprehensive dictionary D:
the optimal solution of the comprehensive dictionary D is obtained through an ADMM algorithm:
performing multi-round optimization training on the PDPL model through 1) and 2), and updating a coding coefficient matrix A, a comprehensive dictionary D and an analysis dictionary P in the PDPL model to minimize the PDPL model;
optimizing parameters in the constructed GA-PDPL model by utilizing a genetic algorithm in a training process, wherein the method comprises the following steps of:
initializing: generating an initial solution population having randomly assigned parameter values;
evaluation: evaluating the fitness of each solution in the population by using the projective dictionary pair learning algorithm and a fitness function;
selecting: selecting a subset of solutions to be used as next-generation parents according to fitness;
crossover: creating new solutions by recombining the parameters of the selected parents;
mutation: introducing random variations into the parameters of certain solutions to explore new areas of the search space;
replacement: selecting the best solutions from the previous generation and the new generation to form the next generation;
terminating: terminating the algorithm when a stop condition is satisfied;
outputting: returning the optimal solution found by the genetic algorithm, which corresponds to the optimal empirical parameter values of the GA-PDPL model.
Optionally, in an embodiment of the present invention, the decision formula for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified is:
wherein f_t is the electroencephalogram emotion data to be identified, D_i is the comprehensive sub-dictionary of class i, and P_i is the analysis sub-dictionary of class i.
An embodiment of a third aspect of the present invention provides an electronic device, including: the device comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor executes the program to execute the cross-test electroencephalogram emotion recognition method based on the GA-PDPL algorithm as described in the embodiment.
An embodiment of a fourth aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program to be executed by a processor to perform the cross-test electroencephalogram emotion recognition method based on the GA-PDPL algorithm as described in the above embodiment.
According to the GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device provided by the invention, the comprehensive dictionary and the analysis dictionary are used to enhance the feature representation, and a genetic algorithm is used to optimize the parameters and select the best dictionary and parameters, so that the best recognition effect is achieved with high recognition speed and high accuracy.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flowchart of a cross-test electroencephalogram emotion recognition method based on a GA-PDPL algorithm, which is provided by an embodiment of the invention;
FIG. 2 is a schematic block diagram of a cross-test electroencephalogram emotion recognition device based on a GA-PDPL algorithm according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
Fig. 1 is a flowchart of a cross-test electroencephalogram emotion recognition method based on a GA-PDPL algorithm according to an embodiment of the present invention.
As shown in FIG. 1, the cross-test electroencephalogram emotion recognition method based on the GA-PDPL algorithm comprises the following steps:
in step S101, brain wave emotion data to be identified is acquired.
In step S102, the EEG emotion data to be identified are input into a pre-trained GA-PDPL model, and the residual between the EEG emotion data and each EEG emotion class is calculated by using the EEG emotion data together with the comprehensive dictionary and the analysis dictionary output by the GA-PDPL model, so as to obtain a plurality of residual values. The GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to the DPL model and introducing a coding coefficient matrix; it is trained with EEG emotion samples from a plurality of subjects and the corresponding EEG emotion class labels, and its parameters are optimized with a genetic algorithm during training.
In the embodiment of the invention, the GA-PDPL model is obtained by training it with EEG emotion samples from a plurality of subjects and the corresponding EEG emotion class labels. An EEG emotion database containing emotion feature samples of multiple subjects and the corresponding emotion class labels is acquired. One sample in the EEG database can be expressed as a b × c feature matrix, where b and c are the number of frequency bands and the number of electrodes of the EEG signal, respectively. With K EEG emotion categories in total, the label of a sample can be expressed as y ∈ {1, 2, 3, ..., K}. Let F = {F_1, ..., F_k, ..., F_K} and Y = {Y_1, ..., Y_k, ..., Y_K} denote the sets of training samples and training labels of the K classes, respectively, where F_k ∈ R^(p×n) is the training sample set of class k with p = b × c, Y_k is the corresponding training label set, and n is the number of samples per class.
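As a concrete illustration of this data organization, a minimal sketch follows (the helper names and the dictionary-of-arrays layout are assumptions for illustration, not taken from the patent):

```python
import numpy as np

# Minimal sketch (assumed layout): each sample is a b x c array of EEG features
# (b frequency bands, c electrodes); it is flattened into a p = b*c column and the
# n samples of class k are stacked column-wise to form F_k of shape (p, n).
def build_class_matrices(samples, labels, num_classes):
    F = {}
    for k in range(1, num_classes + 1):
        cols = [s.reshape(-1) for s, y in zip(samples, labels) if y == k]
        F[k] = np.stack(cols, axis=1)
    return F

# The complement matrix of each class (all samples not belonging to class k), used
# by the discrimination term of the model, can then be formed by concatenation.
def build_complements(F):
    return {k: np.concatenate([F[j] for j in F if j != k], axis=1) for k in F}
```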
Discriminative dictionary learning focuses on learning a capable data representation model from F to solve the classification task by using the class label information of the training data. This can be formulated within the framework set out below:
in the training model (1), λ+.0 is a scalar constant, and the coding coefficient matrix A on D of the integrated dictionary D, F is used. Data fidelity itemEnsure the representation capability of D, and l p -norm regularizer a p Imposed on a. Furthermore, a discriminant function ψ (D, a, Y) is used to ensure discriminant of D and a.
The discriminative model in equation (1) is intended to train a comprehensive dictionary D that can sparsely represent the signals F. Unfortunately, obtaining the code A over the dictionary requires a time-consuming l1-norm sparse coding process. To improve efficiency, an analysis dictionary P can be sought such that A = PF, so that an efficient representation of F is obtained without sparse coding. To this end, the analysis dictionary is learned together with the comprehensive dictionary D, resulting in the following model:
in the DPL model, the analysis dictionary P is used for the analytical coding of F, while the synthesis dictionary D is used for the reconstruction of F, and the discriminant function ψ (D, P, F, Y) is applied throughout the process. To improve the efficiency of the model, a structured comprehensive dictionary and an analysis dictionary d= [ D ] are learned 1 ,D 2 ,...,D k ]Sum p= [ P ] 1 ,P 2 ,...,P K ]. Each sub-dictionary pair of k classes is composed ofAnd-> In order to ensure that samples from class i (where i+.k) are projected to the zero space using the structured analysis dictionary P, P is designed k . This is achieved by using sparse subspace clustering, which suggests that under certain incoherent conditions, the signal may beRepresented by their respective dictionaries. The equation for this process is as follows:
the structured comprehensive dictionary D may also be used to reconstruct the data matrix F. Specifically, sub-dictionary D k Can effectively project the code matrix P k F k Reconstructing a data matrix F k . Thus, dictionary pairs are used to minimize reconstruction errors:
based on the foregoing discussion, adding dictionary pairs to the DPL model builds a PDPL model, and the resulting formula for the PDPL model can be expressed as follows:
the comprehensive dictionary D comprises a dictionary D i Atoms represented, wherein the energy of the ith atom is limited to avoid trivial solution P k =0, which stabilizes DPL. In addition, in the case of the optical fiber,representing the complement of Fk in the entire training set F. While sparse coding may not be essential for classification, the DPL model provides faster computation speed and demonstrates highly competitive classification performance. Thus, the following method is used for classification purposes. To optimize the non-convex objective function in (5), a coding coefficient matrix a is introduced and (5) is relaxed to the following problem:
The objective function in (6) consists of Frobenius-norm terms weighted by the scalar constant τ, which makes the solution simple. The analysis dictionary P and the synthesis dictionary D are initialized as random matrices with unit Frobenius norm, and A, D and P are then updated alternately while minimizing (6).
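As an illustration of the relaxed objective described above, a minimal sketch assuming the standard projective dictionary pair learning form (a per-class reconstruction term, a τ-weighted coding term and a λ-weighted discrimination term; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def relaxed_pdpl_objective(F, Fbar, D, P, A, tau, lam):
    # Sum over classes k of ||F_k - D_k A_k||_F^2 + tau*||P_k F_k - A_k||_F^2
    # + lam*||P_k Fbar_k||_F^2; the unit-energy constraint on the atoms of D is
    # handled separately in the D update.
    total = 0.0
    for k in F:
        rec = np.linalg.norm(F[k] - D[k] @ A[k], 'fro') ** 2
        code = tau * np.linalg.norm(P[k] @ F[k] - A[k], 'fro') ** 2
        disc = lam * np.linalg.norm(P[k] @ Fbar[k], 'fro') ** 2
        total += rec + code + disc
    return total
```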
The established GA-PDPL model is trained with EEG emotion samples from a plurality of subjects and the corresponding EEG emotion class labels; the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model are updated so that the PDPL model is minimized, and a plurality of empirical parameters of the GA-PDPL model are optimized with a genetic algorithm during the updating process.
The minimization process alternates between the following two steps:
1) The comprehensive dictionary D and the analysis dictionary P are fixed, and the coding coefficient matrix A is updated:
a closed form solution to this standard least squares problem is available:
2) Fixing the coding coefficient matrix A, updating the comprehensive dictionary D and the analysis dictionary P:
a closed-form solution of the analysis dictionary P is obtained:
where γ is a small positive scalar constant.
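A hedged sketch of this P-update, assuming the standard DPL closed form P_k = τ A_k F_k^T (τ F_k F_k^T + λ F̄_k F̄_k^T + γI)^(-1):

```python
import numpy as np

def update_P(F_k, Fbar_k, A_k, tau, lam, gamma):
    # Closed-form P_k with A_k fixed; gamma is the small regularizer noted above.
    p = F_k.shape[0]
    gram = tau * (F_k @ F_k.T) + lam * (Fbar_k @ Fbar_k.T) + gamma * np.eye(p)
    return tau * A_k @ F_k.T @ np.linalg.inv(gram)
```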
A variable S is introduced to optimize the comprehensive dictionary D:
the optimal solution of the comprehensive dictionary D is obtained through an ADMM algorithm:
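A hedged sketch of one ADMM iteration for the D-update with the auxiliary variable S (the penalty parameter rho and the scaled dual variable T_k are assumptions for illustration; columns of S are projected onto the unit l2 ball so that the atom-energy constraint is respected):

```python
import numpy as np

def admm_update_D(F_k, A_k, D_k, S_k, T_k, rho):
    m = A_k.shape[0]
    # D-step: minimize ||F_k - D_k A_k||_F^2 + rho*||D_k - S_k + T_k||_F^2 over D_k.
    lhs = A_k @ A_k.T + rho * np.eye(m)       # symmetric (m x m)
    rhs = F_k @ A_k.T + rho * (S_k - T_k)     # (p x m)
    D_k = np.linalg.solve(lhs, rhs.T).T
    # S-step: project each column of D_k + T_k onto the unit l2 ball (||s_i||_2 <= 1).
    S_k = D_k + T_k
    norms = np.maximum(np.linalg.norm(S_k, axis=0), 1.0)
    S_k = S_k / norms
    # Dual update.
    T_k = T_k + D_k - S_k
    return D_k, S_k, T_k
```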
3) Multi-round optimization training is carried out on the PDPL model through steps 1) and 2), and the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model are updated so as to minimize the PDPL model.
The proposed PDPL model has a fast training process because the closed-form solutions of the variables A and P converge quickly in each optimization step. The optimization of D is based on ADMM, which also converges quickly; the iteration stops when the energy difference between two adjacent iterations is less than 0.01. After convergence, the analysis dictionary P and the synthesis dictionary D are the outputs used for classification. The objective function in equation (9) improves the discriminative power of the analysis dictionary P while minimizing the reconstruction error, and this balance between the two targets gives the model both discrimination and representation capabilities. The algorithm flow is as follows:
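A hedged end-to-end sketch of this training flow, reusing the update_A, update_P, admm_update_D and relaxed_pdpl_objective helpers sketched above (the random unit-Frobenius-norm initialization and the 0.01 energy-difference threshold follow the text; the iteration counts and the value of rho are assumptions):

```python
import numpy as np

def train_pdpl(F, Fbar, m, tau, lam, gamma, rho=1.0, max_iter=30, tol=0.01):
    D, P, A = {}, {}, {}
    for k in F:
        p = F[k].shape[0]
        D[k] = np.random.randn(p, m); D[k] /= np.linalg.norm(D[k], 'fro')
        P[k] = np.random.randn(m, p); P[k] /= np.linalg.norm(P[k], 'fro')
    prev_energy = np.inf
    for _ in range(max_iter):
        for k in F:
            A[k] = update_A(F[k], D[k], P[k], tau)                   # step 1
            P[k] = update_P(F[k], Fbar[k], A[k], tau, lam, gamma)    # step 2: P
            S, T = D[k].copy(), np.zeros_like(D[k])
            for _ in range(10):                                      # step 2: D via ADMM
                D[k], S, T = admm_update_D(F[k], A[k], D[k], S, T, rho)
        energy = relaxed_pdpl_objective(F, Fbar, D, P, A, tau, lam)
        if abs(prev_energy - energy) < tol:  # 0.01 stopping threshold from the text,
            break                            # applied here to the outer loop
        prev_energy = energy
    return D, P
```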
during training, the optimization of the PDPL model parameters was performed using the Genetic Algorithm (GA) using the university of Sheffield genetic algorithm kit (gatbx).
Initializing: An initial solution population with randomly assigned parameter values is generated. The initialization parameters are shown in Table 1 and include the maximum number of generations, the population size, the selection function, the mutation probability and the t parameter of PDPL. The four PDPL parameters optimized by the GA are m, τ, λ and γ; their value ranges and coding modes are shown in Table 2.
Table 1: Initialization parameters of the GA
Maximum number of generations    50
Population size                  20
Selection function               sus
Mutation probability             0.7
t                                20
Table 2: Length, bounds and decoding mode of each substring in the chromosome (FieldD descriptor)
FieldD   m      τ      λ      γ
len      9      9      9      9
lb       1      0      0      0
ub       310    0.1    0.01   0.001
code     gray   gray   gray   gray
scale    arithmetic  arithmetic  arithmetic  arithmetic
lbin     0      0      1      1
ubin     1      1      1      1
Evaluation: the fitness of each solution in the population is evaluated using a projection dictionary to a learning algorithm and a fitness function. The fitness function is designed as the identification accuracy of PDPL on the test set:
Fitness=Accuracy test (13)
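A hedged sketch of how one GA individual might be scored under this fitness definition (the decoding into (m, τ, λ, γ), the train_pdpl helper from the training sketch above, and the classify helper from the recognition-stage sketch further below are all assumptions for illustration):

```python
def pdpl_fitness(params, F_train, Fbar_train, F_test, y_test):
    # params decodes to the four empirical parameters optimized by the GA.
    m, tau, lam, gamma = params
    D, P = train_pdpl(F_train, Fbar_train, int(round(m)), tau, lam, gamma)
    # Fitness = Accuracy_test: fraction of test samples classified correctly.
    correct = sum(1 for f, y in zip(F_test, y_test) if classify(f, D, P) == y)
    return correct / len(y_test)
```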
selecting: a subset of solutions to be used as next generation parents is selected according to their fitness.
Crossing: a new solution is created by cross-combining the parameters of selected parents.
Mutation: random variations are introduced on the parameters of certain solutions to explore new areas of the search space.
Evaluation: the applicability of the new solution created by crossover and mutation was evaluated.
Replacement: the best solution is selected from the previous generation and the new generation to compose the next generation.
And (3) terminating: the algorithm is terminated when a stopping criterion is fulfilled, for example reaching a maximum number of generations or reaching a desired level of fitness.
And (3) outputting: and returning the optimal solution found by the GA, and corresponding to the optimal parameter value of the projection dictionary to the learning algorithm.
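The patent performs this search with the MATLAB gatbx toolbox and Gray-coded chromosomes; as a simplified, hedged illustration only, the same select / crossover / mutate / replace cycle can be sketched over real-valued (m, τ, λ, γ) vectors within the Table 2 bounds:

```python
import numpy as np

def run_ga(fitness_fn, pop_size=20, generations=50, mut_prob=0.7,
           lb=(1, 0.0, 0.0, 0.0), ub=(310, 0.1, 0.01, 0.001), seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = np.array(lb, float), np.array(ub, float)
    pop = rng.uniform(lb, ub, size=(pop_size, 4))             # initialization
    for _ in range(generations):
        fitness = np.array([fitness_fn(ind) for ind in pop])   # evaluation
        parents = pop[np.argsort(-fitness)[:pop_size // 2]]    # selection (truncation)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random(4)
            child = w * a + (1 - w) * b                         # crossover (blend)
            if rng.random() < mut_prob:                         # mutation
                j = rng.integers(4)
                child[j] = rng.uniform(lb[j], ub[j])
            children.append(child)
        pop = np.vstack([parents, np.array(children)])          # replacement (elitist)
    fitness = np.array([fitness_fn(ind) for ind in pop])
    return pop[int(np.argmax(fitness))]                         # best (m, tau, lam, gamma)
```

In practice the chromosome decoding, the sus selection operator and the mutation scheme of gatbx differ from this simplified sketch.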
In step S103, the EEG emotion class corresponding to the smallest residual value among the plurality of residual values is taken as the emotion class of the EEG emotion data to be identified.
In the recognition stage, after the EEG emotion data f_t to be identified are input, the residual of the query sample f_t of unknown class is calculated for each class, and the class corresponding to the minimum residual is designated as the class of the test sample:
identity(f_t) = arg min_i ||f_t − D_i P_i f_t||_2    (14)
If class i attains the minimum residual in equation (14), the sample f_t is assigned to class i, where D_i and P_i denote the comprehensive sub-dictionary and the analysis sub-dictionary of class i, respectively.
To verify the effectiveness of the invention, the proposed GA-PDPL is compared with state-of-the-art methods in a subject-independent EEG emotion recognition setup on the SEED and MPED datasets, as shown in Tables 3 and 4, respectively. From the tables it can be concluded that the proposed method outperforms the current conventional methods under the subject-independent protocol. Compared with other methods, the proposed method uses a comprehensive dictionary and an analysis dictionary to enhance the feature representation, and further uses a genetic algorithm to optimize the parameters and select the best dictionary and parameters, thereby achieving the best recognition effect.
Table 3: Average accuracy (ACC) and standard deviation (STD) of subject-independent experiments on the SEED dataset
Table 4: Average accuracy (ACC) and standard deviation (STD) of subject-independent experiments on the MPED dataset
According to the GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method provided by the embodiment of the invention, the comprehensive dictionary and the analysis dictionary are used to enhance the feature representation, and a genetic algorithm is used to optimize the parameters and select the best dictionary and parameters, so that the best recognition effect is achieved with high recognition speed and high accuracy.
Next, a cross-test electroencephalogram emotion recognition device based on a GA-PDPL algorithm according to an embodiment of the present invention is described with reference to the accompanying drawings.
Fig. 2 is a schematic block diagram of a cross-test electroencephalogram emotion recognition device based on a GA-PDPL algorithm according to an embodiment of the present invention.
As shown in fig. 2, the GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition apparatus 10 includes: an acquisition module 100, an identification module 200 and an output module 300.
The acquisition module 100 is used to acquire the electroencephalogram emotion data to be identified. The recognition module 200 is used to input the electroencephalogram emotion data to be identified into a pre-trained GA-PDPL model and to calculate the residual between the electroencephalogram emotion data and each electroencephalogram emotion class by using the electroencephalogram emotion data together with the comprehensive dictionary and the analysis dictionary output by the GA-PDPL model, so as to obtain a plurality of residual values; the GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to the DPL model and introducing a coding coefficient matrix, is trained with electroencephalogram emotion samples from a plurality of subjects and the corresponding electroencephalogram emotion class labels, and has its parameters optimized with a genetic algorithm during training. The output module 300 is used to take the electroencephalogram emotion class corresponding to the smallest residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified.
Optionally, in one embodiment of the present invention, the apparatus further includes: the construction module is used for adding dictionary pairs in the DPL model to construct the PDPL model, so as to obtain:
wherein D is the comprehensive dictionary, P is the analysis dictionary, K is the number of electroencephalogram emotion classes, F_k is the electroencephalogram emotion sample matrix of emotion class k, λ is a scalar constant, F̄_k is the complement of F_k in the whole training set F, and d_i is the i-th atom of D;
introducing a coding coefficient matrix A to relax the PDPL model to obtain:
wherein A is a coding coefficient matrix;
1) The comprehensive dictionary D and the analysis dictionary P are fixed, and the coding coefficient matrix A is updated:
where τ is a scalar constant;
a closed-form solution of the coding coefficient matrix A is obtained:
wherein I is an identity matrix;
2) Fixing the coding coefficient matrix A, updating the comprehensive dictionary D and the analysis dictionary P:
a closed-form solution of the analysis dictionary P is obtained:
wherein γ is a scalar constant;
introducing a variable S to optimize the comprehensive dictionary D:
the optimal solution of the comprehensive dictionary D is obtained through an ADMM algorithm:
performing multi-round optimization training on the PDPL model through the steps of 1) and 2), and updating a coding coefficient matrix A, a comprehensive dictionary D and an analysis dictionary P in the PDPL model to minimize the PDPL model;
optimizing parameters in the constructed GA-PDPL model by utilizing a genetic algorithm in the training process comprises the following steps:
initializing: generating an initial solution population having randomly assigned parameter values;
evaluation: evaluating the fitness of each solution in the population by using the projective dictionary pair learning algorithm and a fitness function;
selecting: selecting a subset of solutions to be used as next-generation parents according to fitness;
crossover: creating new solutions by recombining the parameters of the selected parents;
mutation: introducing random variations into the parameters of certain solutions to explore new areas of the search space;
replacement: selecting the best solutions from the previous generation and the new generation to form the next generation;
terminating: terminating the algorithm when a stop condition is satisfied;
outputting: returning the optimal solution found by the genetic algorithm, which corresponds to the optimal empirical parameter values of the GA-PDPL model.
Optionally, in an embodiment of the present invention, the decision formula for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified is:
wherein f_t is the electroencephalogram emotion data to be identified, D_i is the comprehensive sub-dictionary of class i, and P_i is the analysis sub-dictionary of class i.
It should be noted that, the foregoing explanation of the embodiment of the cross-test electroencephalogram emotion recognition method based on the GA-PDPL algorithm is also applicable to the cross-test electroencephalogram emotion recognition device based on the GA-PDPL algorithm of this embodiment, and will not be described herein.
According to the GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition device provided by the embodiment of the invention, the comprehensive dictionary and the analysis dictionary are used to enhance the feature representation, and a genetic algorithm is used to optimize the parameters and select the best dictionary and parameters, so that the best recognition effect is achieved with high recognition speed and high accuracy.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device may include:
memory 301, processor 302, and a computer program stored on memory 301 and executable on processor 302.
The processor 302 implements the cross-test electroencephalogram emotion recognition method based on the GA-PDPL algorithm provided in the above embodiment when executing a program.
Further, the electronic device further includes:
a communication interface 303 for communication between the memory 301 and the processor 302.
A memory 301 for storing a computer program executable on the processor 302.
The memory 301 may comprise a high-speed RAM memory or may further comprise a non-volatile memory (non-volatile memory), such as at least one disk memory.
If the memory 301, the processor 302 and the communication interface 303 are implemented independently, the communication interface 303, the memory 301 and the processor 302 may be connected to each other through a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 3, but this does not mean that there is only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 301, the processor 302, and the communication interface 303 are integrated on a chip, the memory 301, the processor 302, and the communication interface 303 may perform communication with each other through internal interfaces.
The processor 302 may be a central processing unit (Central Processing Unit, abbreviated as CPU) or an application specific integrated circuit (Application Specific Integrated Circuit, abbreviated as ASIC) or one or more integrated circuits configured to implement embodiments of the present invention.
The embodiment also provides a computer readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the above method for identifying brain emotion across tests based on the GA-PDPL algorithm.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "N" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more N executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present invention.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. As with the other embodiments, if implemented in hardware, may be implemented using any one or combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or part of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, and the program may be stored in a computer readable storage medium, where the program when executed includes one or a combination of the steps of the method embodiments.

Claims (10)

1. A GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method, characterized by comprising the following steps:
acquiring electroencephalogram emotion data to be identified;
inputting the electroencephalogram emotion data to be identified into a pre-trained GA-PDPL model, and calculating a residual between the electroencephalogram emotion data and each electroencephalogram emotion class by using the electroencephalogram emotion data together with a comprehensive dictionary and an analysis dictionary output by the GA-PDPL model, so as to obtain a plurality of residual values; wherein the GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to a DPL model and introducing a coding coefficient matrix, the GA-PDPL model is trained with electroencephalogram emotion samples from a plurality of subjects and corresponding electroencephalogram emotion class labels, and parameters of the constructed GA-PDPL model are optimized with a genetic algorithm during training; and
taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified.
2. The method of claim 1, further comprising, before the electroencephalogram emotion data to be identified are input into the pre-trained GA-PDPL model:
adding dictionary pairs in the DPL model to build the PDPL model to obtain:
wherein D is the comprehensive dictionary, P is the analysis dictionary, K is the number of electroencephalogram emotion classes, F_k is the electroencephalogram emotion sample matrix of emotion class k, ||·||_F denotes the Frobenius norm, λ is a scalar constant, F̄_k is the complement of F_k in the whole training set F, and d_i is the i-th atom of D;
introducing a coding coefficient matrix A to relax the PDPL model to obtain:
wherein A is a coding coefficient matrix;
training the GA-PDPL model with electroencephalogram emotion samples from a plurality of subjects and the corresponding electroencephalogram emotion class labels, updating the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model so that the PDPL model is minimized, and optimizing a plurality of empirical parameters of the GA-PDPL model with a genetic algorithm during the updating process.
3. The method of claim 2, wherein training the GA-PDPL model with electroencephalogram emotion samples from multiple subjects and the corresponding electroencephalogram emotion class labels, and updating the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model so that the PDPL model is minimized, comprises:
1) The comprehensive dictionary D and the analysis dictionary P are fixed, and the coding coefficient matrix A is updated:
where τ is a scalar constant;
a closed-form solution of the coding coefficient matrix A is obtained:
wherein I is an identity matrix;
2) Fixing the coding coefficient matrix A, updating the comprehensive dictionary D and the analysis dictionary P:
a closed-form solution of the analysis dictionary P is obtained:
wherein γ is a scalar constant;
introducing a variable S to optimize the comprehensive dictionary D:
the optimal solution of the comprehensive dictionary D is obtained through an ADMM algorithm:
and carrying out multi-round optimization training on the PDPL model through steps 1) and 2), updating the coding coefficient matrix A, the comprehensive dictionary D and the analysis dictionary P in the PDPL model so as to minimize the PDPL model.
4. A method according to claim 3, wherein optimizing parameters in the constructed GA-PDPL model using genetic algorithms during training comprises:
initializing: generating an initial solution population having randomly assigned parameter values;
evaluation: evaluating the fitness of each solution in the population by using the projective dictionary pair learning algorithm and a fitness function;
selecting: selecting a subset of solutions to be used as next-generation parents according to fitness;
crossover: creating new solutions by recombining the parameters of the selected parents;
mutation: introducing random variations into the parameters of certain solutions to explore new areas of the search space;
replacement: selecting the best solutions from the previous generation and the new generation to form the next generation;
terminating: terminating the algorithm when a stop condition is satisfied;
outputting: returning the optimal solution found by the genetic algorithm, which corresponds to the optimal empirical parameter values of the GA-PDPL model.
5. The method according to any one of claims 1 to 4, wherein the decision formula for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified is:
wherein f_t is the electroencephalogram emotion data to be identified, D_i is the comprehensive sub-dictionary of class i, and P_i is the analysis sub-dictionary of class i.
6. A GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition device, characterized by comprising:
an acquisition module, used for acquiring electroencephalogram emotion data to be identified;
a recognition module, used for inputting the electroencephalogram emotion data to be identified into a pre-trained GA-PDPL model and calculating a residual between the electroencephalogram emotion data and each electroencephalogram emotion class by using the electroencephalogram emotion data together with a comprehensive dictionary and an analysis dictionary output by the GA-PDPL model, so as to obtain a plurality of residual values; wherein the GA-PDPL model is constructed by adding a dictionary pair consisting of a comprehensive dictionary and an analysis dictionary to a DPL model and introducing a coding coefficient matrix, the GA-PDPL model is trained with electroencephalogram emotion samples from a plurality of subjects and corresponding electroencephalogram emotion class labels, and parameters of the constructed GA-PDPL model are optimized with a genetic algorithm during training; and
an output module, used for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified.
7. The apparatus of claim 6, wherein the apparatus further comprises: the construction module is used for adding dictionary pairs in the DPL model to construct the PDPL model, so as to obtain:
wherein D is the comprehensive dictionary, P is the analysis dictionary, K is the number of electroencephalogram emotion classes, F_k is the electroencephalogram emotion sample matrix of emotion class k, ||·||_F denotes the Frobenius norm, λ is a scalar constant, F̄_k is the complement of F_k in the whole training set F, and d_i is the i-th atom of D;
introducing a coding coefficient matrix A to relax the PDPL model to obtain:
wherein A is a coding coefficient matrix;
1) The comprehensive dictionary D and the analysis dictionary P are fixed, and the coding coefficient matrix A is updated:
where τ is a scalar constant;
a closed-form solution of the coding coefficient matrix A is obtained:
wherein I is an identity matrix;
2) Fixing the coding coefficient matrix A, updating the comprehensive dictionary D and the analysis dictionary P:
a closed-form solution of the analysis dictionary P is obtained:
wherein γ is a scalar constant;
introducing a variable S to optimize the comprehensive dictionary D:
the optimal solution of the comprehensive dictionary D is obtained through an ADMM algorithm:
performing multi-round optimization training on the PDPL model through 1) and 2), and updating a coding coefficient matrix A, a comprehensive dictionary D and an analysis dictionary P in the PDPL model to minimize the PDPL model;
optimizing parameters in the constructed GA-PDPL model by utilizing a genetic algorithm in a training process, wherein the method comprises the following steps of:
initializing: generating an initial solution population having randomly assigned parameter values;
evaluation: evaluating the fitness of each solution in the population by using the projective dictionary pair learning algorithm and a fitness function;
selecting: selecting a subset of solutions to be used as next-generation parents according to fitness;
crossover: creating new solutions by recombining the parameters of the selected parents;
mutation: introducing random variations into the parameters of certain solutions to explore new areas of the search space;
replacement: selecting the best solutions from the previous generation and the new generation to form the next generation;
terminating: terminating the algorithm when a stop condition is satisfied;
outputting: returning the optimal solution found by the genetic algorithm, which corresponds to the optimal empirical parameter values of the GA-PDPL model.
8. The apparatus according to claim 6 or 7, wherein the decision formula for taking the electroencephalogram emotion class corresponding to the minimum residual value among the plurality of residual values as the emotion class of the electroencephalogram emotion data to be identified is:
wherein f_t is the electroencephalogram emotion data to be identified, D_i is the comprehensive sub-dictionary of class i, and P_i is the analysis sub-dictionary of class i.
9. An electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement a GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method as claimed in any one of claims 1 to 5.
10. A computer-readable storage medium having stored thereon a computer program, wherein the program is executed by a processor for implementing a GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method according to any one of claims 1 to 5.
CN202310420313.1A 2023-04-19 2023-04-19 GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device Active CN116702018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310420313.1A CN116702018B (en) 2023-04-19 2023-04-19 GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310420313.1A CN116702018B (en) 2023-04-19 2023-04-19 GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device

Publications (2)

Publication Number Publication Date
CN116702018A true CN116702018A (en) 2023-09-05
CN116702018B CN116702018B (en) 2024-03-01

Family

ID=87834636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310420313.1A Active CN116702018B (en) 2023-04-19 2023-04-19 GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device

Country Status (1)

Country Link
CN (1) CN116702018B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103594084A (en) * 2013-10-23 2014-02-19 江苏大学 Voice emotion recognition method and system based on joint penalty sparse representation dictionary learning
CN103793694A (en) * 2014-02-10 2014-05-14 天津大学 Human face recognition method based on multiple-feature space sparse classifiers
CN103793695A (en) * 2014-02-10 2014-05-14 天津大学 Joint training method of sub-dictionaries in multiple characteristic spaces and for face recognition
US20160232340A1 (en) * 2015-02-11 2016-08-11 Samsung Electronics Co., Ltd. Electrocardiogram (ecg)-based authentication apparatus and method thereof, and training apparatus and method thereof for ecg-based authentication
CN106096506A (en) * 2016-05-28 2016-11-09 重庆大学 Based on the SAR target identification method differentiating doubledictionary between subclass class
CN106725452A (en) * 2016-11-29 2017-05-31 太原理工大学 Based on the EEG signal identification method that emotion induces
CN107515978A (en) * 2017-08-17 2017-12-26 广东工业大学 The method of response surface model is built based on genetic algorithm and applies its system
CN110705343A (en) * 2019-08-20 2020-01-17 西南科技大学 Face recognition method and system for structure-incoherent projection dictionary pair learning
CN114201605A (en) * 2021-11-23 2022-03-18 上海大学 Image emotion analysis method based on joint attribute modeling
CN114330535A (en) * 2021-12-24 2022-04-12 南京工业大学 Pattern classification method for learning based on support vector regularization dictionary
CN115204209A (en) * 2022-05-20 2022-10-18 大连大学 Electroencephalogram dictionary learning method based on optimized label consistency
CN115062711A (en) * 2022-06-23 2022-09-16 常州工业职业技术学院 Electroencephalogram emotion recognition method based on multi-source domain adaptive dictionary learning and sparse representation
CN115496950A (en) * 2022-09-30 2022-12-20 常州工业职业技术学院 Neighborhood information embedded semi-supervised discrimination dictionary pair learning image classification method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ABHIJIT DAS et al.: "Fast and Efficient Multimodal Eye Biometrics using Projective Dictionary Pair Learning", 2016 IEEE Congress on Evolutionary Computation (CEC), pages 1-7 *
SHI Yinghuan: "Research on Machine Learning Methods and Their Applications in Medical Image Processing", China Doctoral Dissertations Full-text Database, Information Science and Technology Series, pages 33-51 *

Also Published As

Publication number Publication date
CN116702018B (en) 2024-03-01

Similar Documents

Publication Publication Date Title
CN111126564B (en) Neural network structure searching method, device and equipment
Van Der Maaten Accelerating t-SNE using tree-based algorithms
Van Der Maaten Barnes-hut-sne
CN109120462A (en) Prediction technique, device and the readable storage medium storing program for executing of opportunistic network link
CN113535964B (en) Enterprise classification model intelligent construction method, device, equipment and medium
EP4425376A1 (en) Method and apparatus for searching for neural network ensemble model, and electronic device
CN112420125A (en) Molecular attribute prediction method and device, intelligent equipment and terminal
Yousefnezhad et al. A new selection strategy for selective cluster ensemble based on diversity and independency
CN114330650A (en) Small sample characteristic analysis method and device based on evolutionary element learning model training
CN116467141A (en) Log recognition model training, log clustering method, related system and equipment
Teisseyre Feature ranking for multi-label classification using Markov networks
Yuan et al. A novel fault diagnosis method for second-order bandpass filter circuit based on TQWT-CNN
CN116702018B (en) GA-PDPL algorithm-based cross-test electroencephalogram emotion recognition method and device
CN117273060A (en) Data optimization method based on influence function
Liu et al. A weight-incorporated similarity-based clustering ensemble method
CN115907775A (en) Personal credit assessment rating method based on deep learning and application thereof
US11609936B2 (en) Graph data processing method, device, and computer program product
Lassance et al. Graph topology inference benchmarks for machine learning
US20230214643A1 (en) Computer implemented pre-processing method and system for facilitating machine learning signal classes separability, and, non-transitory computer readable storage medium
JP6172315B2 (en) Method and apparatus for mixed model selection
Zhao et al. Spatial temporal graph convolution with graph structure self-learning for early MCI detection
Bellot Pujalte Study of gene regulatory networks inference methods from gene expression data
CN113792132A (en) Target answer determination method, device, equipment and medium
CN113934813A (en) Method, system and equipment for dividing sample data and readable storage medium
Guo et al. Sparse directed acyclic graphs incorporating the covariates

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant