CN114209319A - fNIRS emotion recognition method and system based on graph network and adaptive denoising - Google Patents
fNIRS emotion recognition method and system based on graph network and adaptive denoising
- Publication number
- CN114209319A (application CN202111315105.2A)
- Authority
- CN
- China
- Prior art keywords
- fnirs
- graph
- emotion
- emotion recognition
- variation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
Abstract
The invention discloses an fNIRS emotion recognition method and system based on a graph network and adaptive denoising. An fNIRS acquisition device continuously acquires the change in light intensity between emission and reception, converts it into a change in absorbance, and from this derives the relative change in the concentrations of oxyhemoglobin and deoxyhemoglobin. An adaptive denoising network model then produces a clean signal: its input is the relative concentration change obtained in the previous step, and its output is the clean relative concentration-change data of oxyhemoglobin and deoxyhemoglobin. Graph nodes are mapped by combining probe and channel characteristics, the brain topology is restored with a graph network, and a dynamic graph attention emotion recognition network model classifies the signal and outputs emotion labels. The invention addresses problems such as the complicated wearing and difficult operation of current brain-computer interfaces in practical applications.
Description
Technical Field
The invention relates to the field of human physiological signal recognition, and in particular to an fNIRS emotion recognition method and system based on a graph network and adaptive denoising.
Background
Emotions affect a person's cognitive and behavioral activities and are an important factor in mental health. Emotion recognition, a current research hotspot, can draw on both non-physiological and physiological signals. Traditional physiological detection methods such as electroencephalography, magnetoencephalography, and functional magnetic resonance imaging have made some progress in emotion recognition, but their limitations have gradually emerged: low temporal or spatial resolution, expensive acquisition equipment, susceptibility to interference, and poor portability.
In recent years, with the development of near-infrared technology and upgrades to acquisition equipment, functional near-infrared spectroscopy (fNIRS) has emerged as a new non-invasive brain detection method. It offers high compliance, strong anti-interference capability, portability, easy implementation, and low cost, and is suitable for virtually all subject groups and experimental scenarios. With the continued development of 5G, the Internet of Things, human-computer interaction, and machine learning, fNIRS-based emotion analysis holds significant value and broad application prospects in medical care, media and entertainment, information retrieval, education, and smart wearable devices. An emotion recognition method and system based on functional near-infrared spectroscopy therefore meets a wide demand and has broad prospects.
Disclosure of Invention
To overcome the defects and shortcomings of the prior art, the invention provides an fNIRS emotion recognition method and system based on a graph network and adaptive denoising.
The invention adopts the following technical scheme:
As shown in fig. 1, an fNIRS emotion recognition method based on a graph network and adaptive denoising includes the following steps:
S1: The fNIRS acquisition device continuously acquires the change in light intensity between emission and reception and converts it into a change in absorbance. Using the Beer-Lambert law, an equation is formed relating the absorbance change to the change in concentration of the light-absorbing chromophores in brain tissue (chiefly oxyhemoglobin and deoxyhemoglobin); solving this equation yields the relative changes in oxyhemoglobin and deoxyhemoglobin concentration.
The process is as follows:
S1.1 Obtain from the acquisition device the raw near-infrared light-intensity changes of two continuous waves at different wavelengths λ1 and λ2, recorded as ΔI_λ1(t) and ΔI_λ2(t);
S1.2 Convert the light-intensity changes into absorbance changes ΔA_λ1(t) and ΔA_λ2(t);
S1.3 According to the Beer-Lambert law, write the equation relating the absorbance change to the relative change in the concentration of the light-absorbing chromophores in brain tissue:
ΔA_λ(t) = (ε_HbO(λ)·ΔC_HbO(t) + ε_HbR(λ)·ΔC_HbR(t)) × d × DPF
where ε is the molar extinction coefficient, d is the detection depth, and DPF is the differential path factor;
S1.4 Solving this pair of equations (one per wavelength) yields the relative changes in oxyhemoglobin and deoxyhemoglobin concentration, recorded as ΔC_HbO(t) and ΔC_HbR(t).
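The two-wavelength solve in S1.3-S1.4 can be sketched as a small linear system. The extinction coefficients, detection depth, and DPF values below are illustrative placeholders, not tabulated physiological constants:

```python
import numpy as np

# Sketch of S1.3-S1.4: invert the modified Beer-Lambert relation at two
# wavelengths to recover the relative concentration changes of oxy- and
# deoxyhemoglobin. All numeric values are illustrative, not real constants.

def concentration_changes(dA, eps, d=3.0, dpf=6.0):
    """dA : absorbance changes at the two wavelengths, shape (2,)
    eps: 2x2 molar extinction coefficients, rows = wavelengths, cols = (HbO, HbR)
    Returns (dC_HbO, dC_HbR)."""
    # dA = (eps @ dC) * d * DPF   =>   dC = eps^{-1} @ dA / (d * DPF)
    dC = np.linalg.solve(eps, np.asarray(dA, dtype=float)) / (d * dpf)
    return float(dC[0]), float(dC[1])

eps = np.array([[1.5, 3.8],   # hypothetical coefficients at wavelength 1
                [2.5, 1.8]])  # hypothetical coefficients at wavelength 2
dC_HbO, dC_HbR = concentration_changes([0.02, 0.03], eps)
```

Because the two chromophores absorb differently at the two wavelengths, the 2×2 coefficient matrix is invertible and the concentration changes are uniquely determined.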
S2: Denoise through an adaptive denoising network model to obtain a clean signal. The input of the adaptive denoising network model is the relative concentration change of oxyhemoglobin and deoxyhemoglobin obtained in the previous step, and the output is the clean relative concentration-change data of oxyhemoglobin and deoxyhemoglobin.
The process is as follows:
As shown in FIG. 2, the adaptive denoising network model comprises several pairs of convolution and deconvolution blocks of different sizes forming a deep convolutional generator, denoted G_p. Its input is the noisy data ΔC_HbO(t) and ΔC_HbR(t), and the generator output is the generated clean relative concentration-change data of oxyhemoglobin and deoxyhemoglobin, recorded as P̂_HbO(t) and P̂_HbR(t).
The difference between the output and the input of the denoising network model is the generated pure noise signal, recorded as N̂(t).
The generated pure noise signal N̂(t) is added to clean data P_HbO(t) and P_HbR(t) to obtain synthesized noisy data, recorded as ΔĈ_HbO(t) and ΔĈ_HbR(t).
Denoting ΔC_HbO(t), ΔC_HbR(t) collectively as ΔC and P_HbO(t), P_HbR(t) as P, the total loss of the model combines the adversarial losses of the two discriminators with a cycle-consistency term.
After iterative training, denoising amounts to:
P_HbO(t), P_HbR(t) = G_p(ΔC_HbO(t), ΔC_HbR(t))   (8)
which yields the clean relative concentration-change data P_HbO(t) and P_HbR(t).
S3: Map graph nodes by combining probe and channel characteristics, restore the brain topology with a graph network, and classify and output emotion labels through a dynamic graph attention emotion recognition network model. The dynamic graph attention emotion recognition network model comprises graph convolution and an attention mechanism; the probes are the fNIRS optodes, comprising emitters and detectors.
As shown in fig. 3, the specific process is as follows:
S3.1 First define the graph, denoted G(V, E, W). V is the set of graph nodes; |V| = n means there are n nodes in total, corresponding to the data sequences ΔC_HbO(t), ΔC_HbR(t) of the n fNIRS channels, denoted X. E is the set of edges in the graph, i.e. the set of relations between the different fNIRS channels. W is the adjacency matrix defining the connection relations of the nodes, i.e. the relevance of the different channels in the fNIRS; the values in the adjacency matrix describe the importance of the relation between nodes. The value W_ij is initialized with a Gaussian kernel:
W_ij = exp(−dist(i, j)² / θ²) if dist(i, j) ≤ τ, and W_ij = 0 otherwise,
where dist is the Gaussian distance between nodes, and θ and τ are fixed parameters of the Gaussian distance algorithm.
S3.2 To introduce graph attention, compute the similarity coefficient e_ij between each node and its neighbouring nodes:
e_ij = a · [W_i x_i ∥ W_j x_j]
where a is a global mapping matrix, W_i and W_j are the weight matrices of node i and node j respectively, and ∥ denotes concatenation;
S3.3 Compute the attention coefficients between graph nodes and normalize them, recorded as α_ij:
α_ij = exp(LeakyReLU(e_ij)) / Σ_{k∈N_i} exp(LeakyReLU(e_ik))
where LeakyReLU(·) is a nonlinear activation function and N_i is the neighbourhood of node i;
S3.4 In the graph convolution, use multi-head attention to perform weighted summation and parameter integration, obtaining the new feature X'_i:
X'_i = ∥_{k=1}^{K} σ( Σ_{j∈N_i} α_ij^k W^k x_j )
where σ is a nonlinear mapping, K is the number of attention heads, and ∥ denotes concatenation;
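Steps S3.2-S3.4 can be sketched as a minimal single-head attention pass over a small fully connected toy graph. The shapes, the random parameters, and the use of a shared weight matrix are illustrative assumptions, not the patent's exact parameterization:

```python
import numpy as np

# Minimal single-head sketch of S3.2-S3.4: similarity scores, LeakyReLU +
# softmax normalization over each node's neighbours (here: all nodes),
# then an attention-weighted feature update.

rng = np.random.default_rng(0)
n, f_in, f_out = 4, 6, 8                 # nodes, input / output feature dims
X = rng.normal(size=(n, f_in))           # node features (channel data)
Wh = rng.normal(size=(f_in, f_out))      # shared projection weight matrix
a = rng.normal(size=(2 * f_out,))        # global mapping vector

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

H = X @ Wh                               # projected node features
# similarity coefficients e_ij = a . [W x_i || W x_j]
e = np.array([[a @ np.concatenate([H[i], H[j]]) for j in range(n)]
              for i in range(n)])
# attention coefficients: softmax of LeakyReLU(e) over each node's row
s = np.exp(leaky_relu(e))
alpha = s / s.sum(axis=1, keepdims=True)
X_new = alpha @ H                        # weighted aggregation of neighbours
```

In the multi-head case the same computation runs K times with independent parameters and the K outputs are concatenated, as in the formula above.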
S3.5 Perform pooling dimensionality reduction; after flattening and fully connected layers, output the emotion categories through a classifier. The model framework is shown in figure 3;
S3.6 The model adopts cross entropy plus a regularization term as the loss function:
Loss = CrossEntropy(l, l̂) + λ‖Θ‖²
where CrossEntropy() denotes the cross-entropy calculation, l and l̂ are the true label and the predicted value respectively, λ is the regularization coefficient, and Θ denotes all parameters of the model;
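The S3.6 loss can be sketched directly; the regularization weight and the toy parameter tensors are illustrative assumptions:

```python
import numpy as np

# Sketch of the S3.6 loss: cross entropy between the predicted class
# probabilities and the true (one-hot) label, plus an L2 regularization
# term over all model parameters. lam is an illustrative weight.

def loss(probs, label, params, lam=1e-3):
    ce = -np.log(probs[label])                        # cross entropy
    reg = lam * sum(np.sum(p**2) for p in params)     # L2 penalty on parameters
    return ce + reg

probs = np.array([0.05, 0.7, 0.05, 0.1, 0.05, 0.05])  # six emotion classes
params = [np.ones((2, 2)), np.ones(3)]                # toy parameter tensors
L = loss(probs, label=1, params=params)
```

The regularization term discourages large weights and complements the dynamic adjacency update in S3.7, which also acts on learnable parameters.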
S3.7 Adopt the back-propagation algorithm to realize the dynamic change of the adjacency matrix, computing the partial derivative of the loss function with respect to the adjacency matrix, ∂Loss/∂W, for the network iteration update.
The iterative update formula is:
W ← W − η · ∂Loss/∂W
where η is the learning rate.
an fNIRS emotion recognition system based on graph network and adaptive denoising, comprising:
the fNIRS acquisition module: continuously acquiring the variation of light intensity before and after transmitting and receiving by using an fNIRS acquisition device, converting the variation of the light intensity into the variation of absorbance, and further obtaining the relative variation of the concentration of oxyhemoglobin and deoxyhemoglobin;
the fNIRS self-adaptive denoising network module: for obtaining a clean signal;
the fNIRS dynamics graph attention emotion recognition network module: and outputting the emotion label according to the pure signal.
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the fNIRS emotion recognition method when executing the computer program.
A storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the fNIRS emotion recognition method.
The invention has the beneficial effects that:
(1) In the data denoising process, an adaptive denoising model based on a generative adversarial network is adopted. Compared with traditional machine-learning-based denoising methods, it largely avoids manual intervention and experience-driven analysis during denoising, overcomes the strong task dependency of traditional methods, and remains highly adaptive across tasks. Moreover, through adversarial learning it can handle the noise differences of fNIRS between static and dynamic tasks without specific assumptions, giving the denoising model strong generalization ability.
(2) For emotion representation extraction from the fNIRS signal, a deep learning network model is used. Compared with traditional manual feature extraction, emotional features in the fNIRS are extracted in a data-driven manner through network learning, avoiding the limited dimensionality and uncertain effectiveness of fixed hand-crafted features. Learning features with a deep network effectively captures emotion representations of different dimensions and strengthens the extraction and utilization of emotional features in the fNIRS data.
(3) A dynamic graph convolutional neural network is used to efficiently model fNIRS data together with probe position information and channel signals. Traditional methods treat the data simply as time series and analyze them with machine-learning classifiers such as support vector machines and Bayesian classifiers, or with recurrent neural networks based on long short-term memory. In contrast, the invention topologically maps the fNIRS data onto a graph: different probes map to graph nodes, the time-series data become the node features, and the different channel relations are represented as graph edges through the adjacency matrix. This makes full use of the characteristics of the data, can restore the topology of the brain structure, characterizes the data relevance of different channels, and improves the accuracy of the network model in emotion recognition from fNIRS brain signals.
(4) A graph attention mechanism is introduced: attention coefficients between nodes are obtained by computing the similarity coefficients between each probe node and its neighbouring nodes, and node features are updated by the weighted summation of the multi-head attention mechanism during graph convolution. This brings the feature relations of neighbouring nodes into the model's training, so the model better extracts the associated features of different channel data in the fNIRS, can capture the activation responses of different brain regions to different emotions, and markedly benefits emotion recognition from brain signals.
Drawings
FIG. 1 is a flow chart of the operation of the present invention;
FIG. 2 is a diagram of a fNIRS adaptive denoising network model architecture according to the present invention;
FIG. 3 is a diagram of a fNIRS dynamics graph attention emotion recognition network model architecture of the present invention;
FIG. 4 is a schematic diagram of the fNIRS acquisition module of the invention.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but the present invention is not limited to these examples.
Example 1
An fNIRS emotion recognition method based on a graph network and adaptive denoising is suitable for emotion recognition tasks using fNIRS acquisition equipment and mainly comprises an external emotion stimulation step, an fNIRS acquisition step, an fNIRS adaptive denoising step, and an fNIRS emotion recognition step.
In the external emotion stimulation step, the stimulation material consists of videos with six emotion labels: anger, disgust, fear, pleasure, sadness, and surprise; the user's emotions are induced by watching the videos.
As shown in fig. 4, in the fNIRS acquisition step the acquisition device consists of multi-channel, dual-wavelength near-infrared continuous-wave emitting and receiving sources.
First, the user wears the fNIRS acquisition device, which acquires and records in real time the light-intensity changes of the different optodes between emission and reception.
According to the Beer-Lambert law: ΔA = ε × ΔC × d × DPF, where ε is the molar extinction coefficient, ΔC is the concentration change, d is the detection depth, and DPF is the differential path factor.
Solving the equations relating the absorbance changes to the relative changes in the concentration of the light-absorbing chromophores in brain tissue yields the relative changes in oxyhemoglobin and deoxyhemoglobin concentration, recorded as ΔC_HbO and ΔC_HbR.
fNIRS adaptive denoising step
The data are input into the fNIRS adaptive denoising module for denoising and enhancement; the trained generator network produces the clean relative concentration-change data of oxyhemoglobin and deoxyhemoglobin, P_HbO and P_HbR: P_HbO, P_HbR = G_p(ΔC_HbO, ΔC_HbR).
The fNIRS adaptive denoising module is based on a generative adversarial network: convolution extracts features from the fNIRS signal and deconvolution generates the clean signal. During network training, pairs of noisy and clean signals are input and training proceeds through the adversarial losses of the two discriminators, while a cycle-consistency loss is introduced to constrain the spatial mapping and improve training efficiency.
fNIRS emotion recognition step
Map the clean data of all channels to graph nodes: P_HbO, P_HbR → V, where V is the set of graph nodes. Initialize the adjacency matrix W with the Gaussian kernel function W_ij = exp(−dist(i, j)² / θ²) for dist(i, j) ≤ τ (and 0 otherwise), where dist is the Gaussian distance between nodes and θ and τ are fixed parameters of the Gaussian distance algorithm.
Compute the similarity coefficient e_ij between each node and its neighbouring nodes, where a is a global mapping matrix and W_i and W_j are the weight matrices of node i and node j respectively.
Normalize the attention coefficients of the graph nodes to α_ij using the LeakyReLU nonlinear activation function.
Obtain the new features X'_i through graph-convolution weighted computation and concatenation of the multi-head attention parameters, where σ is a nonlinear mapping, K is the number of attention heads, and ∥ denotes concatenation.
Perform pooling dimensionality reduction; after flattening and fully connected layers, output the emotion recognition probabilities through a classifier.
In this embodiment, the emotion recognition probabilities are the probability values of the six Ekman emotion classification labels (anger, disgust, fear, pleasure, sadness, and surprise), and they sum to 1.
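Six-class probabilities summing to 1 are typically produced by a softmax over the classifier logits; the logit values below are arbitrary example numbers:

```python
import numpy as np

# Softmax over classifier logits yields the six Ekman-emotion
# probabilities described above. The logits here are arbitrary examples.

EMOTIONS = ["anger", "disgust", "fear", "pleasure", "sadness", "surprise"]

def softmax(logits):
    z = np.asarray(logits, dtype=float)
    z = z - z.max()                  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

probs = softmax([1.2, -0.3, 0.1, 2.0, -1.0, 0.4])
predicted = EMOTIONS[int(np.argmax(probs))]
```

The predicted emotion label is simply the class with the highest probability.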
the invention provides an emotion recognition method based on a functional near infrared spectrum technology, which fully exploits the function and the potential of a novel non-invasive brain detection method in emotion research, has great significance in practical application, and simultaneously creates a new emotion analysis method of physiological signals.
The signal denoising method can achieve end-to-end adaptive denoising of the multi-channel fNIRS signals, and the algorithm has high generalization capability and universality.
The invention provides a dynamic graph attention model, adopts a dynamic graph convolution method to construct brain topology and extract features, and introduces an attention mechanism to extract correlation features between fNIRS channels, thereby improving the learning ability of the model and obtaining higher emotion recognition accuracy.
The invention adopts deep learning and learns features in a data-driven manner, which improves the expressiveness of emotional features and avoids manual intervention.
Example 2
An fNIRS emotion recognition system based on graph network and adaptive denoising, comprising:
the fNIRS acquisition module: continuously acquiring the variation of light intensity before and after transmitting and receiving by using an fNIRS acquisition device, converting the variation of the light intensity into the variation of absorbance, and further obtaining the relative variation of the concentration of oxyhemoglobin and deoxyhemoglobin;
the fNIRS self-adaptive denoising network module: for obtaining a clean signal;
the fNIRS dynamics graph attention emotion recognition network module: and outputting the emotion label according to the pure signal.
Example 3
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the fNIRS emotion recognition method when executing the computer program.
Example 4
A storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the fNIRS emotion recognition method.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.
Claims (10)
1. A fNIRS emotion recognition method based on graph network and adaptive denoising is characterized by comprising the following steps:
continuously acquiring the variation of light intensity before and after transmitting and receiving by the fNIRS acquisition equipment, converting the variation of the light intensity into the variation of absorbance, and further obtaining the relative variation of the concentration of oxyhemoglobin and deoxyhemoglobin;
denoising through a self-adaptive denoising network model to obtain a pure signal, wherein the input signal of the self-adaptive denoising network model is the relative variation of the concentrations of oxyhemoglobin and deoxyhemoglobin obtained in the last step, and the output signal is the relative variation data of the concentrations of pure oxyhemoglobin and deoxyhemoglobin;
mapping graph nodes by combining probe and channel characteristics, restoring brain topology by using a graph network, recognizing network models by dynamic graph attention emotion, and classifying and outputting emotion labels.
2. The fNIRS emotion recognition method of claim 1, wherein the adaptive denoising network model comprises a plurality of deep convolution pairs of convolution and deconvolution blocks of different sizes.
3. The fNIRS emotion recognition method of claim 2, wherein the convolution performs feature extraction on the fNIRS signal and clean signal generation is performed by deconvolution blocks.
4. The fNIRS emotion recognition method of claim 1, wherein the dynamic graph attention emotion recognition network model comprises graph convolution and an attention mechanism.
5. The fNIRS emotion recognition method of claim 4, wherein the recognition process of the dynamic graph attention emotion recognition network model is as follows: construct the graph network and map data onto the graph, mapping the clean fNIRS signals to the graph nodes and the probe and channel characteristics to the graph edges; extract features by a dynamic graph convolution method while introducing an attention mechanism to learn channel relevance; finally, after graph-pooling dimensionality reduction and flattening concatenation, classify and output, realizing accurate emotion recognition.
6. The fNIRS emotion recognition method of claim 5, wherein the dynamic graph attention emotion recognition network model classifies and outputs the emotion labels, specifically:
the graph is defined as G(V, E, W), where V represents the set of graph nodes and |V| = n indicates n nodes in total, corresponding to the data sequences ΔC_HbO(t), ΔC_HbR(t) of the n fNIRS channels, denoted X; E represents the set of edges in the graph, i.e. the set of relations between different fNIRS channels; W is the adjacency matrix defining the connection relations of the nodes, i.e. the relevance of different channels in the fNIRS, wherein the values in the adjacency matrix describe the importance of the relation between nodes, and the value W_ij is initialized using a Gaussian kernel function method;
graph attention: calculating the similarity coefficient between each node and its adjacent nodes;
calculating the attention coefficients between the graph nodes and normalizing them;
in the graph convolution, performing weighted summation with multi-head attention to integrate the parameters and obtain new features;
performing pooling for dimensionality reduction, and after flattening and fully connected layers, outputting the emotion category through a classifier.
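The steps recited in claim 6 can be illustrated with a minimal numerical sketch (hypothetical shapes, projection matrices, and probe coordinates; head-averaging is one common way to integrate multi-head outputs — this is an illustration of the named operations, not the patented implementation):

```python
import numpy as np

def gaussian_kernel_adjacency(coords, sigma=1.0):
    """Initialize W_ij with a Gaussian kernel over channel distances (claim 6)."""
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def graph_attention_layer(X, W_mask, n_heads=2, rng=None):
    """One multi-head graph-attention step: similarity -> normalize -> weighted sum."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n, f = X.shape
    outputs = []
    for _ in range(n_heads):
        P = rng.standard_normal((f, f)) * 0.1            # per-head projection (hypothetical)
        H = X @ P
        scores = H @ H.T                                  # similarity coefficients between nodes
        scores = np.where(W_mask > 0, scores, -np.inf)    # attend only over connected channels
        alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
        alpha /= alpha.sum(axis=1, keepdims=True)         # normalized attention coefficients
        outputs.append(alpha @ H)                         # weighted summation -> new features
    return np.mean(outputs, axis=0)                       # integrate the heads (averaging variant)

# Toy example: 4 fNIRS channels, 8-sample feature sequences per channel.
rng = np.random.default_rng(42)
coords = rng.random((4, 2))           # hypothetical probe positions
X = rng.standard_normal((4, 8))       # node features (e.g. ΔC_HbO per channel)
W = gaussian_kernel_adjacency(coords)
H_new = graph_attention_layer(X, W)
pooled = H_new.mean(axis=0)           # graph pooling (dimensionality reduction)
flat = pooled.ravel()                 # flatten before a fully connected classifier head
print(H_new.shape, flat.shape)        # (4, 8) (8,)
```

A learned classifier would replace the final mean-pool-and-flatten with trained pooling and fully connected layers, as the claim describes.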
7. The fNIRS emotion recognition method of any one of claims 1 to 6, wherein an equation relating the change in absorbance to the change in concentration of the light-absorbing chromophores in brain tissue is obtained using the Beer-Lambert law, and the relative changes in the concentrations of oxyhemoglobin and deoxyhemoglobin are obtained by solving the equation.
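With measurements at two near-infrared wavelengths, the modified Beer-Lambert law gives a 2×2 linear system in (ΔC_HbO, ΔC_HbR), which is how claim 7's "solving the equation" is typically done. A minimal sketch (the extinction coefficients, path length, and differential pathlength factor below are illustrative placeholders, not calibrated values from the patent):

```python
import numpy as np

# Extinction coefficients [1/(mM*cm)]: rows = two NIR wavelengths (e.g. ~760 nm,
# ~850 nm), cols = [HbO, HbR]. Illustrative numbers only.
EPS = np.array([[1.4866, 3.8437],
                [2.5264, 1.7986]])

def concentration_change(dA, path_cm=3.0, dpf=6.0):
    """Solve the modified Beer-Lambert equations for (dC_HbO, dC_HbR).

    dA: absorbance changes at the two wavelengths, dA = -log10(I / I0),
    where I0 and I are the received light intensities before and after.
    """
    return np.linalg.solve(EPS * path_cm * dpf, dA)

# Example: intensity ratios at the two wavelengths -> absorbance change -> concentrations
I_ratio = np.array([0.98, 1.01])          # I / I0 (hypothetical readings)
dA = -np.log10(I_ratio)
d_hbo, d_hbr = concentration_change(dA)
```

In practice the source-detector separation and differential pathlength factor are instrument- and subject-specific and must be taken from the acquisition setup.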
8. A system for implementing the fNIRS emotion recognition method of any one of claims 1 to 6, comprising:
an fNIRS acquisition module for continuously acquiring the change in light intensity between emission and reception with an fNIRS acquisition device, converting the change in light intensity into a change in absorbance, and thereby obtaining the relative changes in the concentrations of oxyhemoglobin and deoxyhemoglobin;
an fNIRS adaptive denoising network module for obtaining a clean signal; and
an fNIRS dynamic graph attention emotion recognition network module for outputting an emotion label from the clean signal.
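The three modules of claim 8 chain into a simple pipeline. A sketch of that wiring, with trivial stand-ins for the two network modules (a moving average for the adaptive denoiser and a threshold rule for the recognizer — placeholders only, not the claimed networks):

```python
import numpy as np

def acquisition_module(intensity_before, intensity_after):
    """fNIRS acquisition: light-intensity change -> absorbance change."""
    return -np.log10(intensity_after / intensity_before)

def denoising_module(signal):
    """Adaptive-denoising stand-in: a 3-tap moving average."""
    kernel = np.ones(3) / 3
    return np.convolve(signal, kernel, mode="same")

def recognition_module(clean_signal):
    """Dynamic-graph-attention stand-in: threshold on mean activation."""
    labels = ["negative", "neutral", "positive"]
    return labels[int(np.digitize(clean_signal.mean(), [-0.01, 0.01]))]

# A small dimming of received light (0.97x) reads as increased absorbance.
raw = acquisition_module(np.full(16, 1.0), np.full(16, 0.97))
emotion = recognition_module(denoising_module(raw))
```

The real system would convert absorbance to HbO/HbR concentrations (claim 7) before denoising, and both networks would be trained models rather than fixed rules.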
9. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the fNIRS emotion recognition method of any one of claims 1 to 7.
10. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the fNIRS emotion recognition method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111315105.2A CN114209319B (en) | 2021-11-08 | 2021-11-08 | fNIRS emotion recognition method and system based on graph network and self-adaptive denoising |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114209319A true CN114209319A (en) | 2022-03-22 |
CN114209319B CN114209319B (en) | 2024-03-29 |
Family
ID=80696655
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111315105.2A Active CN114209319B (en) | 2021-11-08 | 2021-11-08 | fNIRS emotion recognition method and system based on graph network and self-adaptive denoising |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114209319B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114782449A (en) * | 2022-06-23 | 2022-07-22 | 中国科学技术大学 | Method, system, equipment and storage medium for extracting key points in lower limb X-ray image |
CN117156072A (en) * | 2023-11-01 | 2023-12-01 | 慧创科仪(北京)科技有限公司 | Device for processing near infrared data of multiple persons, processing equipment and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070202477A1 (en) * | 2004-09-02 | 2007-08-30 | Nagaoka University Of Technology | Emotional state determination method |
US20170172479A1 (en) * | 2015-12-21 | 2017-06-22 | Outerfacing Technology LLC | Acquiring and processing non-contact functional near-infrared spectroscopy data |
CN107280685A (en) * | 2017-07-21 | 2017-10-24 | 国家康复辅具研究中心 | Top layer physiological noise minimizing technology and system |
US20190239792A1 (en) * | 2018-02-07 | 2019-08-08 | Denso Corporation | Emotion identification apparatus |
CN111466876A (en) * | 2020-03-24 | 2020-07-31 | 山东大学 | Alzheimer's disease auxiliary diagnosis system based on fNIRS and graph neural network |
WO2020166091A1 (en) * | 2019-02-15 | 2020-08-20 | 俊徳 加藤 | Biological function measurement device, and biological function measurement method, and program |
WO2021067464A1 (en) * | 2019-10-01 | 2021-04-08 | The Board Of Trustees Of The Leland Stanford Junior University | Joint dynamic causal modeling and biophysics modeling to enable multi-scale brain network function modeling |
CN113180650A (en) * | 2021-01-25 | 2021-07-30 | 北京不器科技发展有限公司 | Near-infrared brain imaging atlas identification method |
KR102288267B1 (en) * | 2020-07-22 | 2021-08-11 | 액티브레인바이오(주) | AI(Artificial Intelligence) BASED METHOD OF PROVIDING BRAIN INFORMATION |
CN113598774A (en) * | 2021-07-16 | 2021-11-05 | 中国科学院软件研究所 | Active emotion multi-label classification method and device based on multi-channel electroencephalogram data |
Non-Patent Citations (1)
Title |
---|
LEMONQC: "Graph Attention Networks (GAT)" [in Chinese], pages 1 - 3, Retrieved from the Internet <URL:https://zhuanlan.zhihu.com/p/118605260?utm_id=0> *
Also Published As
Publication number | Publication date |
---|---|
CN114209319B (en) | 2024-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Liu et al. | EEG emotion recognition based on the attention mechanism and pre-trained convolution capsule network | |
Chen et al. | Accurate EEG-based emotion recognition on combined features using deep convolutional neural networks | |
CN107491726B (en) | Real-time expression recognition method based on multichannel parallel convolutional neural network | |
CN112244873B (en) | Electroencephalogram space-time feature learning and emotion classification method based on hybrid neural network | |
CN110399857A (en) | A kind of brain electricity emotion identification method based on figure convolutional neural networks | |
CN112800998B (en) | Multi-mode emotion recognition method and system integrating attention mechanism and DMCCA | |
CN114209319B (en) | fNIRS emotion recognition method and system based on graph network and self-adaptive denoising | |
Zhong et al. | Cross-scene deep transfer learning with spectral feature adaptation for hyperspectral image classification | |
CN112766355B (en) | Electroencephalogram signal emotion recognition method under label noise | |
CN113343860A (en) | Bimodal fusion emotion recognition method based on video image and voice | |
CN110717423B (en) | Training method and device for emotion recognition model of facial expression of old people | |
CN111407243A (en) | Pulse signal pressure identification method based on deep learning | |
CN117033638B (en) | Text emotion classification method based on EEG cognition alignment knowledge graph | |
CN113128459B (en) | Feature fusion method based on multi-level electroencephalogram signal expression | |
CN113554110A (en) | Electroencephalogram emotion recognition method based on binary capsule network | |
Jinliang et al. | EEG emotion recognition based on granger causality and capsnet neural network | |
Rayatdoost et al. | Subject-invariant EEG representation learning for emotion recognition | |
Jayasekara et al. | Timecaps: Capturing time series data with capsule networks | |
CN113842151B (en) | Cross-test EEG cognitive state detection method based on efficient multi-source capsule network | |
Du et al. | Multivariate time series classification based on fusion features | |
CN114662524B (en) | Plug-and-play domain adaptation method based on electroencephalogram signals | |
CN112998652A (en) | Photoelectric volume pulse wave pressure identification method and system | |
CN116230244A (en) | EHR data analysis method and system based on augmentation discrimination information | |
Xu et al. | New advances in remote heart rate estimation and its application to deepfake detection | |
Truong et al. | Assessing learned features of Deep Learning applied to EEG |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||