CN113778228A - Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment - Google Patents

Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment

Info

Publication number
CN113778228A
Authority
CN
China
Prior art keywords
emotion
pressure
brain
computer interface
emotions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111062641.6A
Other languages
Chinese (zh)
Inventor
马婷
蔡国庆
彭昊
黄守麟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Graduate School Harbin Institute of Technology
Original Assignee
Shenzhen Graduate School Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Graduate School Harbin Institute of Technology filed Critical Shenzhen Graduate School Harbin Institute of Technology
Priority to CN202111062641.6A priority Critical patent/CN113778228A/en
Publication of CN113778228A publication Critical patent/CN113778228A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 - Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 - Evaluating the state of mind, e.g. depression, anxiety
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 - Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 - Modalities, i.e. specific diagnostic methods
    • A61B 5/369 - Electroencephalography [EEG]
    • A61B 5/377 - Electroencephalography [EEG] using evoked responses
    • A61B 5/378 - Visual stimuli
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 - Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 - Modalities, i.e. specific diagnostic methods
    • A61B 5/369 - Electroencephalography [EEG]
    • A61B 5/377 - Electroencephalography [EEG] using evoked responses
    • A61B 5/38 - Acoustic or auditory stimuli
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7225 - Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 - Details of waveform analysis
    • A61B 5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/25 - Fusion techniques
    • G06F 18/253 - Fusion techniques of extracted features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 - Indexing scheme relating to G06F3/01
    • G06F 2203/011 - Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 - Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/12 - Classification; Matching

Abstract

The invention provides a brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment. Its beneficial effects are: (1) a space-frequency optimization method extracts electroencephalogram (EEG) features under different emotions, and fusing these features with pressure-signal features improves the accuracy of emotion recognition; (2) the emotion marker is replaced with pressure sensors, adding a pressure-based emotion signal without adding trials and thereby raising the emotion recognition rate. During measurement, the pressure sensors also serve as an emotion-release tool, combining emotion regulation, emotion calibration and emotion measurement in one device.

Description

Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment
Technical Field
The invention relates to human-computer interaction, emotion recognition and emotion regulation technologies, in particular to a brain-computer interface system based on multifunctional emotion recognition and self-adaptive regulation.
Background
Emotion recognition identifies a person's emotional changes under external stimulation and is widely applied in fields such as mental-state detection, crime prevention and disease treatment. Electroencephalogram (EEG) signals have drawn growing attention in emotion recognition in recent years, with particular practical value in psychological-state detection and emotion regulation.
Because EEG directly reflects the brain's electrical activity, it expresses emotion more directly than signals mediated by sympathetic and parasympathetic regulation, such as the electrocardiogram and electrodermal activity; for example, EEG shows broad spatial and frequency-domain differences across emotions, which yields better emotion classification and recognition. The study of Fumikazu et al. showed that frontal-lobe theta-wave activity increases during mental effort while occipital-lobe alpha-wave activity is suppressed at rest, and the study of W. L. Zheng et al. showed that positive emotion carries higher beta- and gamma-band energy than negative emotion, the positive emotion being stronger over the frontal lobe, while delta-wave energy over the parietal and occipital lobes is enhanced under negative emotion. On this basis, a space-frequency optimization method can extract EEG features under different emotions and thereby classify them.
The brain generates signals with characteristic patterns that can be used to communicate with and control electronic devices (such as computers); this is called a Brain-Computer Interface (BCI). The key to a BCI is recognizing the features of brain signals so that they can be converted into correct device-control commands. A brain-computer interface system based on emotion recognition can be built by recognizing emotion from the EEG and applying the result to the computer's counter-regulation of emotion. However, emotion recognition from a single EEG signal is not accurate enough, and the information available for emotion calibration is under-used.
Disclosure of Invention
The invention provides a brain-computer interface system based on multifunctional emotion recognition and adaptive adjustment, which is characterized by comprising an emotion induction unit, an electroencephalogram signal acquisition module, a pressure signal emotion calibration module, an emotion decoding unit and an emotion adjusting unit,
the emotion inducing unit is used for playing emotion inducing materials of audio and/or video and inducing different emotions of the tester;
the electroencephalogram signal acquisition module is used for acquiring electroencephalogram signals under different emotional states;
the pressure signal emotion calibration module comprises a first, a second and a third pressure sensor, which record the pressing force under different emotions while the tester watches/listens to the emotion-inducing material in the training stage, the force serving as a basis for auxiliary emotion judgment;
the emotion decoding unit is used for emotion decoding;
the emotion adjusting unit is used for regulating extreme emotions and calming excited emotions.
As a further improvement of the present invention, the pressure signal emotion calibration module further includes a fourth pressure sensor, which serves as a press detector in the discrimination stage; the fourth pressure sensor acts as an emotion signal collector and also as a release outlet for anxious and excited emotions, assisting counter emotion regulation.
As a further improvement of the invention, the emotion inducing materials comprise active, neutral and negative emotion materials in a Chinese emotion stimulating material library, the active, neutral and negative emotions of the tester are induced by playing the active, neutral and negative emotion materials, and electroencephalogram signals corresponding to the emotions are acquired by an electroencephalogram signal acquisition module.
As a further improvement of the present invention, the first to third pressure sensors are labeled "positive", "neutral" and "negative", respectively, and the first to third pressure sensors record the pressure under the corresponding emotion-inducing material, and since the pressures of the pressure sensors are differentiated under different emotions, the first to third pressure sensors serve as emotion markers and also serve as emotion discrimination markers.
As a further improvement of the invention, the emotion decoding unit performs emotion decoding through a scale-weighted space-frequency optimization algorithm and pressure-signal emotion judgment, and then performs feature fusion.
As a further improvement of the invention, the emotion adjusting unit enables the brain-computer interface system to have a self-adaptive learning function, and parameters can be adjusted according to different environments, time and individuals.
As a further improvement of the invention, the method for emotion decoding by the space-frequency optimization algorithm is as follows:
step 1: performing band-pass filtering on the electroencephalogram signals by 1-45hz, and reserving the electroencephalogram signals from delta waves to gamma wave frequency bands;
step 2: dividing the pass band, and then calculating a projection matrix of each sub-band by performing a common space mode on each sub-band;
and step 3: optimizing the subspace projection matrix in combination with the scale score of the training set, and performing weighted average to form a new projection matrix; classifying the test data to find an optimal projection matrix and an optimal classification sub-band;
and 4, step 4: and classifying the electroencephalogram signals by the optimal sub-band and the trained optimal projection matrix to obtain the characteristics generated by the electroencephalogram signals.
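The four steps above can be sketched in code. This is a minimal illustration, not the patent's implementation: the sampling rate, filter order, number of components and the toy data are all assumptions, and the scale-score weighting of step 3 is omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.linalg import eigh

def bandpass(eeg, lo, hi, fs=250.0, order=4):
    """Zero-phase Butterworth band-pass filter; eeg is (channels, samples)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def csp_projection(trials_a, trials_b, n_comp=4):
    """Common-spatial-pattern projection for one sub-band.

    trials_a / trials_b: lists of (channels, samples) trials for the two
    emotion classes. Returns a projection matrix W of shape (n_comp, channels).
    """
    def mean_cov(trials):
        covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenproblem  ca w = lambda (ca + cb) w, eigenvalues ascending.
    evals, evecs = eigh(ca, ca + cb)
    half = n_comp // 2
    n = evecs.shape[1]
    # Keep the components with the most extreme eigenvalues (both ends).
    pick = np.r_[0:half, n - half:n]
    return evecs[:, pick].T

def csp_features(trial, W):
    """Log-variance features of the spatially filtered trial (step 4)."""
    z = W @ trial
    v = z.var(axis=1)
    return np.log(v / v.sum())
```

In a full pipeline, `bandpass` would be called once per sub-band of the 1-45 Hz range, `csp_projection` fitted on the training trials of each sub-band, and the sub-band giving the best classification kept as the optimal one.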
As a further improvement of the present invention, in the emotion decoding unit, feature fusion is performed by a feature fusion model as follows:

f_{d \times 1} = \sum_{i=1}^{M} W_i^{d \times N_i} x_i^{N_i \times 1} + b_{d \times 1}

where d is the decoding dimension of the system; f_{d \times 1} is the output, i.e. the fusion of the space-frequency optimization features with the pressure-signal emotion discrimination features; M is the number of system signals; x_i^{N_i \times 1} is the feature vector of the i-th signal, with N_i features; W_i^{d \times N_i} is a d \times N_i weight matrix; and b_{d \times 1} is a d \times 1 bias vector.

The task decision is implemented with softmax:

p_k = \frac{e^{f_{k,1}}}{\sum_{j=1}^{d} e^{f_{j,1}}}

where f_{k,1} is the element of the vector f_{d \times 1} at subscript (k, 1).
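A small numerical sketch of the fusion model and softmax decision described above, assuming d = 3 emotion classes (positive, neutral, negative) and M = 2 signal streams (EEG features and pressure features); the dimensions and random weights are illustrative only:

```python
import numpy as np

def fuse_features(features, weights, bias):
    """Linear fusion  f = sum_i W_i x_i + b  over M signal streams."""
    f = bias.astype(float).copy()
    for W, x in zip(weights, features):
        f += W @ x
    return f

def softmax_decision(f):
    """Softmax over the fused vector; returns (class index, probabilities)."""
    z = f - f.max()                      # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return int(np.argmax(p)), p

# Hypothetical example: 6 EEG features, 2 pressure features, 3 classes.
rng = np.random.default_rng(1)
x_eeg, x_press = rng.standard_normal(6), rng.standard_normal(2)
W_eeg, W_press = rng.standard_normal((3, 6)), rng.standard_normal((3, 2))
b = np.zeros(3)
k, p = softmax_decision(fuse_features([x_eeg, x_press], [W_eeg, W_press], b))
```

The weights and bias would in practice be learned during the adaptive-learning phase rather than drawn at random.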
As a further improvement of the invention, the current emotional state is obtained through feature fusion and discrimination, and the emotion adjusting unit realizes emotion adjustment as follows: when multiple tests yield negative emotion, the emotion induction unit plays positive emotion material to induce a positive emotion; when multiple tests yield positive emotion, the emotion induction unit plays neutral emotion material to smooth the emotion;
in the emotion recognition stage, the tester may press the fourth pressure sensor freely; this sensor acts as an emotion signal collector and also as a release outlet for anxious or excited emotion, assisting counter emotion regulation.
The invention has the following beneficial effects: (1) a space-frequency optimization method extracts EEG features under different emotions and fuses them with pressure-signal features, improving the accuracy of emotion recognition; (2) the emotion marker is replaced with pressure sensors, adding a pressure-based emotion signal without adding trials and thereby raising the emotion recognition rate. During measurement, the pressure sensors also serve as an emotion-release tool, combining emotion regulation, emotion calibration and emotion measurement.
Drawings
FIG. 1 is a functional block diagram of a brain-computer interface system of the present invention;
FIG. 2 is a schematic diagram of a stress signal mood calibration module;
FIG. 3 is a schematic diagram of an emotion space frequency optimization discrimination algorithm;
FIG. 4 is a schematic diagram of a stress signal emotion determination method;
FIG. 5 is a block diagram of an adaptive learning structure based on softmax regression;
fig. 6 is a schematic diagram of the mood-adjusting unit implementing mood adjustment.
Detailed Description
As shown in Figure 1, the invention discloses a brain-computer interface system based on multifunctional emotion recognition and adaptive adjustment, comprising an electroencephalogram signal acquisition module, a pressure signal emotion calibration module, an emotion induction unit, an emotion decoding unit, an emotion adjusting unit and an adaptive learning module. The electroencephalogram signal acquisition module acquires EEG signals under different emotional states. The pressure signal emotion calibration module comprises four pressure sensors: in the training stage, the first three record the pressing force under different emotions while the inducing video is watched, serving as a basis for auxiliary emotion judgment, and the fourth serves as a press detector in the discrimination stage. The emotion induction unit plays audio and video emotion-inducing material. The emotion decoding unit performs emotion decoding through the space-frequency optimization algorithm. The emotion adjusting unit regulates extreme emotions and calms excited emotions. Before first use, the adaptive learning module has the user execute a set behavior task following an operation guide, collects physiological signals during the task, and computes the optimal parameters for feature extraction and task classification.
As shown in Figure 2, the pressure signal emotion calibration module captures the tester's induced emotion. It contains four pressable sensors: three are labeled "positive", "neutral" and "negative", and the fourth is labeled "please press after measurement". In the emotion training stage, after watching material played by the emotion induction unit, the tester presses the sensor whose label matches the material (the "positive" sensor after positive material, the "neutral" sensor after neutral material, the "negative" sensor after negative material), with a force that matches the intensity of the felt emotion. The sensors record the pressing force under the corresponding inducing material; because the force differs across emotions, the sensors serve both as emotion markers and as emotion discrimination markers during emotion calibration.
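The calibration and discrimination roles of the three labeled sensors could be modeled as below. This is an illustrative sketch, not the patent's method: the nearest-class-mean rule and the force values are assumptions.

```python
import numpy as np

class PressureEmotionCalibrator:
    """Training phase: store the pressing force recorded under each labeled
    sensor. Discrimination phase: label a new press by the nearest class
    mean force (a deliberately simple, hypothetical rule)."""

    LABELS = ("positive", "neutral", "negative")

    def __init__(self):
        self._forces = {label: [] for label in self.LABELS}

    def record(self, label, force):
        """Store one press force (e.g. in newtons) for a labeled sensor."""
        self._forces[label].append(float(force))

    def discriminate(self, force):
        """Return the emotion label whose mean training force is closest."""
        means = {l: np.mean(v) for l, v in self._forces.items() if v}
        return min(means, key=lambda l: abs(means[l] - force))
```

In the full system this pressure-based judgment would only assist the EEG decoder, entering the feature-fusion model rather than deciding alone.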
In the emotion training stage the testers also fill in a valence-arousal scale to annotate their emotions in detail.
The emotion adjusting unit gives the brain-computer interface system an adaptive learning capability: parameters can be adjusted for different environments, times and individual users, preserving control accuracy to the greatest extent.
The space-frequency optimization algorithm is shown schematically in Figure 3. The EEG signal is band-pass filtered at 1-45 Hz, retaining the delta- through gamma-band components. The pass band is divided into sub-bands, and a common-spatial-pattern projection matrix is computed for each sub-band. The sub-band projection matrices are then optimized with the scale scores of the training set and weight-averaged into a new projection matrix. The test data are classified to find the optimal projection matrix and the optimal classification sub-band. Finally, the EEG signal is classified with the optimal sub-band and the trained optimal projection matrix to obtain the EEG features.
As shown in Figure 4, in the emotion training stage, after watching material played by the emotion induction unit, the tester presses the "positive" sensor after positive material, the "neutral" sensor after neutral material and the "negative" sensor after negative material, pressing with a force that matches the corresponding emotion.
FIG. 5 shows a simplified learning model of softmax regression.
The emotion adjusting unit realizes emotion adjustment as shown in Figure 6. The current emotional state (positive, neutral or negative) is obtained by feature fusion and discrimination. When multiple tests yield negative emotion, the emotion induction unit plays positive material to induce a positive emotion; when multiple tests yield positive emotion, it plays neutral material to smooth the emotion. In the emotion recognition stage, the tester may press the fourth pressure sensor freely; it acts as an emotion signal collector and also as a release outlet under anxious or excited emotion, assisting counter emotion regulation.
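The counter-regulation rule just described can be sketched as a small decision function. The window length of three consecutive tests and the action names are illustrative assumptions, not values taken from the patent.

```python
def regulation_action(recent_states, window=3):
    """Counter-regulation rule: after `window` consecutive tests with the
    same extreme state, choose the counteracting inducing material.

    recent_states: list of 'positive' | 'neutral' | 'negative'.
    """
    last = recent_states[-window:]
    if len(last) == window and all(s == "negative" for s in last):
        return "play_positive_material"   # induce a positive emotion
    if len(last) == window and all(s == "positive" for s in last):
        return "play_neutral_material"    # smooth the excited emotion
    return "no_action"
```

The returned token would drive the emotion induction unit's playlist selection in a closed loop with the decoder's per-test output.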
Addressing the defects of the prior art, the emotion calibration module is designed as a pressure signal emotion calibration module that collects the pressing force under different emotions together with the emotion label at that moment. An emotion space-frequency optimization algorithm is designed for emotion decoding, and a feature fusion model fuses the pressure features with the EEG space-frequency features to optimize emotion discrimination. Emotion is counter-regulated through the brain-computer interface: when multiple tests yield negative emotion, the emotion induction unit plays positive material to induce positive emotion, and when multiple tests yield positive emotion, it plays neutral material to calm the emotion.
The invention has the beneficial effects that:
(1) A space-frequency optimization method extracts EEG features under different emotions and fuses them with pressure-signal features, improving the accuracy of emotion recognition.
(2) The emotion marker is replaced with pressure sensors, adding a pressure-based emotion signal without adding trials and thereby raising the emotion recognition rate. During measurement, the pressure sensors also serve as an emotion-release tool, combining emotion regulation, emotion calibration and emotion measurement.
The foregoing describes the invention in further detail with reference to specific preferred embodiments, but the invention is not limited to these details. Those skilled in the art may make simple deductions or substitutions without departing from the spirit of the invention, and all such variants fall within the protection scope of the invention.

Claims (9)

1. A brain-computer interface system based on multifunctional emotion recognition and adaptive adjustment is characterized by comprising an emotion induction unit, an electroencephalogram signal acquisition module, a pressure signal emotion calibration module, an emotion decoding unit and an emotion adjustment unit,
the emotion inducing unit is used for playing emotion inducing materials of audio and/or video and inducing different emotions of the tester;
the electroencephalogram signal acquisition module is used for acquiring electroencephalogram signals under different emotional states;
the pressure signal emotion calibration module comprises a first, a second and a third pressure sensor, which record the pressing force under different emotions while the tester watches/listens to the emotion-inducing material in the training stage, the force serving as a basis for auxiliary emotion judgment;
the emotion decoding unit is used for emotion decoding;
the emotion adjusting unit is used for regulating extreme emotions and calming excited emotions.
2. The brain-computer interface system according to claim 1, wherein the pressure signal emotion calibration module further comprises a fourth pressure sensor, which serves as a press detector in the discrimination stage; the fourth pressure sensor acts as an emotion signal collector and also as a release outlet for anxious and excited emotions, assisting counter emotion regulation.
3. The brain-computer interface system according to claim 1, wherein the emotion-inducing materials comprise positive, neutral and negative emotion materials in a Chinese emotional stimulation material library, positive, neutral and negative emotions of the tester are induced by playing the positive, neutral and negative emotion materials, and electroencephalogram signals corresponding to the emotions are acquired by the electroencephalogram signal acquisition module.
4. The brain-computer interface system according to claim 1, wherein the first to third pressure sensors are labeled "positive", "neutral" and "negative", respectively, and the first to third pressure sensors record the pressure under the corresponding emotional inducing material, and since the pressure of the pressure sensors is differentiated under different emotions, the first to third pressure sensors are used as the emotion markers and also as the emotion judgment markers.
5. The brain-computer interface system according to claim 1, wherein the emotion decoding unit performs emotion decoding by a scale-weighted space-frequency optimization algorithm and pressure-signal emotion judgment, and then performs feature fusion.
6. The brain-computer interface system according to claim 5, wherein the emotion adjusting unit enables the brain-computer interface system to have an adaptive learning function, and parameters can be adjusted according to different environments, time and individuals.
7. The brain-computer interface system according to claim 5, wherein the method of emotion decoding by the space-frequency optimization algorithm is as follows:
step 1: performing band-pass filtering on the electroencephalogram signals by 1-45hz, and reserving the electroencephalogram signals from delta waves to gamma wave frequency bands;
step 2: dividing the pass band, and then calculating a projection matrix of each sub-band by performing a common space mode on each sub-band;
and step 3: optimizing the subspace projection matrix in combination with the scale score of the training set, and performing weighted average to form a new projection matrix; classifying the test data to find an optimal projection matrix and an optimal classification sub-band;
and 4, step 4: and classifying the electroencephalogram signals by the optimal sub-band and the trained optimal projection matrix to obtain the characteristics generated by the electroencephalogram signals.
8. The brain-computer interface system according to claim 5, wherein in the emotion decoding unit, feature fusion is performed by a feature fusion model as follows:

f_{d \times 1} = \sum_{i=1}^{M} W_i^{d \times N_i} x_i^{N_i \times 1} + b_{d \times 1}

where d is the decoding dimension of the system; f_{d \times 1} is the output, i.e. the fusion of the space-frequency optimization features with the pressure-signal emotion discrimination features; M is the number of system signals; x_i^{N_i \times 1} is the feature vector of the i-th signal, with N_i features; W_i^{d \times N_i} is a d \times N_i weight matrix; and b_{d \times 1} is a d \times 1 bias vector;

the task decision is implemented with softmax:

p_k = \frac{e^{f_{k,1}}}{\sum_{j=1}^{d} e^{f_{j,1}}}

where f_{k,1} is the element of the vector f_{d \times 1} at subscript (k, 1).
9. The brain-computer interface system according to claim 5, wherein the current emotional state is obtained through feature fusion and discrimination, and the emotion adjusting unit realizes emotion adjustment as follows: when multiple tests yield negative emotion, the emotion induction unit plays positive emotion material to induce a positive emotion; when multiple tests yield positive emotion, the emotion induction unit plays neutral emotion material to smooth the emotion;
in the emotion recognition stage, the tester may press the fourth pressure sensor freely; this sensor acts as an emotion signal collector and also as a release outlet for anxious or excited emotion, assisting counter emotion regulation.
CN202111062641.6A 2021-09-10 2021-09-10 Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment Pending CN113778228A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111062641.6A CN113778228A (en) 2021-09-10 2021-09-10 Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment


Publications (1)

Publication Number Publication Date
CN113778228A true CN113778228A (en) 2021-12-10

Family

ID=78842487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111062641.6A Pending CN113778228A (en) 2021-09-10 2021-09-10 Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment

Country Status (1)

Country Link
CN (1) CN113778228A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102512160A (en) * 2011-12-16 2012-06-27 天津大学 Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands
CN102715911A (en) * 2012-06-15 2012-10-10 天津大学 Brain electric features based emotional state recognition method
CN103654798A (en) * 2013-12-11 2014-03-26 四川大学华西医院 Method and device for monitoring and recording emotion
CN107007291A (en) * 2017-04-05 2017-08-04 天津大学 Intense strain intensity identifying system and information processing method based on multi-physiological-parameter
CN108509040A (en) * 2018-03-28 2018-09-07 哈尔滨工业大学深圳研究生院 Mixing brain machine interface system based on multidimensional processiug and adaptive learning
CN110576443A (en) * 2018-08-03 2019-12-17 重庆市武义赏网络科技有限公司 Robot system for psychological catharsis and data acquisition and analysis method thereof
CN112669882A (en) * 2021-01-07 2021-04-16 上海龙旗科技股份有限公司 Media playing method and device based on pressure detection
CN112987917A (en) * 2021-02-08 2021-06-18 中国科学院自动化研究所 Motion imagery enhancement method, device, electronic equipment and storage medium
CH717003A2 (en) * 2019-12-27 2021-06-30 Univ Lanzhou System for regulating a depressed mood through music feedback based on an electroencephalogram signal.

Similar Documents

Publication Publication Date Title
CN108629313B (en) Emotion adjusting method, device and system and computer storage medium
Phinyomark et al. A feature extraction issue for myoelectric control based on wearable EMG sensors
CN110353702A (en) A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN108304917A (en) A kind of P300 signal detecting methods based on LSTM networks
CN111091074B (en) Motor imagery electroencephalogram signal classification method of optimal region co-space mode
CN103690165A (en) Cross-inducing-mode emotion electroencephalogram recognition and modeling method
CN101464729A (en) Independent desire expression method based on auditory sense cognition neural signal
CN113017645B (en) P300 signal detection method based on void convolutional neural network
CN116400800B (en) ALS patient human-computer interaction system and method based on brain-computer interface and artificial intelligence algorithm
US20220051039A1 (en) Biometric identification using electroencephalogram (eeg) signals
CN108491792B (en) Office scene human-computer interaction behavior recognition method based on electro-oculogram signals
CN114557708A (en) Device and method for detecting somatosensory stimulation consciousness based on electroencephalogram dual-feature fusion
CN111067513A (en) Sleep quality detection key brain area judgment method based on characteristic weight self-learning
CN104793743B (en) A kind of virtual social system and its control method
Yoon et al. Spatial and time domain feature of ERP speller system extracted via convolutional neural network
CN113778228A (en) Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment
KR20150076932A (en) apparatus for analyzing brain wave signal and analyzing method thereof, and user terminal device for using the analyzation result
CN101571747A (en) Method for realizing multi-mode EEG-control intelligent typewriting
WO2023027578A1 (en) Nose-operated head-mounted device
CN111905229A (en) Piano music hypnosis treatment control system and method based on 5G
CN108764008B (en) Method for detecting movement intention based on combination of dynamic stopping strategy and integrated learning
CN114201041A (en) Human-computer interaction command method and device based on brain-computer interface
Sanggarini et al. Hjorth descriptor as feature extraction for classification of familiarity in EEG signal
CN113070875A (en) Manipulator control method and device based on brain wave recognition
Deepajothi et al. Performance evaluation of SVM–RBF kernel for classifying ECoG motor imagery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination