CN107080546B - Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers

Info

Publication number: CN107080546B (granted; application publication CN107080546A)
Application number: CN201710252127.6A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 苏媛媛, 吕钊
Assignee: Anhui Zhiqu Angel Information Technology Co., Ltd.
Legal status: Active

Classifications

    • A61B5/16: Devices for psychotechnics; testing reaction times; devices for evaluating the psychological state
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/377: Electroencephalography [EEG] using evoked responses
    • A61B5/378: Visual stimuli
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267: Classification of physiological signals or data involving training the classification device

Abstract

The invention discloses an electroencephalogram-based emotion perception system and method for the environmental psychology of teenagers, together with a stimulus sample selection method. The system comprises an electroencephalogram signal acquisition module, an electroencephalogram signal preprocessing module, an electroencephalogram signal feature extraction module and an emotion perception module; built environments that teenagers frequently contact, participate in or are keen on serve as the visual stimulus sources. The emotion perception method comprises the steps of visual stimulus source selection, electroencephalogram signal acquisition, electroencephalogram signal preprocessing, electroencephalogram signal feature extraction, model training and emotion intensity determination. The stimulus sample selection method divides each of the arousal and valence dimensions into 5 emotion intensities and determines rectangular selection frames non-equidistantly according to the actual selection of samples and their distribution in the two-dimensional space. The emotion perception system and method and the stimulus sample selection method offer strong emotion perception capability and strong capability for extension to other subject groups, and have high application value in environmental psychology research.

Description

Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers
Technical Field
The invention relates to an electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers.
Background
The number of minors under 18 years old in China totals about 367 million. According to conservative estimates, up to 30 million of them have various learning, emotional and behavioral disorders. In recent years the incidence of psychological problems among Chinese teenagers has risen markedly, and the incidence of psychological and behavioral problems among children is as high as 14-17%. Among these, emotional problems (such as anxiety and depression) are among the most common psychological problems; they have a high incidence, are widespread, and affect individual behavior and psychosocial functioning. The World Health Organization (WHO) predicted that by 2020 more than 50% of children and adolescents worldwide would develop neuropsychological problems, which would become one of the five leading causes of disease, disability and death in children and adolescents.
A built environment refers to a man-made environment that supports human activities, including large urban environments. Related research shows that the built environment, as an artificial visual environment, can change people's emotions and influence their health and behavior. Likewise, the built environments surrounding teenagers have differing effects on their emotional and psychological health. For example, a teenager in a lively, noisy audio-visual environment feels excited and has a high level of emotional arousal, and sustained high arousal brings fatigue and harms health; conversely, green environments, landscapes and quiet indoor spaces reduce emotional arousal, and can soothe emotions and relax the mind.
At present, emotion perception based on electroencephalogram signals has become a new research hotspot in the fields of environmental psychology and landscape design. For example: Ulrich, using questionnaires and interviews together with indexes such as heart rate, blood pressure and brain activity, compared a man-made natural environment with man-made commercial and industrial environments and found that alpha waves were more pronounced in the man-made natural environment; Nakamura et al. analyzed the alpha-wave and beta-rhythm counts of subjects looking at a fence and a concrete enclosure, and found that the alpha ratio (α/(α+β)) was higher when subjects looked at the fence wall than when they looked at the concrete wall; Aspinall et al., using wearable electroencephalogram acquisition equipment, recorded and analyzed emotional changes across two dimensions and five channels as subjects walked through a commercial street and a green space; Roe et al. likewise recorded and analyzed wearable electroencephalogram signals induced by two sets of photographs of landscape and urban environments, and concluded that green space produces lower arousal and higher meditation than urban scenes.
The above studies focus on using electroencephalography for satisfaction and health assessment of specific environments, and on analyzing the relationship between a specific environment and the general population; few of them describe the emotional impact of the built environment on teenagers. In addition, regarding the accuracy of emotion description, the fields of psychology and affective computing already provide the technical basis for emotion recognition and biological signal computation, including single- or multi-modal emotion recognition models based on electroencephalogram, electrocardiogram, respiration rate, blood oxygen saturation, body surface temperature and the like. However, these models are mainly used to recognize a few typical emotion categories, such as sadness, happiness, disappointment and anger, which are usually evoked by stimuli containing storylines; clearly such typical emotion categories are insufficient to describe the natural emotional states of teenagers in their different built environments.
In addition, existing methods are narrow in their selection of stimulus sources, most of which contain storylines, and they do not distinguish the stimulated subjects from the stimulus content, so actual operation involves great contingency and the stimulation effect is unsatisfactory. Therefore, there is a need to establish an electroencephalogram-based emotion perception system and method for the environmental psychology of teenagers, and a stimulus sample selection method, that truly and objectively reflect the psychological activity of teenagers.
Disclosure of Invention
The invention provides an emotion perception system and method for the environmental psychology of teenagers, and a stimulus sample selection method, which offer strong emotion perception capability, strong capability for extension to other subject groups and high application value, avoiding the defects of the prior art.
The invention adopts the following technical scheme to solve the technical problem.
The electroencephalogram-based emotion perception system for the environmental psychology of teenagers comprises an electroencephalogram signal acquisition module, an electroencephalogram signal preprocessing module, an electroencephalogram signal feature extraction module and an emotion perception module; built environments that teenagers frequently contact, participate in or are keen on serve as the visual stimuli, and the subjects are teenagers;
the electroencephalogram signal acquisition module acquires the subject's original 32-lead electroencephalogram signals through recording electrodes and reference electrodes and sends them to the electroencephalogram signal preprocessing module; the electroencephalogram signal preprocessing module performs framing, windowing, band-pass filtering and eye-movement artifact removal on the original signals, reducing noise interference and improving the perception performance of the model;
the electroencephalogram signal feature extraction module extracts, from the different frequency bands of the preprocessed signals, the power spectrum, the energy and the power-spectrum difference of symmetric electrode pairs as feature parameters that represent the original signals while reducing data redundancy;
the emotion perception module realizes, through training and recognition with a support vector machine (SVM), perception of the teenager's emotional intensity toward the visual environment in the arousal dimension and the valence dimension respectively.
The invention also provides a method for sensing the emotion of the environmental psychology of the teenagers.
An electroencephalogram-based emotion perception method for the environmental psychology of teenagers comprises the following steps:
Step 1: visual stimulus source selection. Built environments that teenagers frequently contact, participate in or are keen on serve as the visual stimuli, and teenagers are the subjects.
Step 2: electroencephalogram signal acquisition. The subject watches the visual stimulus source while the subject's original electroencephalogram signals are collected.
Step 3: electroencephalogram signal preprocessing. Framing, windowing, band-pass filtering and eye-movement artifact removal are applied to the original signals.
Step 4: electroencephalogram signal feature extraction. The preprocessed signals are divided into training data and test data; for both, the power spectrum, the energy and the power-spectrum difference of symmetric electrode pairs are extracted as feature parameters.
Step 5: model training. An arousal SVM sub-model and a valence SVM sub-model are trained with the feature parameters and the emotion intensity labels of the training data.
Step 6: emotion intensity determination. The feature parameters of the test data are input into the two trained SVM sub-models from Step 5, realizing perception of emotion intensity in the arousal and valence dimensions.
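Steps 5 and 6 can be sketched with two independent classifiers, one per emotion dimension. The sketch below uses scikit-learn's SVC on synthetic stand-ins for the 444-dimensional EEG features; the RBF kernel and the feature matrices are assumptions for illustration, not details given by the patent.

```python
# Minimal sketch of Steps 5-6: two independent SVM sub-models, one for
# arousal and one for valence.  Feature matrices and 5-level labels are
# synthetic stand-ins for the 444-dimensional EEG features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 444))      # 200 frames x 444 features
y_arousal = rng.integers(1, 6, size=200)   # intensity labels 1..5
y_valence = rng.integers(1, 6, size=200)

# Step 5: one SVM per dimension (kernel choice is an assumption).
arousal_model = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_train, y_arousal)
valence_model = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_train, y_valence)

# Step 6: each test frame receives an intensity in both dimensions.
X_test = rng.normal(size=(10, 444))
print(arousal_model.predict(X_test), valence_model.predict(X_test))
```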
In Step 3, the frame length is 2 seconds with a frame shift of 1 second, and a Hamming window is applied to each frame of the original electroencephalogram signal; the pass band of the band-pass filter is set to 0.1-45 Hz; eye-movement artifacts are removed by independent component analysis.
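The Step-3 parameters above can be sketched as follows, assuming a 250 Hz sampling rate (the acquisition rate stated later in the description); the Butterworth band-pass is an assumption, as the patent does not name a filter design.

```python
# Sketch of the Step-3 preprocessing: 2 s frames, 1 s shift, Hamming
# windowing, 0.1-45 Hz band-pass.  The input signal is synthetic.
import numpy as np
from scipy.signal import butter, get_window, sosfiltfilt

FS = 250               # Hz, acquisition rate
FRAME = 2 * FS         # frame length: 2 s -> 500 samples
SHIFT = 1 * FS         # frame shift: 1 s -> 250 samples

def bandpass(x, lo=0.1, hi=45.0, fs=FS):
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def frames(x):
    """Split a 1-D signal into overlapping Hamming-windowed frames."""
    win = get_window("hamming", FRAME)
    starts = range(0, len(x) - FRAME + 1, SHIFT)
    return np.stack([x[s:s + FRAME] * win for s in starts])

x = np.random.default_rng(0).normal(size=10 * FS)   # 10 s of fake EEG
f = frames(bandpass(x))
print(f.shape)   # → (9, 500): nine 2-second frames
```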
In Step 4, the feature parameters include the power spectra and energies of 6 frequency bands on 32 electrodes, and the power-spectrum differences of the 5 frequency bands other than the slow-Alpha band on 12 symmetric electrode pairs.
The invention also provides a stimulus source selection method in the process of establishing the electroencephalogram-based juvenile emotion perception model.
The stimulus source selection method used in establishing the electroencephalogram-based teenager emotion perception model comprises the following steps:
Step 01: according to the subjective feelings of the subjects, manually extract 55-second highlight clips from the original stimulus videos;
Step 02: ensure that at least 20 subjects participate in the experiment, with each subject watching as many of the highlight clips from Step 01 as possible; after watching each video, every subject gives a 5-level rating in each of the arousal and valence dimensions;
Step 03: from the ratings in Step 02, compute for each video clip a normalized score in the arousal dimension, score(x_a), and a normalized score in the valence dimension, score(x_v), as follows:

score(x_a) = μ_{x_a} / σ_{x_a},    score(x_v) = μ_{x_v} / σ_{x_v}

In the above formulas, x denotes the index of the stimulus video, i.e. video clip x; x_a is a subject's arousal rating after watching stimulus video x, and x_v is a subject's valence rating after watching stimulus video clip x. μ denotes the rating mean, i.e. the mean of the ratings the several subjects gave video clip x; σ denotes the rating variance, i.e. the variance of those ratings. The normalized arousal score score(x_a) of video clip x is obtained by dividing the clip's rating mean in the arousal dimension, μ_{x_a}, by its rating variance, σ_{x_a}; similarly, the normalized valence score score(x_v) is obtained by dividing the clip's rating mean in the valence dimension, μ_{x_v}, by its rating variance, σ_{x_v};
Step 04: first, map the score(x_a) and score(x_v) obtained in Step 03 into a two-dimensional coordinate system; then divide each of the arousal and valence dimensions non-equidistantly into 5 rectangular frames for framing the video clips corresponding to the different emotion intensities;
Step 05: select 8 video clips from the rectangular frames of Step 04 as the visual stimulus sources.
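Steps 04-05 can be illustrated as a point-in-rectangle selection. The frame bounds and scores below are entirely hypothetical; in the patent the rectangles are determined non-equidistantly from the actual sample distribution.

```python
# Hypothetical sketch of Steps 04-05: map each clip's normalized
# (arousal, valence) scores into 2-D and keep the clips that fall
# inside a chosen rectangular frame.
scores = {                       # clip id -> (score(x_a), score(x_v))
    "v1": (7.2, 6.9), "v2": (3.1, 8.0), "v3": (7.8, 7.4),
    "v4": (2.0, 2.2), "v5": (6.5, 7.1),
}

def in_frame(pt, frame):
    """True if point pt lies inside the rectangle frame."""
    (a_lo, a_hi), (v_lo, v_hi) = frame
    a, v = pt
    return a_lo <= a <= a_hi and v_lo <= v <= v_hi

high_high = ((6.0, 9.0), (6.0, 9.0))   # one hypothetical rectangle
selected = sorted(k for k, pt in scores.items() if in_frame(pt, high_high))
print(selected)   # → ['v1', 'v3', 'v5']
```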
Compared with the prior art, the invention has the beneficial effects that:
the invention relates to an electroencephalogram-based emotion sensing system and method for environmental psychology of teenagers and a stimulation sample selection method. The method comprises the following steps: firstly, selecting teenagers as research objects, and optimally selecting stimulation samples in different environmental scenes; then, synchronously acquiring an electroencephalogram signal induced by a video when a testee watches the stimulation video, and carrying out preprocessing operations such as framing, windowing, removing eye movement artifacts and the like; then, preprocessing all the acquired electroencephalogram signals, and dividing the electroencephalogram signals into two parts, namely training data and testing data; training a Support Vector Machine (SVM) model on two emotion dimensions of wakefulness and valence by using training data respectively; and finally, respectively inputting the characteristic parameters of the test data into the two trained SVM models to realize the perception of the emotional intensity.
The electroencephalogram-based emotion perception system and method for the environmental psychology of teenagers and the stimulus sample selection method have the following three characteristics.
1. The invention has stronger emotion perception capability.
The emotion perception method adopts two independent classification strategies to perceive emotion separately in the arousal and valence dimensions; in theory, the perceived emotion can be any emotion of the arousal-valence two-dimensional model (as shown in figure 2). Meanwhile, each dimension is divided into 5 emotion intensities, and the rectangular selection frames are determined non-equidistantly according to the actual selection of samples and their distribution in the two-dimensional space, making training-sample selection more scientific and reasonable, perceiving the emotional state more accurately, and effectively overcoming the poor ability of existing emotion recognition models to describe rich emotional states. In experiments with 10 subjects in a laboratory environment using the optimized stimulus videos, the recognition accuracies in the arousal and valence dimensions were 70.9% and 64.1% respectively, verifying the effectiveness of the method.
2. The invention has strong object expansion capability.
The invention mainly aims to provide a method for perceiving the emotions of teenagers in different built environments. The teenage population offers good experimental conditions: the subjects are students in school, their cooperation is high, and their physical acuity and stability are good, ensuring the scientific rigor of the experiment and the accuracy of the results, and giving the method capacity for extension to other subject groups. The emotions and health of vulnerable groups such as children, the elderly and patients deserve attention as much as those of other special social groups. In the field of environmental design, empirical studies around concepts such as medical gardens, rehabilitation gardens and healing landscapes emphasize physical, psychological and spiritual recuperation toward overall health, and are mostly based on observation, interviews, questionnaires and the like. In actual investigation, some people cannot accurately describe their subjective emotional feelings or cooperate with written self-evaluation. Through cross-disciplinary cooperation and collaborative design, the emotion perception model established by the invention can be applied to more groups, particularly vulnerable ones, and has large room for extension in the future. The recognition results of the model make possible emotional healing and health restoration tailored to the physical condition and environmental-psychological characteristics of special populations.
3. The invention has stronger application value.
Teenagers are a special group in society; the built environments in which they live and study, and which they like or dislike, affect their emotions differently and hence their health. The emotion perception model, system and method and the stimulus sample selection method established by the invention, applied to research on the rich emotional changes of teenagers in different environments, form a basis for the scientific prevention and effective control of teenage health problems. Test results show that the established model describes the natural emotional states of teenagers well, can truly and objectively reflect their psychological activity, and provides a new method and technical means for empirical research in environmental psychology and environmental design. Meanwhile, the invention can objectively detect the emotional states of different populations in various scenes while avoiding excessive manual intervention, for example the influence and restorative effect of outdoor environments with different degrees of openness, color levels and enclosure forms; it can also be applied to perceiving emotional states in a specific environment, such as the emotional fluctuations of a crowd visiting a sequence of exhibition spaces and experiencing interactive scenes, helping designers and decision makers evaluate an environment after use and judge how its constituent elements, design grammar and spatial atmosphere affect emotional states, thereby supporting evidence-based environmental design and landscape planning.
The electroencephalogram-based emotion perception system and method for the environmental psychology of teenagers and the stimulus sample selection method offer strong emotion perception capability, strong capability for extension to other subject groups and the like, and have high application value in environmental psychology research.
Drawings
FIG. 1 is a schematic diagram of generation and acquisition of an electroencephalogram signal according to the present invention.
Figure 2 is a two-dimensional emotional dimension definition of arousal and valence for the present invention.
Fig. 3 is a basic flow diagram of emotional perception of the present invention.
Fig. 4 is a schematic diagram of the valence and arousal dimensions for the stimulation sample selection method of the present invention.
FIG. 5 is a schematic diagram of a single experiment paradigm of emotion electroencephalogram of the present invention.
Fig. 6 is a self-evaluation software interface used in a single experiment of emotion electroencephalogram.
Fig. 7 is a schematic view of the 32 electrode mounting positions of the present invention.
FIG. 8 is a schematic diagram of the framing of the original EEG signal (EEG preprocessing step) according to the present invention.
The present invention will be further described with reference to the following detailed description and accompanying drawings.
Detailed Description
Referring to figs. 1-8, the electroencephalogram-based emotion perception system for the environmental psychology of teenagers comprises an electroencephalogram signal acquisition module, an electroencephalogram signal preprocessing module, an electroencephalogram signal feature extraction module and an emotion perception module; built environments that teenagers frequently contact, participate in or are keen on serve as the visual stimuli, and the subjects are teenagers;
the electroencephalogram signal acquisition module is used for acquiring original 32-lead electroencephalogram signals of a testee through a recording electrode and a reference electrode and sending the acquired original electroencephalogram signals to the electroencephalogram signal preprocessing module; the original 32-lead electroencephalogram signals are acquired by adopting an international 10-20 system electrode placement method, and each electrode corresponds to 1 electroencephalogram signal. The electrode placement position of the international 10-20 system adopted by the invention is as follows: 30 recording electrodes and 2 reference electrodes were placed, and the arrangement of 32 electrodes was as shown in FIG. 7;
the electroencephalogram signal preprocessing module performs framing, windowing, band-pass filtering and eye-movement artifact removal on the original signals sent by the acquisition module, reducing noise interference and improving the perception performance of the model;
the electroencephalogram signal feature extraction module extracts, from the different frequency bands of the preprocessed signals, the power spectrum, the energy and the power-spectrum difference of symmetric electrode pairs as feature parameters (the feature parameters represent the original signals while reducing data redundancy);
the emotion perception module realizes, through training and recognition with a support vector machine (SVM), perception of the teenager's emotional intensity toward the visual environment in the arousal dimension and the valence dimension respectively. Both dimensions are divided into 5 levels, from level 1 (low) to level 5 (high): for arousal, level 1 is a calm state and level 5 an excited state; for valence, level 1 is unpleasant and level 5 pleasant. Because the stimulus videos show the teenagers' visual environments rather than story films containing plots, extreme emotions are unlikely to arise; the training samples are therefore selected over the 5 levels of both dimensions with a non-equidistant division method to obtain the best perception effect.
An electroencephalogram-based emotion perception method for the environmental psychology of teenagers comprises the following steps:
Step 1: visual stimulus source selection. Built environments that teenagers frequently contact, participate in or are keen on serve as the visual stimuli, and teenagers are the subjects.
Step 2: electroencephalogram signal acquisition. The subject watches the visual stimulus source while the subject's original electroencephalogram signals are collected. The display first shows a blank screen for 5 seconds, then the computer issues a 20 ms "beep" warning tone; 0.5 second later, a stimulus video is displayed at random while the subject's electroencephalogram signals are recorded synchronously at an acquisition rate of 250 Hz; after the video clip finishes, the subject completes the emotion intensity rating by clicking the corresponding options in the self-evaluation software. After the subject rests briefly, the system plays the next video and acquires the corresponding signals, until all videos have been played.
The subject is then replaced, and the steps above, from the 5-second blank screen until all videos have been played, are repeated for the next subject.
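The single-trial timing can be sketched from the durations above; the 55-second clip length follows the stimulus selection method, and everything else is taken directly from the text.

```python
# Timeline sketch of one trial of the emotion-EEG paradigm (Step 2).
FS = 250                               # Hz, acquisition rate
trial = [("blank screen", 5.0),
         ("beep", 0.020),
         ("pause", 0.5),
         ("stimulus video", 55.0)]     # EEG recorded during the video

total = sum(d for _, d in trial)       # trial length before self-rating
samples = int(55.0 * FS)               # EEG samples captured per clip
print(round(total, 2), samples)        # → 60.52 13750
```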
Step 3: electroencephalogram signal preprocessing. Framing, windowing, band-pass filtering and eye-movement artifact removal are applied to the original signals.
and 4, step 4: extracting electroencephalogram signal features; dividing the preprocessed electroencephalogram signals into training data and testing data; extracting a power spectrum, energy and a power spectrum difference value of the symmetrical electrode as characteristic parameters for training data and test data; and respectively extracting power spectrum and energy from the training data and the test data on 6 typical electroencephalogram signal frequency bands and 32 electrodes, and taking 444-dimension data which are the difference values of the power spectrum of 5 typical electroencephalogram signal frequency bands and 12 pairs of symmetrical electrodes as characteristic parameters.
Step 5: model training. An arousal SVM sub-model and a valence SVM sub-model are trained with the feature parameters and the emotion intensity labels of the training data.
Step 6: emotion intensity determination. The feature parameters of the test data are input into the two trained SVM sub-models from Step 5, realizing perception of emotion intensity in the arousal and valence dimensions.
In Step 3, the frame length is 2 seconds with a frame shift of 1 second, and a Hamming window is applied to the original electroencephalogram signal; the pass band of the band-pass filter is set to 0.1-45 Hz; eye-movement artifacts are removed by independent component analysis. The removal proceeds as follows: first, independent component analysis (ICA) decomposes the original 32-lead signals into statistically independent "sources"; then the eye-movement-related independent "sources" are identified by computing the kurtosis of all output channels; finally, the output channels corresponding to the eye-movement artifacts are set to zero and the observed signals are reconstructed, thereby removing the artifacts.
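The decompose / identify-by-kurtosis / zero-and-reconstruct procedure can be sketched as follows. FastICA is used here as the ICA step only because the patent does not name a specific algorithm, and the data are synthetic: a spiky "blink-like" component with high kurtosis is mixed into every channel.

```python
# Kurtosis-based eye-artifact removal sketch on synthetic data.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_ch, n_t = 8, 2000                       # 8 channels stand in for 32 leads
clean = rng.normal(size=(n_ch, n_t))
blink = np.zeros(n_t)
blink[::200] = 25.0                       # sparse spikes -> high kurtosis
mixing = rng.normal(size=(n_ch, 1))
eeg = clean + mixing @ blink[None, :]     # blink bleeds into every channel

ica = FastICA(n_components=n_ch, random_state=0)
sources = ica.fit_transform(eeg.T)        # (n_t, n_ch) independent sources

k = kurtosis(sources, axis=0)             # spiky source typically has max kurtosis
artifact = int(np.argmax(k))
sources[:, artifact] = 0.0                # zero the artifact source ...
restored = ica.inverse_transform(sources).T   # ... and reconstruct

print(restored.shape)   # → (8, 2000)
```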
In step 4, the characteristic parameters comprise the power spectrum and energy of 6 frequency bands on 32 electrodes, plus the power spectrum differences of 5 frequency bands (all except the slow Alpha band) on 12 pairs of electrodes. The 6 frequency bands are Delta (0.5-4 Hz), Theta (4-8 Hz), Alpha (8-12 Hz), slow Alpha (8-10 Hz), Beta (12-30 Hz) and Gamma (30-40 Hz). As supplementary characteristic parameters, the power spectrum differences over the 5 frequency bands other than slow Alpha (8-10 Hz) are extracted on the 12 electrode pairs FP1-FP2, F7-F8, F3-F4, FT7-FT8, FC3-FC4, T3-T4, C3-C4, TP7-TP8, CP3-CP4, T5-T6, P3-P4 and O1-O2; the distribution of the 12 electrode pairs is shown in figures 1 and 7. The characteristic parameters total 444 dimensions: power spectrum, 32 leads × 6 frequency bands = 192 dimensions; energy, 32 leads × 6 frequency bands = 192 dimensions; power spectrum differences, 12 electrode pairs × 5 frequency bands = 60 dimensions; altogether 192 + 192 + 60 = 444 dimensions.
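A sketch of assembling the 444-dimensional vector under the accounting above (192 power-spectrum + 192 energy + 60 difference dimensions). The Welch PSD estimate, the 250 Hz rate, the channel indices standing in for the 12 symmetric pairs, and the band-energy proxy are all assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

FS = 250                                   # acquisition rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12),
         "slow_alpha": (8, 10), "beta": (12, 30), "gamma": (30, 40)}
# Hypothetical channel indices standing in for FP1-FP2, F7-F8, ... O1-O2.
SYM_PAIRS = [(2 * i, 2 * i + 1) for i in range(12)]

def frame_features(frame):
    """frame: (32, n_samples) EEG frame -> 444-dimensional feature vector."""
    power = np.empty((32, len(BANDS)))
    for c, ch in enumerate(frame):
        f, psd = welch(ch, fs=FS, nperseg=256)
        for b, (lo, hi) in enumerate(BANDS.values()):
            power[c, b] = psd[(f >= lo) & (f < hi)].sum()   # band power spectrum
    energy = power * frame.shape[1]        # crude per-band energy proxy
    keep = [b for b, name in enumerate(BANDS) if name != "slow_alpha"]
    diffs = [power[l, b] - power[r, b] for l, r in SYM_PAIRS for b in keep]
    # 32*6 + 32*6 + 12*5 = 192 + 192 + 60 = 444 dimensions
    return np.concatenate([power.ravel(), energy.ravel(), diffs])

feats = frame_features(np.random.default_rng(0).normal(size=(32, 500)))
```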
The stimulus source selection method in the process of establishing the electroencephalogram-based juvenile emotional perception model comprises the following steps,
step 01: manually extracting 55-second highlight videos, namely video clips most likely to induce emotion, from all original stimulation videos according to subjective feelings of a testee;
step 02: ensuring that at least 20 testees participate in the experiment, each testee watching as many of the highlight video clips from step 01 as possible and, after watching each video, rating it on a 5-level scale in each of the two dimensions of arousal and valence;
step 03: according to the scores from step 02, respectively calculating each video clip's normalized score in the arousal dimension, score(x_a), and normalized score in the valence dimension, score(x_v), as follows:

score(x_a) = μ_xa / σ_xa,    score(x_v) = μ_xv / σ_xv

In the above formulas, x denotes the index of the stimulation video, i.e. video clip x; x_a is a testee's arousal rating after watching stimulation video x, and x_v is a testee's valence rating after watching stimulation video clip x. μ denotes the score mean, i.e. the mean of the scores that multiple testees gave video clip x; σ denotes the score variance, i.e. the variance of the scores that multiple testees gave video clip x. The normalized score of video clip x in the arousal dimension, score(x_a), is obtained by dividing the mean μ_xa of the clip's scores in the arousal dimension by the score variance σ_xa; similarly, the clip's normalized score in the valence dimension, score(x_v), is obtained by dividing the mean μ_xv of its scores in the valence dimension by the score variance σ_xv;
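A numeric sketch of this normalization, dividing the mean of the testees' ratings for a clip by their variance; the ratings below are invented.

```python
import numpy as np

# Per-video 5-level ratings from multiple testees (invented values).
arousal_ratings = {
    "video_1": [4, 5, 4, 3, 5],
    "video_2": [2, 1, 2, 3, 2],
}

def normalized_score(ratings):
    r = np.asarray(ratings, dtype=float)
    return r.mean() / r.var()       # score(x_a) = mu_xa / sigma_xa

scores = {v: normalized_score(r) for v, r in arousal_ratings.items()}
# video_1: mean 4.2, variance 0.56 -> score 7.5
```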
step 04: first, mapping the score(x_a) and score(x_v) values obtained in step 03 into a two-dimensional coordinate system; then, dividing each of the two dimensions of arousal and valence into 5 rectangular frames at non-equal intervals, to frame the video clips corresponding to different emotional intensities. The rectangular frames are set as shown in FIG. 4. In the vertical-axis arousal dimension: the level 3 (neutral) rectangular frame takes the horizontal line through 0 as its center line, and the 8 videos closest to that line are selected as stimulus sources; according to the distribution of the videos, the level 1 (calm) and level 5 (excited) frames are the selection frames closest to the horizontal lines through -2.5 and +2.5, respectively, that cover 8 videos; the level 2 and level 4 rectangular frames take the horizontal lines through -0.7 and +0.7 as their center lines, respectively, and the 8 videos closest to each line are selected as stimulus sources;
in the horizontal-axis valence dimension: the level 3 (neutral) rectangular frame takes the vertical line through 0 as its center line, and the 8 videos closest to that line are selected as stimulus sources; according to the distribution of the videos, the level 1 (unpleasant) and level 5 (pleasant) frames are the selection frames closest to the vertical lines through -1.5 and +3, respectively, that cover 8 videos; the level 2 and level 4 rectangular frames take the vertical lines through -0.4 and +1 as their center lines, respectively, and the 8 videos closest to each line are selected as stimulus sources;
Step 05: selecting 8 video clips from each rectangular frame of step 04 as visual stimulus sources, i.e. the 8 video clips in a frame that meet the above conditions are taken as the preferred stimulation videos.
FIG. 1 is a schematic diagram of electroencephalogram signal generation. When a testee watches a stimulus scene, the accumulated activity of a large number of nerve cells generates electric potentials on the surface of the cerebral hemispheres; these potentials can be acquired through a plurality of bioelectrodes placed over the cerebral cortex to form an electroencephalogram.
Referring to fig. 2, the definitions of the two dimensions of arousal and valence in this embodiment are illustrated. The horizontal axis is emotional valence, related to the degree of preference; this dimension runs from unpleasant to pleasant and measures the positive or negative character of a person's emotional state. The vertical axis is emotional arousal, related to the degree of excitement; it runs from calm to excited and measures the degree of emotional activation. Subtle, complex and extreme human emotional states (e.g., sadness, satisfaction, anger) can all be located in this two-dimensional model.
Referring to fig. 3, the basic flow of emotion perception in this embodiment is explained. First, with teenagers as the research subjects, stimulus source samples in different environmental scenes are optimally selected; while a testee watches a stimulation video, the electroencephalogram signals induced by the video are synchronously collected, and preprocessing operations such as framing, windowing and eye movement artifact removal are performed. Second, after all of the collected emotional electroencephalogram data are preprocessed, they are divided into two parts, training and testing, and the power spectrum, the energy and the power spectrum differences of symmetric electrodes are extracted over 6 typical electroencephalogram frequency bands and 32 electrodes, giving 444-dimensional characteristic parameters. Next, an arousal SVM sub-model and a valence SVM sub-model are respectively trained with the characteristic parameters and the corresponding emotional intensity labels of the training data. Finally, the characteristic parameters of the test data are input into the two trained SVM sub-models respectively, realizing perception of the emotional intensity in the two dimensions of arousal and valence.
Referring to fig. 4, the stimulation sample selection method in this embodiment is illustrated. The dots in the figure are the scores corresponding to 100 videos; it can be seen that the overall distribution of the points is not uniform. Extreme emotional states are rarely produced: for example, there are few sample points near the junction of low valence and high arousal (the emotion corresponding to this region is anger), because the stimulus source videos mostly depict the built environments in which teenagers live rather than storylines, and such material does not induce that emotion. Given this distribution, and in order to improve the performance of the emotion perception model, each of the two dimensions is divided into 5 typical intervals from which representative training samples are selected. The rectangular frame arrangement is shown in fig. 4.
In the vertical-axis arousal dimension: the level 3 (neutral) rectangular frame takes the horizontal line through 0 as its center line, and the 8 videos closest to that line are selected as stimulus sources. According to the distribution of the videos, the level 1 (calm) and level 5 (excited) frames are the rectangular frames closest to the horizontal lines through -2.5 and +2.5, respectively, that cover 8 videos; the bottom edge of the level 1 rectangular frame is the horizontal line at -2.5, and the top edge of the level 5 rectangular frame is the horizontal line at +2.5. The level 2 and level 4 rectangular frames take the horizontal lines through -0.7 and +0.7 as their center lines, respectively, and the 8 videos closest to each line are selected as stimulus sources.
In the horizontal-axis valence dimension: the level 3 (neutral) rectangular frame takes the vertical line through 0 as its center line, and the 8 videos closest to that line are selected as stimulus sources. According to the distribution of the videos, the level 1 (unpleasant) and level 5 (pleasant) frames are the selection frames closest to the vertical lines through -1.5 and +3, respectively, that cover 8 videos; the left edge of the level 1 rectangular frame is the vertical line at -1.5, and the right edge of the level 5 rectangular frame is the vertical line at +3. The level 2 and level 4 rectangular frames take the vertical lines through -0.4 and +1 as their center lines, respectively, and the 8 videos closest to each line are selected as stimulus sources.
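Selecting the 8 videos nearest a level's center line in one dimension can be sketched as below; the score distribution and center-line values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
scores = rng.normal(0.0, 1.5, size=100)   # e.g. arousal scores of 100 videos

def nearest_to_center(scores, center, k=8):
    """Indices of the k videos whose score is closest to the center line."""
    return np.argsort(np.abs(scores - center))[:k]

level3 = nearest_to_center(scores, 0.0)   # neutral: center line at 0
level4 = nearest_to_center(scores, 0.7)   # center line at +0.7
```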
referring to fig. 5, a specific process of the emotion perception single experiment paradigm in this embodiment is illustrated. First, the display first appears as a 5 second blank screen, and then the computer issues a 20ms warning tone ("beep"); after 0.5 second, the stimulation video is randomly displayed, at the moment, the electroencephalogram signals of the testee are synchronously recorded, and the acquisition rate is set to be 250 Hz; after the video clip is played, the testee clicks the corresponding option on the self-evaluation software to complete the emotion intensity evaluation table. After the above steps are completed, the subject slightly relaxes and continues the next set of tests. In order to guarantee the stimulation effect, a neutral video with the color band as the main tone is played between every two stimulation videos. The stimulation effect can be more obvious by playing the neutral video.
Referring to fig. 6, the self-evaluation software interface in this embodiment is illustrated. The software's main functions are video playback and emotional intensity labeling; it replaces the traditional paper questionnaire and is efficient, intelligent and simple to use. During the experiment, after watching a stimulation video segment, the testee uses the mouse to click, in each dimension, the radio button that matches his or her feeling.
Referring to fig. 7, the placement positions of the 32 electrodes in this embodiment are explained. The international 10-20 system electrode placement method is adopted. Specifically: based on the midline formed by connecting the nasion to the inion (occipital protuberance), the frontopolar points (FP1, FP2), frontal points (FC3, FC4), central points (C3, C4), parietal points (CP3, CP4) and occipital points (O1, O2) are determined at positions equidistant to the left and right of this line. The frontopolar points lie above the nasion at 10% of the nasion-inion distance; the frontal points lie behind them at twice that distance from the nasion, i.e. a further 20% along the midline; and the subsequent points toward the central, parietal and occipital regions are each spaced at 20% intervals.
Referring to fig. 8, the framing method in this embodiment is explained. Let the n-th frame signal after framing be x_n(m); the windowing process can then be represented as

    x~_n(m) = x_n(m) · ω(m), 0 ≤ m ≤ N-1

where x~_n(m) denotes the windowed signal and ω(m) denotes the window function. In this embodiment a Hamming window is used, defined as

    ω(m) = 0.54 - 0.46 cos(2πm / (N-1)), 0 ≤ m ≤ N-1

where N denotes the frame length and M denotes the frame shift, with M = (1/2)N; M and N are positive integers, and N is taken as an integer multiple of 128 as needed, e.g., 128, 256 or 512.
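A sketch of the framing-plus-windowing step with N = 512 and frame shift M = N/2 = 256 on a synthetic signal; NumPy's `np.hamming` implements the window definition above.

```python
import numpy as np

N, M = 512, 256                         # frame length, frame shift M = N/2
signal = np.random.default_rng(0).normal(size=2048)   # synthetic 1-D signal

window = np.hamming(N)                  # 0.54 - 0.46*cos(2*pi*m/(N-1))
frames = np.array([signal[s:s + N] * window           # x~_n(m) = x_n(m)*w(m)
                   for s in range(0, len(signal) - N + 1, M)])
```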
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from its spirit or essential attributes. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, although the description is organized into embodiments, not every described feature is confined to a single embodiment; this presentation is for clarity only, and those skilled in the art should take the description as a whole, combining the embodiments as appropriate to form further embodiments that they would understand.

Claims (5)

1. The electroencephalogram-based emotion sensing system for the environmental psychology of teenagers is characterized by comprising an electroencephalogram signal acquisition module, an electroencephalogram signal preprocessing module, an electroencephalogram signal feature extraction module and an emotion sensing module; built environments that teenagers frequently contact, participate in, or are keen on are taken as the visual stimuli, the testees are teenagers, and the stimulus selection comprises the following steps:
step 01: manually extracting a 55-second highlight video from all original stimulation videos according to subjective feelings of a testee;
step 02: ensuring that at least 20 testees participate in the experiment, each testee watching as many of the highlight video clips from step 01 as possible and, after watching each video, rating it on a 5-level scale in each of the two dimensions of arousal and valence;
step 03: according to the scores from step 02, respectively calculating each video clip's normalized score in the arousal dimension, score(x_a), and normalized score in the valence dimension, score(x_v), as follows:

score(x_a) = μ_xa / σ_xa,    score(x_v) = μ_xv / σ_xv

in the above formulas, x denotes the index of the stimulation video, i.e. video clip x; x_a is a testee's arousal rating after watching stimulation video x, and x_v is a testee's valence rating after watching stimulation video clip x; μ denotes the score mean, i.e. the mean of the scores that multiple testees gave video clip x; σ denotes the score variance, i.e. the variance of the scores that multiple testees gave video clip x; the normalized score of video clip x in the arousal dimension, score(x_a), is obtained by dividing the mean μ_xa of the clip's scores in the arousal dimension by the score variance σ_xa; similarly, the clip's normalized score in the valence dimension, score(x_v), is obtained by dividing the mean μ_xv of its scores in the valence dimension by the score variance σ_xv;
step 04: first, mapping the score(x_a) and score(x_v) values obtained in step 03 into a two-dimensional coordinate system; then, dividing each of the two dimensions of arousal and valence into 5 rectangular frames at non-equal intervals, to frame the video clips corresponding to different emotional intensities;
step 05: selecting 8 video clips from the rectangular frame in the step 04 as visual stimulus sources;
the electroencephalogram signal acquisition module is used for acquiring original 32-lead electroencephalogram signals of a testee through a recording electrode and a reference electrode and sending the acquired original electroencephalogram signals to the electroencephalogram signal preprocessing module;
the electroencephalogram signal preprocessing module is used for performing framing, windowing, band-pass filtering and eye movement artifact removing operations on the original electroencephalogram signals sent by the electroencephalogram signal acquisition module;
the electroencephalogram signal feature extraction module is used for extracting a power spectrum, energy and a power spectrum difference value of the symmetrical electrodes from different electroencephalogram frequency bands of the preprocessed electroencephalogram signals respectively to serve as feature parameters;
the emotion perception module is used for respectively realizing the perception of the emotion intensity of the teenager on the visual environment in the arousal degree dimension and the valence dimension through training and recognition of a Support Vector Machine (SVM).
2. An electroencephalogram-based emotional perception method of the environmental psychology of teenagers is characterized by comprising the following steps:
step 1: selecting a visual stimulus source; built environments that teenagers frequently contact, participate in, or are keen on are taken as the visual stimuli, and teenagers are taken as the testees; the visual stimulus selection comprises the following substeps:
step 01: manually extracting a 55-second highlight video from all original stimulation videos according to subjective feelings of a testee;
step 02: ensuring that at least 20 testees participate in the experiment, each testee watching as many of the highlight video clips from step 01 as possible and, after watching each video, rating it on a 5-level scale in each of the two dimensions of arousal and valence;
step 03: according to the scores from step 02, respectively calculating each video clip's normalized score in the arousal dimension, score(x_a), and normalized score in the valence dimension, score(x_v), as follows:

score(x_a) = μ_xa / σ_xa,    score(x_v) = μ_xv / σ_xv

in the above formulas, x denotes the index of the stimulation video, i.e. video clip x; x_a is a testee's arousal rating after watching stimulation video x, and x_v is a testee's valence rating after watching stimulation video clip x; μ denotes the score mean, i.e. the mean of the scores that multiple testees gave video clip x; σ denotes the score variance, i.e. the variance of the scores that multiple testees gave video clip x; the normalized score of video clip x in the arousal dimension, score(x_a), is obtained by dividing the mean μ_xa of the clip's scores in the arousal dimension by the score variance σ_xa; similarly, the clip's normalized score in the valence dimension, score(x_v), is obtained by dividing the mean μ_xv of its scores in the valence dimension by the score variance σ_xv;
step 04: first, mapping the score(x_a) and score(x_v) values obtained in step 03 into a two-dimensional coordinate system; then, dividing each of the two dimensions of arousal and valence into 5 rectangular frames at non-equal intervals, to frame the video clips corresponding to different emotional intensities;
step 05: selecting 8 video clips from the rectangular frame in the step 04 as visual stimulus sources;
step 2: acquiring an electroencephalogram signal; a testee watches the visual stimulus source, and then the original electroencephalogram signal of the testee is collected;
and step 3: preprocessing an electroencephalogram signal; performing framing, windowing, band-pass filtering and eye movement artifact removing operations on the original electroencephalogram signals;
and 4, step 4: extracting electroencephalogram signal features; dividing the preprocessed electroencephalogram signals into training data and testing data; extracting a power spectrum, energy and a power spectrum difference value of the symmetrical electrode as characteristic parameters for training data and test data;
step 5: training the model; respectively training an arousal SVM sub-model and a valence SVM sub-model with the characteristic parameters and the corresponding emotional intensity labels of the training data;
step 6: determining emotional intensity; respectively inputting the characteristic parameters of the test data into the trained arousal SVM sub-model and valence SVM sub-model from step 5, thereby realizing perception of the emotional intensity in the two dimensions of arousal and valence.
3. The emotion perception method according to claim 2, wherein in said step 3, the frame length is 2 seconds, the frame shift is 1 second, and the original electroencephalogram signal is windowed using a hamming window; the passband frequency of the bandpass filter is set to 0.1-45 Hz; and removing the eye movement artifact by adopting an independent component analysis method.
4. The emotion perception method according to claim 2, wherein in said step 4, said characteristic parameters include power spectrum and energy of 6 frequency bands on 32 electrodes, and power spectrum difference of 5 frequency bands on 12 pairs of electrodes except for slow Alpha wave frequency band.
5. The electroencephalogram-based stimulus source selection method in the process of establishing the teenager emotion perception model is characterized by comprising the following steps of:
step 01: manually extracting a 55-second highlight video from all original stimulation videos according to subjective feelings of a testee;
step 02: ensuring that at least 20 testees participate in the experiment, each testee watching as many of the highlight video clips from step 01 as possible and, after watching each video, rating it on a 5-level scale in each of the two dimensions of arousal and valence;
step 03: according to the scores from step 02, respectively calculating each video clip's normalized score in the arousal dimension, score(x_a), and normalized score in the valence dimension, score(x_v), as follows:

score(x_a) = μ_xa / σ_xa,    score(x_v) = μ_xv / σ_xv

in the above formulas, x denotes the index of the stimulation video, i.e. video clip x; x_a is a testee's arousal rating after watching stimulation video x, and x_v is a testee's valence rating after watching stimulation video clip x; μ denotes the score mean, i.e. the mean of the scores that multiple testees gave video clip x; σ denotes the score variance, i.e. the variance of the scores that multiple testees gave video clip x; the normalized score of video clip x in the arousal dimension, score(x_a), is obtained by dividing the mean μ_xa of the clip's scores in the arousal dimension by the score variance σ_xa; similarly, the clip's normalized score in the valence dimension, score(x_v), is obtained by dividing the mean μ_xv of its scores in the valence dimension by the score variance σ_xv;
step 04: first, mapping the score(x_a) and score(x_v) values obtained in step 03 into a two-dimensional coordinate system; then, dividing each of the two dimensions of arousal and valence into 5 rectangular frames at non-equal intervals, to frame the video clips corresponding to different emotional intensities;
step 05: and 8 video clips are selected from the rectangular frame in the step 04 as visual stimulus sources.
CN201710252127.6A 2017-04-18 2017-04-18 Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers Active CN107080546B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710252127.6A CN107080546B (en) 2017-04-18 2017-04-18 Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710252127.6A CN107080546B (en) 2017-04-18 2017-04-18 Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers

Publications (2)

Publication Number Publication Date
CN107080546A CN107080546A (en) 2017-08-22
CN107080546B true CN107080546B (en) 2020-08-21

Family

ID=59611578

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710252127.6A Active CN107080546B (en) 2017-04-18 2017-04-18 Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers

Country Status (1)

Country Link
CN (1) CN107080546B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2743876C1 (en) * 2020-05-19 2021-03-01 Татьяна Евгеньевна Ефремова Method for rehabilitation of children and adolescents with behavioral and emotional disorders suffering from mental disorders

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108143412B (en) * 2017-12-22 2020-09-04 苏州创捷传媒展览股份有限公司 Control method, device and system for electroencephalogram emotion analysis of children
CN108294739B (en) * 2017-12-27 2021-02-09 苏州创捷传媒展览股份有限公司 Method and device for testing user experience
CN110310722A (en) * 2018-03-27 2019-10-08 中育苑(北京)文化传媒股份有限公司 Mental measurement and leading method and system based on image information
CN108937968B (en) * 2018-06-04 2021-11-19 安徽大学 Lead selection method of emotion electroencephalogram signal based on independent component analysis
CN108881985A (en) * 2018-07-18 2018-11-23 南京邮电大学 Program points-scoring system based on brain electricity Emotion identification
CN109190658A (en) * 2018-07-19 2019-01-11 中国电子科技集团公司电子科学研究院 Video degree of awakening classification method, device and computer equipment
CN109276243A (en) * 2018-08-31 2019-01-29 易念科技(深圳)有限公司 Brain electricity psychological test method and terminal device
KR20210055060A (en) * 2018-09-04 2021-05-14 존슨 앤드 존슨 컨수머 인코포레이티드 Apparatus and method for evaluating the emotions of infants and toddlers
CN109157231B (en) * 2018-10-24 2021-04-16 阿呆科技(北京)有限公司 Portable multichannel depression tendency evaluation system based on emotional stimulation task
CN111104815A (en) * 2018-10-25 2020-05-05 北京入思技术有限公司 Psychological assessment method and device based on emotion energy perception
CN109567830B (en) * 2018-10-30 2021-03-02 清华大学 Personality measuring method and system based on neural response
CN109620260A (en) * 2018-12-05 2019-04-16 广州杰赛科技股份有限公司 Psychological condition recognition methods, equipment and storage medium
CN109697413B (en) * 2018-12-13 2021-04-06 合肥工业大学 Personality analysis method, system and storage medium based on head gesture
CN111413874B (en) * 2019-01-08 2021-02-26 北京京东尚科信息技术有限公司 Method, device and system for controlling intelligent equipment
JP6709966B1 (en) * 2019-03-29 2020-06-17 パナソニックIpマネジメント株式会社 Mental state estimation system, mental state estimation method, program, estimation model generation method
CN110141258A (en) * 2019-05-16 2019-08-20 深兰科技(上海)有限公司 A kind of emotional state detection method, equipment and terminal
CN111466931A (en) * 2020-04-24 2020-07-31 云南大学 Emotion recognition method based on EEG and food picture data set
CN112401886B (en) * 2020-10-22 2023-01-31 北京大学 Processing method, device and equipment for emotion recognition and storage medium
CN113576478A (en) * 2021-04-23 2021-11-02 西安交通大学 Electroencephalogram signal-based image emotion classification method, system and device
CN113349778B (en) * 2021-06-03 2023-02-17 杭州回车电子科技有限公司 Emotion analysis method and device based on transcranial direct current stimulation and electronic device
CN114027840A (en) * 2021-11-12 2022-02-11 江苏科技大学 Emotional electroencephalogram recognition method based on variational modal decomposition
CN114638263A (en) * 2022-03-15 2022-06-17 华南理工大学 Building space satisfaction evaluation method based on electroencephalogram signals
CN114870195B (en) * 2022-05-09 2023-03-21 北京航空航天大学 Method for regulating emotion of personnel by using vegetable green wall

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103690165B (en) * 2013-12-12 2015-04-29 天津大学 Modeling method for cross-inducing-mode emotion electroencephalogram recognition
CN105894039A (en) * 2016-04-25 2016-08-24 京东方科技集团股份有限公司 Emotion recognition modeling method, emotion recognition method and apparatus, and intelligent device
CN106267514B (en) * 2016-10-19 2019-07-23 上海大学 Feeling control system based on brain electricity feedback


Also Published As

Publication number Publication date
CN107080546A (en) 2017-08-22

Similar Documents

Publication Publication Date Title
CN107080546B (en) Electroencephalogram-based emotion perception and stimulus sample selection method for environmental psychology of teenagers
Chiang et al. Wild or tended nature? The effects of landscape location and vegetation density on physiological and psychological responses
CN109298779B (en) Virtual training system and method based on virtual agent interaction
CN109224242B (en) Psychological relaxation system and method based on VR interaction
CN106569604B (en) Audiovisual bimodal semantic matches and semantic mismatch collaboration stimulation brain-machine interface method
CN103690165B (en) Modeling method for cross-inducing-mode emotion electroencephalogram recognition
Wan et al. Measuring the impacts of virtual reality games on cognitive ability using EEG signals and game performance data
CN108324292B (en) Indoor visual environment satisfaction degree analysis method based on electroencephalogram signals
US20220039715A1 (en) Realtime evaluation method and system for virtual reality immersion effect
Meng et al. Exploring training effect in 42 human subjects using a non-invasive sensorimotor rhythm based online BCI
CN112545519B (en) Real-time assessment method and system for group emotion homogeneity
CN113425297A (en) Electroencephalogram signal-based children attention concentration training method and system
CN105919556A (en) Near-infrared brain imager map collecting method based on cognitive tasks
Li et al. Neurophysiological and subjective analysis of VR emotion induction paradigm
Su et al. Adolescents environmental emotion perception by integrating EEG and eye movements
CN116211306A (en) Psychological health self-evaluation system based on eye movement and electrocardiosignal
CN112347984A (en) Olfactory stimulus-based EEG (electroencephalogram) acquisition and emotion recognition method and system
Huang Recognition of psychological emotion by EEG features
Rad et al. Cognitive and perceptual influences of architectural and urban environments with an emphasis on the experimental procedures and techniques
CN115659207A (en) Electroencephalogram emotion recognition method and system
Abreu et al. Increased N250 elicited by facial familiarity: an ERP study including the face inversion effect and facial emotion processing
CN113057652A (en) Brain load detection method based on electroencephalogram and deep learning
CN113269084B (en) Movie and television play market prediction method and system based on audience group emotional nerve similarity
Su et al. A spatial filtering approach to environmental emotion perception based on electroencephalography
Hercegfi Improved temporal resolution heart rate variability monitoring—pilot results of non-laboratory experiments targeting future assessment of human-computer interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200720

Address after: 234000 building 7, electronic industrial park, high tech Zone, Suzhou City, Anhui Province

Applicant after: Anhui Zhiqu Angel Information Technology Co., Ltd

Address before: 230601 No. 111 Jiulong Road, Hefei economic and Technological Development Zone, Anhui, China

Applicant before: ANHUI University

GR01 Patent grant
GR01 Patent grant