CN114640699B - Emotion induction monitoring system based on VR role playing game interaction - Google Patents

Emotion induction monitoring system based on VR role playing game interaction

Info

Publication number: CN114640699B (granted from application CN202210146377.2A)
Authority: CN (China)
Prior art keywords: emotion, testee, module, induction, feedback
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN114640699A
Inventors: 李远清, 林璇琨, 瞿军
Assignee (current and original): South China University of Technology (SCUT)

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses an emotion induction monitoring system based on VR role playing game interaction. The system presents scenes through VR and combines the scenario tasks and reward and punishment mechanism of a role playing game to give the testee stronger immersion, prompting the testee to receive the scene's emotion-inducing stimulation more actively and to actively generate the corresponding emotion. Through role dialogue interaction, selectable dialogue serves as the testee's emotion self-assessment feedback, so the testee's subjective feeling can be recorded in real time and the time points of emotion induction can be marked on the emotion signal promptly, which facilitates real-time monitoring of the testee's emotion induction. VR-based visual feedback is also provided, making the feedback effect more salient and prompting the testee to regulate emotion more actively.

Description

Emotion induction monitoring system based on VR role playing game interaction
Technical Field
The invention relates to the technical fields of brain-computer interfaces, emotion induction and VR role playing games, and in particular to an emotion induction monitoring system based on VR role playing game interaction.
Background
Emotion is a psychological and physiological process triggered by conscious or unconscious perception of an object or situation. It is an important component of human psychology and strongly influences human decisions and behaviors, so accurately identifying and scientifically regulating emotion is extremely important. Researchers who study emotion strive to elicit emotional responses in the laboratory; emotion induction is an experimental technique for directly exploring the psychological mechanisms of emotion and is of great significance for emotion research.
Bioelectric phenomena are among the fundamental features of vital activity. The bioelectric signals generated by the brain during thinking are brain waves. All visible (walking, speaking, etc.) and invisible (cell growth, metabolism, etc.) activities of the human body are closely related to the state and function of the brain at all times and cannot occur without the brain's control. Because electroencephalogram (EEG) signals correspond to changes in a person's emotion, the emotional state can be judged from the EEG. Compared with monitoring emotion through self-report (emotion self-assessment, psychological interviews, questionnaires, etc.), voice, facial expression or body posture, EEG signals are difficult to disguise and respond sensitively, so results recognized from the EEG are more objective and authentic.
From a psychological perspective, existing emotion induction means include material induction and situational induction. Material induction can use sentences, pictures, videos, music, etc., while situational induction includes imagination, recall, virtual reality, game tasks and the like. Emotion induced by pictures lasts only briefly, while the effect of videos and music is hard to guarantee: testees differ in their susceptibility to each induction mode, individual differences are large, different people may react in opposite ways to the same stimulus, and emotion monitoring suffers from time deviations.
Virtual Reality (VR) technology uses related devices to simulate and generate a 3D virtual world, providing the user with audio-visual, tactile and other sensory simulation so that the user feels present in the scene. A user may interact with the virtual environment through a headset, handle controllers and the like. VR technology is already widely applied in many fields (education, games, medical treatment, etc.) and has attractive development prospects.
The known emotion induction methods based on brain-computer interfaces all rely on material induction. Even the method of application number 201811030702.9, which mentions VR, is limited to inducing emotion with VR video and music; it is still material induction and does not actually realize "interaction". Its effect remains that of a testee passively receiving the stimulation of emotional material, which is still far from the induction effect achieved when the testee actively engages with changing situations through imagination, recall or virtual reality.
Disclosure of Invention
The invention aims to overcome the defects and shortcomings of the prior art, and provides an emotion induction monitoring system based on VR role playing game interaction that induces emotion in the testee more actively and effectively.
In order to achieve the above purpose, the technical scheme provided by the invention is as follows: the emotion induction monitoring system based on VR role playing game interaction comprises an emotion induction module, a signal acquisition module, an emotion recognition module and an emotion feedback module;
the emotion induction module is used for presenting virtual reality scenes and simulated situations and inducing the testee's emotion through a VR role playing game;
the signal acquisition module is used for acquiring and recording the testee's emotion signals; the acquired emotion signals are divided into training data and online test data, wherein the training data are stored in a computer to be read by the emotion recognition module, and the online test data are transmitted to the emotion recognition module in real time;
the emotion recognition module is used for processing the emotion signals acquired by the signal acquisition module: it reads the stored training data and receives the online test data in real time, preprocesses each and extracts features to obtain the corresponding feature vectors; the feature vectors obtained from the training data are used to construct an emotion classifier model and a feedback criterion, while the feature vectors obtained from the online test data are passed through the classifier model and feedback criterion to obtain processing results, thereby realizing monitoring and recognition of the testee's emotion for online emotion feedback;
the emotion feedback module is used for presenting the results processed by the emotion recognition module, i.e., feeding back the testee's current emotional state online to the experimenter and the testee, which lets the experimenter observe the emotion induction effect and also helps the testee understand and adjust his or her own emotional state.
Further, the emotion induction module comprises VR equipment for presenting the virtual reality scene, which communicates over TCP with the computer carrying the emotion recognition module. The emotion induction module further comprises a VR role playing game for inducing the testee's emotion. The VR role playing game contains game scenarios corresponding to different emotions, and the game is provided with a task system and a reward and punishment mechanism that motivate the testee to concentrate on the game content and actively accept the system's emotion induction. The game advances the plot through narration, role dialogue, CG animation and character actions, combined with game tasks and rewards and punishments, to prompt the emotion to be induced in the testee; the scenario animation and sound effects can vary with the plot and with how long the testee remains in a given state. Role dialogue comprises fixed dialogue that advances the plot and selectable dialogue that expresses the testee's subjective emotion. Each scenario has several time nodes that trigger selectable dialogue about the current emotion; the selectable dialogue is presented as multiple-choice questions and the testee answers with the handle. The chosen options serve both as the testee's emotional state self-report and as a survey of satisfaction with the current emotion induction scene, and the time nodes are synchronously transmitted to the emotion recognition module as references for emotion recognition. After each template presentation ends, the testee is given a suitable rest. A template is a VR role playing game presented via the VR device and includes one or more of the elements scenario, scene, narration, role dialogue, animation, character action, sound effect, task system and reward and punishment mechanism.
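The selectable-dialogue mechanism above amounts to logging each chosen option together with its scenario time node so that the self-report can later be aligned with the EEG record. A minimal sketch of such a log is shown below; the class name, field names and option values are illustrative assumptions, not taken from the patent.

```python
import time

class DialogueLog:
    """Collects selectable-dialogue choices as timestamped emotion self-reports.

    Hypothetical sketch: the structure and option values are illustrative;
    the patent only specifies that chosen options and time nodes are recorded
    and passed to the emotion recognition module as references."""

    def __init__(self):
        self.events = []

    def record_choice(self, trial, node, option, timestamp=None):
        # Each selectable dialogue acts as a self-report of the current emotion;
        # the time node doubles as a marker for the EEG recording.
        event = {
            "trial": trial,            # trial index
            "node": node,              # scenario time node that triggered the dialogue
            "option": option,          # chosen answer, e.g. "happy" / "neutral"
            "t": time.time() if timestamp is None else timestamp,
        }
        self.events.append(event)
        return event

log = DialogueLog()
log.record_choice(trial=1, node="scene2_dialogue1", option="happy", timestamp=12.5)
```

Such a log would let the recognition module look up, for any EEG segment, the most recent self-reported emotion.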
Further, the signal acquisition module acquires the testee's EEG emotion signals and is connected to the computer carrying the emotion recognition module through a USB data cable and a parallel-port cable. The signal acquisition module records scalp EEG signals with an EEG amplifier and a 32-channel EEG electrode cap and transmits them to the emotion recognition module. The EEG signals of all channels are referenced to the right earlobe; during acquisition the impedance of every electrode is kept below 5 kΩ, and the EEG is sampled at 250 Hz and band-pass filtered between 0.1 and 70 Hz. During signal acquisition the testee sits in a comfortable posture, the EEG electrode cap is fitted, electrode gel is injected into the cap's embedded electrodes with a blunt-tipped syringe, and the VR equipment is then put on. Two segments of EEG data are collected from each testee: one for training data and one for online test data.
Further, after the training data acquisition stage ends, the emotion recognition module reads the training data recorded in the computer and obtains the corresponding feature vectors through filtering preprocessing and feature extraction. An emotion classifier is then constructed with a linear-kernel Support Vector Machine (SVM): the emotion feature data corresponding to each emotional state in training form a training set, and the feature vectors of each emotional state are fed into the classifier for training, yielding an emotion classifier model. The feature vectors of each emotional state in the training set are then fed back into the trained classifier model, which outputs a reference baseline value for each emotional state; an emotion feedback criterion is constructed from these baselines to score the testee's emotional states during the online test stage, making it convenient for the experimenter to monitor the emotion induction effect and helping the testee effectively understand and adjust his or her emotion.
Further, in the online test data acquisition stage, the emotion recognition module performs emotion classification with the trained classifier model, outputs a result representing the emotion intensity, i.e., a score, based on the reference baseline, and transmits it to the emotion feedback module.
Further, the emotion feedback module uses visual feedback and forms part of the game scene. The feedback content is presented on the right side of the VR scene, where a cartoon character and a colored halo respectively display the testee's current emotion type and intensity. The cartoon character has different expressions and actions for different emotion types, with a static initial default state; the colored halo surrounds the cartoon character, is initially equal in size to the character, and enlarges as the degree of emotion induction increases. During the online test task, each time online test data is received and processed, the emotion recognition module gives the corresponding emotion classification result and score and transmits the corresponding feedback criterion and reference baseline to the emotion feedback module, which displays the corresponding emotion category and induction degree. The testee and the experimenter can read the testee's current emotional state from the changes of the cartoon character and the colored halo and thereby judge the emotion induction effect.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The invention provides the testee with a more active and effective emotion induction monitoring system: scenes are presented through VR, and the scenario tasks and reward and punishment mechanism of a role playing game give the testee stronger immersion, prompting the testee to receive the scene's emotion-inducing stimulation more actively and to actively generate the corresponding emotion.
2. Through role dialogue interaction, the system uses selectable dialogue as the testee's emotion self-assessment feedback, so the testee's subjective feeling can be recorded in real time and the time points of emotion induction can promptly mark the emotion signal, which facilitates real-time monitoring of the testee's emotion induction.
3. The system provides VR-based visual feedback, which makes the feedback effect more salient and prompts the testee to regulate emotion more actively.
Drawings
Fig. 1 is a block diagram of an emotion induction monitoring system.
Fig. 2 is a flow chart of an implementation of the emotion induction monitoring system.
Fig. 3 is a flow chart of an online emotion-induction monitoring single trial.
Fig. 4 is a presentation schematic of the emotion feedback module.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but embodiments of the present invention are not limited thereto.
Referring to fig. 1, the emotion induction monitoring system based on VR role playing game interaction provided in this embodiment includes an emotion induction module, a signal acquisition module, an emotion recognition module, and an emotion feedback module;
the emotion induction module is used for presenting virtual reality scenes and simulated situations and inducing the testee's emotion through a VR role playing game;
the signal acquisition module is used for acquiring and recording the testee's emotion signals; the acquired emotion signals are divided into training data and online test data, wherein the training data are stored in a computer to be read by the emotion recognition module, and the online test data are transmitted to the emotion recognition module in real time;
the emotion recognition module is used for processing the emotion signals acquired by the signal acquisition module: it reads the stored training data and receives the online test data in real time, preprocesses each and extracts features to obtain the corresponding feature vectors; the feature vectors obtained from the training data are used to construct an emotion classifier model and a feedback criterion, while the feature vectors obtained from the online test data are passed through the classifier model and feedback criterion to obtain processing results, thereby realizing monitoring and recognition of the testee's emotion for online emotion feedback;
the emotion feedback module is used for presenting the results processed by the emotion recognition module, i.e., feeding back the testee's current emotional state online to the experimenter and the testee, which lets the experimenter observe the emotion induction effect and also helps the testee understand and adjust his or her own emotional state.
Further, the emotion induction module comprises VR equipment for presenting the virtual reality scene, which communicates over TCP with the computer carrying the emotion recognition module. The emotion induction module further comprises a VR role playing game for inducing the testee's emotion. The VR role playing game contains game scenarios corresponding to different emotions, and the game is provided with a task system and a reward and punishment mechanism that motivate the testee to concentrate on the game content and actively accept the system's emotion induction. The game advances the plot through narration, role dialogue, CG animation and character actions, combined with game tasks and rewards and punishments, to prompt the emotion to be induced in the testee; the scenario animation and sound effects can vary with the plot and with how long the testee remains in a given state. Role dialogue comprises fixed dialogue that advances the plot and selectable dialogue that expresses the testee's subjective emotion. Each scenario has several time nodes that trigger selectable dialogue about the current emotion; the selectable dialogue is presented as multiple-choice questions and the testee answers with the handle. The chosen options serve both as the testee's emotional state self-report and as a survey of satisfaction with the current emotion induction scene, and the time nodes are synchronously transmitted to the emotion recognition module as references for emotion recognition. After each template presentation ends, the testee is given a suitable rest. A template is a role playing game presented via the VR device and includes one or more of the elements scenario, scene, narration, role dialogue, animation, character action, sound effect, task system and reward and punishment mechanism.
Further, the signal acquisition module acquires the testee's EEG emotion signals and is connected to the computer carrying the emotion recognition module through a USB data cable and a parallel-port cable. The signal acquisition module records scalp EEG signals with an EEG amplifier and a 32-channel EEG electrode cap and transmits them to the emotion recognition module. The EEG signals of all channels are referenced to the right earlobe; during acquisition the impedance of every electrode is kept below 5 kΩ, and the EEG is sampled at 250 Hz and band-pass filtered between 0.1 and 70 Hz. During signal acquisition the testee sits in a comfortable posture, the EEG electrode cap is fitted, electrode gel is injected into the cap's embedded electrodes with a blunt-tipped syringe, and the VR equipment is then put on. Two segments of EEG data are collected from each testee: one for training data and one for online test data.
Further, after the training data acquisition stage ends, the emotion recognition module reads the training data recorded in the computer and obtains the corresponding feature vectors through filtering preprocessing and feature extraction. An emotion classifier is then constructed with a linear-kernel Support Vector Machine (SVM): the emotion feature data corresponding to each emotional state in training form a training set, and the feature vectors of each emotional state are fed into the classifier for training, yielding an emotion classifier model. The feature vectors of each emotional state in the training set are then fed back into the trained classifier model, which outputs a reference baseline value for each emotional state; an emotion feedback criterion is constructed from these baselines to score the testee's emotional states during the online test stage, making it convenient for the experimenter to monitor the emotion induction effect and helping the testee effectively understand and adjust his or her emotion.
Further, in the online test data acquisition stage, the emotion recognition module performs emotion classification with the trained classifier model, outputs a result representing the emotion intensity, i.e., a score, based on the reference baseline, and transmits it to the emotion feedback module.
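The patent states that the online score is computed from the classifier output relative to the per-emotion reference baseline but does not disclose the formula. The sketch below assumes a simple clipped linear mapping; the function name and the 0-100 range are illustrative assumptions.

```python
def emotion_score(decision_value, reference_base, max_score=100.0):
    """Map a classifier decision value to an intensity score in [0, max_score],
    relative to the per-emotion reference baseline obtained from training data.

    The linear mapping and clipping are assumptions; the patent only says a
    score representing emotion intensity is produced from the baseline."""
    if reference_base <= 0:
        raise ValueError("reference baseline must be positive")
    ratio = max(decision_value, 0.0) / reference_base
    return min(ratio, 1.0) * max_score
```

For example, a decision value at half the training baseline would yield a score of 50, and any value at or above the baseline saturates at 100.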
Further, the emotion feedback module uses visual feedback and forms part of the game scene. The feedback content is presented on the right side of the VR scene, where a cartoon character and a colored halo respectively display the testee's current emotion type and intensity. The cartoon character has different expressions and actions for different emotion types, with a static initial default state; the colored halo surrounds the cartoon character, is initially equal in size to the character, and enlarges as the degree of emotion induction increases. During the online test task, each time online test data is received and processed, the emotion recognition module gives the corresponding emotion classification result and score and transmits the corresponding feedback criterion and reference baseline to the emotion feedback module, which displays the corresponding emotion category and induction degree. The testee and the experimenter can read the testee's current emotional state from the changes of the cartoon character and the colored halo and thereby judge the emotion induction effect.
As shown in fig. 2, the specific implementation flow of an emotion induction experiment with the above emotion induction monitoring system based on VR role playing game interaction is as follows:
1) Open the signal acquisition module and start acquiring EEG signals. Specifically: the testee sits comfortably in a chair, the EEG electrode cap is fitted, electrode gel is injected into the cap's embedded electrodes with a blunt-tipped syringe, and the VR equipment is put on after the gel is injected. EEG signals are acquired with a 32-channel electrode cap following the international 10-20 system and a NeuroScan EEG amplifier; all channels are referenced to the right earlobe, every electrode's impedance is kept below 5 kΩ during acquisition, and the EEG is sampled at 250 Hz.
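The acquisition settings above (250 Hz sampling, 0.1-70 Hz band-pass, 32 channels) can be sketched as a preprocessing step. The filter order and the use of zero-phase filtering are assumptions; the patent only specifies the band and the sampling rate.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0          # sampling rate (Hz), as in the acquisition settings
BAND = (0.1, 70.0)  # band-pass range (Hz)

def bandpass_eeg(eeg, fs=FS, band=BAND, order=4):
    """Zero-phase band-pass filter for multichannel EEG.

    eeg: array of shape (n_channels, n_samples).
    A 4th-order Butterworth filter applied with filtfilt is an assumption."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

# Example: filter 2 s of synthetic 32-channel data.
x = np.random.randn(32, int(2 * FS))
y = bandpass_eeg(x)
```

Zero-phase filtering (filtfilt) avoids shifting the EEG relative to the scenario time markers, which matters here because the time nodes are used as recognition references.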
2) Communication is established among the signal acquisition module, emotion induction module, emotion recognition module and emotion feedback module; the specific connections are as follows:
2.1) Turn on the computer and run netreader.exe, which receives the data, and netstim.exe, which transmits the tag data;
2.2) Run the emotion recognition module on the computer and set the TCP port number in the netreader.exe interface; after the Connect button is clicked, a TCP connection is established between the signal acquisition module and the emotion recognition module, so the emotion recognition module can receive the EEG data sent by the signal acquisition module in real time and display the received waveform in a window;
2.3) Set the UDP port number in the netstim.exe interface; after the Send button is clicked, a UDP connection is established between the emotion feedback module and the signal acquisition module;
2.4) Open the emotion induction game in the VR device, set the TCP port number in it to establish a TCP connection with the emotion recognition module, and guide the testee in using the VR handle.
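The TCP links in steps 2.1-2.4 follow the usual pattern of one side listening on a configured port while the other connects and streams data. The loopback sketch below is a minimal stand-in for the acquisition-to-recognition link; it is not the patent's actual software, and an ephemeral port stands in for the port number configured in the module interfaces.

```python
import socket
import threading

# Recognition side: listen for the acquisition module's TCP connection.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
port = server.getsockname()[1]
server.listen(1)

received = []

def accept_one():
    # Accept one connection and read one (simulated) EEG data packet.
    conn, _ = server.accept()
    received.append(conn.recv(1024))
    conn.close()

t = threading.Thread(target=accept_one)
t.start()

# Acquisition side: connect over TCP and send a data packet in real time.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"EEG:trial1")
client.close()
t.join()
server.close()
```

The real system additionally uses a UDP link for tag data (netstim.exe), which would use `SOCK_DGRAM` sockets in the same way.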
3) Collecting training data:
3.1) Set the emotion recognition module to training mode, set the number of training trials (generally 30), click Save and then Start, and the emotion induction module starts running;
3.2) At the beginning of each trial, the emotion induction module first presents a brief plot background, the main-line task and the reward and punishment mechanism, and plays corresponding background music as an emotion prompt. The testee presses the Confirm key on the handle to start the scenario task. The testee maintains the corresponding emotion according to the plot but must avoid facial expressions and limb movements other than operating the handle, so that muscle activity does not interfere with the EEG signals. At each key node of the trial's scenario, a role dialogue about the current emotion is triggered and presented as a multiple-choice question; the testee answers with the handle, the chosen option serves as the testee's emotional state self-report and as a survey of satisfaction with the current emotion induction scene, and the time nodes are synchronously marked in the test data as references for emotion recognition timing. After each scenario segment ends, the testee calms his or her emotional state as needed, browses the next segment's scenario and game task, and then presses Confirm to start the next trial. After all 30 trials are completed, the signal acquisition module's recording is closed;
3.3) After the training data acquisition stage ends, the emotion recognition module reads the training data recorded in the computer and obtains the corresponding feature vectors through filtering preprocessing and feature extraction. An emotion classifier is then constructed with a linear-kernel Support Vector Machine (SVM): the emotion feature data corresponding to each emotional state form a training set, and the feature vectors of each emotional state are fed into the classifier for training, yielding an emotion classifier model. The feature vectors of each emotional state in the training set are then fed back into the trained model, which outputs a reference baseline value for each emotional state; an emotion feedback criterion is constructed from these baselines to score the testee's emotional states in the online test stage, making it convenient for the experimenter to monitor the emotion induction effect and helping the testee effectively understand and adjust his or her emotion.
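Step 3.3) can be sketched with scikit-learn's linear-kernel SVM. The feature data below are synthetic stand-ins (the patent does not disclose the feature extraction or dimensionality), and deriving the per-emotion baseline from the mean absolute decision value is an assumption: the patent says the trained model outputs a reference value per emotional state but not how.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for the extracted feature vectors: two emotion
# classes with 30 training trials each (feature dimension is illustrative).
X = np.vstack([rng.normal(0, 1, (30, 8)), rng.normal(2, 1, (30, 8))])
y = np.array([0] * 30 + [1] * 30)   # 0 = neutral, 1 = target emotion (labels assumed)

clf = SVC(kernel="linear")           # linear-kernel SVM as described
clf.fit(X, y)

# Feed the training vectors back through the trained model and take the mean
# absolute decision value per class as that emotion's reference baseline.
d = clf.decision_function(X)
baseline = {c: float(np.abs(d[y == c]).mean()) for c in (0, 1)}
```

In the online stage, the same `decision_function` output for new data would be compared against `baseline` to produce the intensity score.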
4) On-line emotion induction monitoring:
4.1) Set the emotion recognition module to online mode, set the number of test trials (generally 30), click Save and then Start, and the emotion induction module starts running;
4.2) Referring to fig. 3, at the beginning of each trial the emotion induction module first presents a brief plot background, the main-line task and the reward and punishment mechanism, and plays corresponding background music as an emotion prompt. The testee presses the Confirm key on the handle to start the scenario task. The testee maintains the corresponding emotion according to the plot but must avoid facial expressions and limb movements other than operating the handle, so that muscle activity does not interfere with the EEG signals. At each key node of the trial's scenario, a role dialogue about the current emotion is triggered and presented as a multiple-choice question; the testee answers with the handle, the chosen option serves as the testee's emotional state self-report and as a survey of satisfaction with the current emotion induction scene, and the time nodes are synchronously transmitted to the emotion recognition module as references for emotion recognition timing;
The emotion feedback module is presented in the lower right corner of the VR scene and, in each trial, displays the corresponding emotional state and its degree according to the output of the emotion recognition module. After each scenario segment ends, the testee calms their emotional state as required; the feedback module then shows the calm state and zooms an aperture at a deep-breathing rhythm, so the testee can breathe deeply in time with it to calm down. Meanwhile, the testee can browse the scenario game task of the next segment, then press the confirmation key to start the next trial. Once the testee confirms, the emotion recognition module captures the current emotion EEG signal, i.e. the online test data, and performs emotion recognition;
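The calming aperture can be animated as a simple periodic scale factor. The sketch below assumes a pacing of 6 breaths per minute and the scale bounds 0.6 to 1.4; both numbers are illustrative choices, not figures taken from the patent:

```python
import math

def aperture_scale(t_seconds, breaths_per_min=6.0,
                   min_scale=0.6, max_scale=1.4):
    """Scale factor for the on-screen aperture at time t, expanding
    and contracting once per breath to pace deep breathing."""
    phase = 2.0 * math.pi * (breaths_per_min / 60.0) * t_seconds
    # Map the cosine from [-1, 1] onto [min_scale, max_scale]
    return min_scale + (max_scale - min_scale) * (1.0 - math.cos(phase)) / 2.0

# One full breath cycle at 6 breaths/min lasts 10 s:
print(round(aperture_scale(0.0), 2))   # smallest, start of inhale
print(round(aperture_scale(5.0), 2))   # largest, mid-cycle
```

Calling this once per rendered frame with the elapsed segment time drives the zooming aperture smoothly without any stored animation state.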
4.3 The emotion recognition module computes on the online test data with the trained emotion model, produces the corresponding emotion classification result and score, and transmits the corresponding feedback criterion and reference value (base) to the emotion feedback module, which then displays the corresponding emotion category and induction degree. As shown in fig. 4, the cartoon character displays the emotional state matching the classification result, such as happiness, and the size of the halo around the character is determined by the emotion's intensity: the stronger the emotion, the larger the halo; the weaker the emotion, the smaller the halo. The testee and the experimenter can judge the testee's current emotional state from the changes in the cartoon figure and the halo, and thereby assess and appropriately adjust the emotion induction effect.
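The halo display can be driven by a small mapping from the intensity score to a radius. The linear mapping, the clamping against the class reference value `base`, and the specific radius bounds below are assumptions for illustration, not details from the patent:

```python
def halo_radius(score, base, r_min=1.0, r_max=2.5):
    """Map an emotion-intensity score to a halo radius around the
    cartoon avatar: stronger emotion -> larger halo.  `base` is the
    reference value of the recognized class from training."""
    if base <= 0:
        raise ValueError("reference base must be positive")
    ratio = max(0.0, min(score / base, 1.0))   # clamp to [0, 1]
    return r_min + (r_max - r_min) * ratio

print(halo_radius(0.0, base=2.0))   # weak emotion  -> minimum radius
print(halo_radius(2.0, base=2.0))   # at reference strength -> maximum
```

Clamping keeps the halo within the scene even when an online trial scores far above the training reference.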
In summary, compared with other emotion induction approaches, the system uses VR role-playing games to induce the testee's emotions through gameplay, improving immersion and the sense of experience with animation, sound effects, tasks, and a reward and punishment mechanism; through interaction, the testee actively participates in the situation and generates the corresponding emotion, improving the induction effect. The system provides neurofeedback from electrophysiological brain signals, so that both the testee and the experimenter can follow the induction effect in real time and adopt suitable regulation strategies, while the optional character dialogues yield the testee's own emotion self-assessment report. In addition, because the system monitors emotional state from electrophysiological signals, it is more objective and accurate than traditional subjective scale assessment: it can faithfully reflect the testee's true emotional state and also estimate emotion intensity. In short, the system provides an effective emotion induction method for laboratory emotion induction experiments, for collecting emotion-related data, and for emotion therapy of patients with affective mental disorders; it has potential clinical and social value and is worth popularizing.
The above examples are preferred embodiments of the present invention, but the embodiments of the present invention are not limited to them; any other change, modification, substitution, combination, or simplification that does not depart from the spirit and principle of the present invention is an equivalent replacement and is included within the protection scope of the present invention.

Claims (5)

1. The emotion induction monitoring system based on VR role playing game interaction is characterized by comprising an emotion induction module, a signal acquisition module, an emotion recognition module and an emotion feedback module;
the emotion induction module is used for presenting virtual reality scenes and simulated situations, and for inducing the testee's emotions through a VR role playing game;
the signal acquisition module is used for acquiring and recording the user's emotion signals, the acquired signals being labeled as training data or online test data, wherein the training data are stored on a computer for the emotion recognition module to read, and the online test data are transmitted to the emotion recognition module in real time;
the emotion recognition module is used for processing the emotion signals acquired by the signal acquisition module, including reading the stored training data and receiving the online test data in real time, preprocessing each and extracting features to obtain the corresponding feature vectors; the feature vectors obtained from the training data are used to construct an emotion classifier model and a feedback criterion, and the feature vectors obtained from the online test data are passed through the classifier model and feedback criterion to obtain the processing result, thereby realizing monitoring and recognition of the testee's emotion for online emotion feedback;
the emotion feedback module is used for presenting the results processed by the emotion recognition module, i.e. feeding the testee's current emotional state back online to the experimenter and the testee, so that the experimenter can observe the testee's emotion induction effect and the testee is helped to understand and adjust their own emotional state;
the emotion induction module comprises VR equipment for presenting the virtual reality scene and communicates over TCP with the computer hosting the emotion recognition module; the emotion induction module further comprises a VR role playing game for inducing the testee's emotions, the VR role playing game comprising game scenarios corresponding to different emotions, and the game being provided with a task system and a reward and punishment mechanism that motivate the testee to concentrate on the game content and actively accept the system's emotion induction; the game prompts the emotion to be induced in the testee through narration, character dialogue, CG animation, and character actions that advance the scenario, combined with game tasks, rewards and punishments, and scene animation and sound-effect cues, with the duration determined by the scenario and the testee's condition; the character dialogue comprises fixed dialogue that delivers the plot and optional dialogue that expresses the testee's subjective emotion, each plot having several time nodes that trigger optional dialogue related to the current emotion; the optional dialogue is presented as multiple-choice questions which the testee answers with the handle, the chosen options serving both as the testee's emotional-state self-report and as a survey of satisfaction with the current emotion induction scene, and the time nodes being transmitted synchronously to the emotion recognition module as timing references for emotion recognition; after each template presentation ends, the testee is given time to rest; the templates comprise the VR role playing game presented via the VR equipment, including one or more of plot, scenes, narration, character dialogue, animation, character actions, sound effects, the task system, and the reward and punishment mechanism.
2. The emotion induction monitoring system based on VR role playing game interaction according to claim 1, wherein the signal acquisition module acquires the user's electroencephalogram (EEG) emotion signals and is connected to the computer hosting the data processing module through a USB data cable and a parallel-port cable; the signal acquisition module records scalp EEG with an EEG amplifier and a 32-channel EEG electrode cap and transmits the signals to the emotion recognition module; all channels are referenced to the right earlobe, the impedance of every electrode is kept below 5 kΩ during acquisition, and the EEG is sampled at 250 Hz and band-pass filtered between 0.1 and 70 Hz; during signal acquisition the testee is seated, the EEG electrode cap is fitted, conductive paste is injected into the cap's embedded electrodes with a blunt-tip syringe, and the VR equipment is put on after the paste is injected; each testee provides two segments of EEG data, one for training data collection and one for online test data collection.
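The acquisition settings above (250 Hz sampling, 0.1-70 Hz band-pass, right-earlobe reference) can be sketched with SciPy; the zero-phase Butterworth filter is a common choice assumed here for illustration, not one mandated by the patent:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250.0          # sampling frequency, Hz
BAND = (0.1, 70.0)  # band-pass range, Hz

def preprocess_eeg(raw, ref_channel):
    """raw: (n_channels, n_samples) scalp EEG.
    Re-reference every channel to the earlobe channel, then apply a
    zero-phase 4th-order Butterworth band-pass filter."""
    referenced = raw - raw[ref_channel]
    sos = butter(4, [BAND[0] / (FS / 2), BAND[1] / (FS / 2)],
                 btype="band", output="sos")
    return sosfiltfilt(sos, referenced, axis=1)

# 32 channels, 2 s of synthetic data; channel 31 plays the earlobe reference
rng = np.random.default_rng(1)
raw = rng.normal(size=(32, 500))
clean = preprocess_eeg(raw, ref_channel=31)
print(clean.shape)
```

Second-order sections (`output="sos"`) keep the filter numerically stable at the very low 0.1 Hz corner, where a transfer-function form can be ill-conditioned.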
3. The emotion induction monitoring system based on VR role playing game interaction according to claim 1, wherein after the training data acquisition phase ends, the emotion recognition module reads the training data recorded on the computer and obtains the corresponding feature vectors through filtering preprocessing and feature extraction; an emotion classifier is then constructed using a support vector machine (SVM) with a linear kernel, the emotion feature data corresponding to the various emotional states forming a training set, and the feature vectors for each emotional state being fed into the classifier for training to obtain an emotion classifier model; the feature vectors for each emotional state in the training set are then fed into the trained classifier model, which outputs a reference value (base) for each emotional state, and an emotion feedback criterion built on these reference values is used to score the testee's emotional states during the online test stage.
4. The emotion induction monitoring system based on VR role playing game interaction according to claim 1, wherein in the online test data collection stage the emotion recognition module performs emotion classification with the trained classifier model, outputs a result representing the emotion intensity, i.e. a score, based on the reference value (base), and transmits the result to the emotion feedback module.
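Online scoring of a single trial might look like the following sketch, which assumes the score is the trial's SVM decision value expressed relative to the class's reference base; that convention, and the toy model and features, are illustrative assumptions rather than details fixed by the patent:

```python
import numpy as np
from sklearn.svm import SVC

def classify_and_score(clf, base, feature_vec):
    """Classify one online feature vector and express its intensity
    as a score relative to the class's training reference value."""
    x = np.asarray(feature_vec).reshape(1, -1)
    label = clf.predict(x)[0]
    scores = clf.decision_function(x)
    if scores.ndim == 1:                  # binary SVC: single column
        scores = np.column_stack([-scores, scores])
    idx = list(clf.classes_).index(label)
    score = float(scores[0, idx]) / base[label]
    return label, score

# Toy binary model: class 1 centred at +3, class 0 at the origin
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(3, 1, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="linear").fit(X, y)
base = {0: 1.0, 1: 1.0}                  # stand-in reference values
label, score = classify_and_score(clf, base, [3, 3, 3, 3])
print(label)
```

The `(label, score)` pair is exactly what the claim says is handed to the emotion feedback module for display.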
5. The emotion induction monitoring system based on VR role playing game interaction according to claim 1, wherein the emotion feedback module presents feedback content on the right side of the VR scene as visual feedback, using a cartoon figure and a colored halo to display the testee's current emotion type and intensity respectively; the cartoon figure has different expressions and actions corresponding to different emotion types, with a static initial default state; the colored halo surrounds the cartoon figure, its initial size equals that of the figure, and it grows as the emotion induction degree increases; during the online test task, each time online test data are received and processed, the emotion recognition module produces the corresponding emotion classification result and score and transmits the corresponding feedback criterion and reference value (base) to the emotion feedback module, which displays the corresponding emotion category and induction degree; the testee and the experimenter can judge the testee's current emotional state, and thus the emotion induction effect, from the changes in the cartoon figure and the colored halo.
CN202210146377.2A 2022-02-17 2022-02-17 Emotion induction monitoring system based on VR role playing game interaction Active CN114640699B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210146377.2A CN114640699B (en) 2022-02-17 2022-02-17 Emotion induction monitoring system based on VR role playing game interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210146377.2A CN114640699B (en) 2022-02-17 2022-02-17 Emotion induction monitoring system based on VR role playing game interaction

Publications (2)

Publication Number Publication Date
CN114640699A CN114640699A (en) 2022-06-17
CN114640699B true CN114640699B (en) 2023-06-20

Family

ID=81946034

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210146377.2A Active CN114640699B (en) 2022-02-17 2022-02-17 Emotion induction monitoring system based on VR role playing game interaction

Country Status (1)

Country Link
CN (1) CN114640699B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115373519A (en) * 2022-10-21 2022-11-22 北京脑陆科技有限公司 Electroencephalogram data interactive display method, device and system and computer equipment
CN116603232A (en) * 2023-05-30 2023-08-18 深圳市德尔凯科技有限公司 Three-dimensional VR and entity feedback based mutual-aid game entertainment system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2846919A1 (en) * 2013-03-21 2014-09-21 Smarteacher Inc. Emotional intelligence engine for systems
CN106061456A (en) * 2013-12-31 2016-10-26 伊夫徳发明未来科技有限公司 Wearable devices, systems, methods and architectures for sensory stimulation and manipulation, and physiological data acquisition
KR101901258B1 (en) * 2017-12-04 2018-09-21 가천대학교 산학협력단 Method, Device, and Computer-Readable Medium for Optimizing Images of compressed package files
CN111247505A (en) * 2017-10-27 2020-06-05 索尼公司 Information processing device, information processing method, program, and information processing system
CN111651060A (en) * 2020-08-10 2020-09-11 之江实验室 Real-time evaluation method and evaluation system for VR immersion effect

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2605993A1 (en) * 2005-04-25 2006-11-02 Ellen Eatough Mind-body learning system and methods of use
TWI311067B (en) * 2005-12-27 2009-06-21 Ind Tech Res Inst Method and apparatus of interactive gaming with emotion perception ability
US20090069707A1 (en) * 2007-09-06 2009-03-12 Sandford Joseph A Method to improve neurofeedback training using a reinforcement system of computerized game-like cognitive or entertainment-based training activities
US20140167358A1 (en) * 2012-12-13 2014-06-19 Keith C. Fox Method for Providing Play Therapy Utilizing An Interactive Role Playing Game
CN103690165B (en) * 2013-12-12 2015-04-29 天津大学 Modeling method for cross-inducing-mode emotion electroencephalogram recognition
CN105852831A (en) * 2016-05-10 2016-08-17 华南理工大学 Equipment based on virtual reality interaction technology and brain function real-time monitoring technology
CN108056774A (en) * 2017-12-29 2018-05-22 中国人民解放军战略支援部队信息工程大学 Experimental paradigm mood analysis implementation method and its device based on visual transmission material
CN108478224A (en) * 2018-03-16 2018-09-04 西安电子科技大学 Intense strain detecting system and detection method based on virtual reality Yu brain electricity
CN109298779B (en) * 2018-08-10 2021-10-12 济南奥维信息科技有限公司济宁分公司 Virtual training system and method based on virtual agent interaction
US10970898B2 (en) * 2018-10-10 2021-04-06 International Business Machines Corporation Virtual-reality based interactive audience simulation
CN109992113B (en) * 2019-04-09 2020-05-15 燕山大学 MI-BCI system based on multi-scene induction and control method thereof
CN111026265A (en) * 2019-11-29 2020-04-17 华南理工大学 System and method for continuously labeling emotion labels based on VR scene videos
CN112120716A (en) * 2020-09-02 2020-12-25 中国人民解放军军事科学院国防科技创新研究院 Wearable multi-mode emotional state monitoring device
CN112163518B (en) * 2020-09-28 2023-07-18 华南理工大学 Emotion modeling method for emotion monitoring and adjusting system
CN112597967A (en) * 2021-01-05 2021-04-02 沈阳工业大学 Emotion recognition method and device for immersive virtual environment and multi-modal physiological signals
CN113053492B (en) * 2021-04-02 2022-07-15 北方工业大学 Self-adaptive virtual reality intervention system and method based on user background and emotion

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2846919A1 (en) * 2013-03-21 2014-09-21 Smarteacher Inc. Emotional intelligence engine for systems
CN106061456A (en) * 2013-12-31 2016-10-26 伊夫徳发明未来科技有限公司 Wearable devices, systems, methods and architectures for sensory stimulation and manipulation, and physiological data acquisition
CN111247505A (en) * 2017-10-27 2020-06-05 索尼公司 Information processing device, information processing method, program, and information processing system
KR101901258B1 (en) * 2017-12-04 2018-09-21 가천대학교 산학협력단 Method, Device, and Computer-Readable Medium for Optimizing Images of compressed package files
CN111651060A (en) * 2020-08-10 2020-09-11 之江实验室 Real-time evaluation method and evaluation system for VR immersion effect

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on EEG-based Emotion Processing and Recognition Technology; Zhuang Ning; China Doctoral Dissertations Full-text Database (Electronic Journal), No. 1; E070-14 *
Emotion Induction Methods and Their Recent Advances; Jiang Jun, Chen Xuefei, Chen Antao; Journal of Southwest China Normal University (Natural Science Edition), Vol. 36, No. 1, pp. 209-214 *

Also Published As

Publication number Publication date
CN114640699A (en) 2022-06-17

Similar Documents

Publication Publication Date Title
Kulke et al. A comparison of the Affectiva iMotions Facial Expression Analysis Software with EMG for identifying facial expressions of emotion
CN109298779B (en) Virtual training system and method based on virtual agent interaction
CN104902806B (en) The assessment system and method for europathology
WO2020119355A1 (en) Method for evaluating multi-modal emotional understanding capability of patient with autism spectrum disorder
CN114640699B (en) Emotion induction monitoring system based on VR role playing game interaction
Feidakis et al. Emotion measurement in intelligent tutoring systems: what, when and how to measure
KR20190026651A (en) Methods and systems for acquiring, aggregating and analyzing vision data to approach a person's vision performance
Riseberg et al. Frustrating the user on purpose: Using biosignals in a pilot study to detect the user's emotional state
CN112163518B (en) Emotion modeling method for emotion monitoring and adjusting system
CN108078573A (en) A kind of interest orientation value testing method based on physiological reaction information and stimulus information
Farnsworth Eeg (electroencephalography): The complete pocket guide
Nesbitt et al. Using the startle eye-blink to measure affect in players
CN108478224A (en) Intense strain detecting system and detection method based on virtual reality Yu brain electricity
CN115376695A (en) Method, system and device for neuropsychological assessment and intervention based on augmented reality
CN112008725B (en) Human-computer fusion brain-controlled robot system
Christensen et al. Emotion matters: Different psychophysiological responses to expressive and non-expressive full-body movements
Chunawale et al. Human emotion recognition using physiological signals: A survey
Kawala-Janik Efficiency evaluation of external environments control using bio-signals
Alarcão Reminiscence therapy improvement using emotional information
Smith Electroencephalograph based brain computer interfaces
Hercegfi et al. Experiences of virtual desktop collaboration experiments
Dave Enhancing user experience in e-learning: Real-time emotional analysis and assessment
Bashir et al. Electroencephalogram (EEG) Signals for Modern Educational Research
WO2020139108A1 (en) Method for conducting cognitive examinations using a neuroimaging system and a feedback mechanism
Yang et al. Ground truth dataset for EEG-based emotion recognition with visual indication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant