CN106560765A - Method and device for content interaction in virtual reality - Google Patents

Method and device for content interaction in virtual reality

Info

Publication number
CN106560765A
CN106560765A (application CN201610416809.1A)
Authority
CN
China
Prior art keywords: brain wave, information, content, virtual environment, virtual reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610416809.1A
Other languages
Chinese (zh)
Inventor
贺超
杨洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chuangda Yunrui Intelligent Technology Co Ltd
Original Assignee
Shenzhen Chuangda Yunrui Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chuangda Yunrui Intelligent Technology Co Ltd
Priority to CN201610416809.1A
Priority to PCT/CN2016/103849
Publication of CN106560765A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention, which belongs to the field of computer technology, provides a method and device for content interaction in virtual reality. It addresses the problems that a user's control of a virtual environment is limited to the range of body movements or manual instructions, and that the virtual environment only passively receives control instructions from the user before providing interaction feedback. The method comprises: acquiring a brain wave signal; analyzing the brain wave signal to obtain emotion-cognition information corresponding to the brain wave signal; and performing corresponding content interaction on the virtual environment according to the emotion-cognition information. Control of the virtual environment is therefore accomplished directly from the brain wave signal, without requiring any body movement or manual instruction; the user's emotion cognition is sensed by analyzing the brain wave signal, so that the virtual environment actively provides interaction feedback to the user; and interaction is no longer limited to images and touch, since both the emotional state and the attention state can serve as interaction channels.

Description

Method and apparatus for content interaction in virtual reality
Technical field
The present invention relates to the field of computer technology, and more particularly to a method and apparatus for content interaction in virtual reality.
Background art
Virtual reality (VR) is a computer system with which a virtual world can be created and experienced. More concretely, it combines computer graphics systems with various display and control devices to provide an immersive, interactive sensation within a three-dimensional environment generated on a computer. The interactive three-dimensional environment generated by the computer is referred to as the virtual environment (VE). Virtual reality technology creates a simulated environment with the computer and, through multi-source information fusion, interactive three-dimensional dynamic vision, and simulation of entity behavior, immerses the user in that environment.
Existing virtual reality technology builds a virtual three-dimensional environment that people can observe and explore, and through technologies such as motion capture it allows a person to obtain some feedback from the virtual environment. However, interaction between a person and the virtual environment generally relies on a keyboard, a mouse, a joystick, or motion recognition and capture devices such as depth cameras, and the corresponding interaction behavior is produced from the input of the person's limb movements.
As a result, a person's control over the virtual environment is limited and isolated: it can only be exercised through a restricted set of limb movements or manual instructions, and the interaction feedback that the virtual environment provides to the person is entirely passive.
Summary of the invention
An object of the invention is to provide a method and apparatus for content interaction in virtual reality, intended to solve the problems in the prior art that a person's control of the virtual environment is confined to the expressive range of limb movements or manual instructions, and that the virtual environment can only passively receive control instructions before providing interaction feedback.
According to a first aspect of the present invention, there is provided a method for content interaction in virtual reality, comprising:
acquiring a brain wave signal;
parsing the brain wave signal to obtain emotion-cognition information corresponding to the brain wave signal; and
performing corresponding content interaction on a virtual environment according to the emotion-cognition information.
According to a second aspect of the present invention, there is provided a device for content interaction in virtual reality, comprising:
an acquisition module, configured to acquire a brain wave signal;
a parsing module, configured to parse the brain wave signal and obtain emotion-cognition information corresponding to the brain wave signal; and
an interaction module, configured to perform corresponding content interaction on a virtual environment according to the emotion-cognition information.
Compared with the prior art, the beneficial effects of the present invention are as follows. The acquired brain wave signal is parsed to obtain the corresponding emotion-cognition information, and content interaction with the virtual environment is carried out according to that information. On the one hand, the virtual environment can be controlled directly from the brain wave signal without any limb movement or manual instruction, so that the person's control of the virtual environment is no longer confined to the expressive range of limb movements or manual instructions. On the other hand, by analyzing the brain wave signal the system actively perceives the person's emotion cognition, so that the virtual environment no longer merely responds passively to the person's control but actively provides interaction feedback according to the emotion-cognition information; interaction is also no longer limited to images and touch, since the emotional state and the attention state can likewise serve as interaction channels.
Description of the drawings
Fig. 1 is a flowchart of the method for content interaction in virtual reality provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of the method for content interaction in virtual reality provided by Embodiment 2 of the present invention;
Fig. 3 is a structural diagram of the device for content interaction in virtual reality provided by Embodiment 3 of the present invention;
Fig. 4 is a structural diagram of the device for content interaction in virtual reality provided by Embodiment 4 of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and are not intended to limit it.
The implementation of the present invention is described in detail below with reference to the specific drawings.
Embodiment 1:
Fig. 1 is a flowchart of the method for content interaction in virtual reality provided by Embodiment 1 of the present invention. The method comprises steps S101 to S103, described in detail as follows.
S101: acquire a brain wave signal.
A brain wave (electroencephalography, EEG) signal is the externally measurable change in electrical potential produced by the brain's neuro-electrical activity; the recorded brain activity is the overall reflection, at the cerebral cortex or the scalp surface, of the bioelectrical activity of the brain's nerve cells.
Specifically, the brain wave signal is obtained through detection by an EEG detection device.
The EEG detection device may be mounted in the head-fixing part of the virtual reality device, for example in the head strap, or it may be a stand-alone detection device.
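The patent does not specify an acquisition interface. As one illustration only, many EEG headsets expose a Lab Streaming Layer (LSL) stream that could be read as in the following minimal sketch; the use of pylsl, the stream type, and the window length are assumptions, not part of the patent.

    # Minimal sketch: pull a short window of raw EEG samples from an assumed LSL stream.
    import numpy as np
    from pylsl import StreamInlet, resolve_stream

    streams = resolve_stream('type', 'EEG')        # find an EEG stream on the network
    inlet = StreamInlet(streams[0])

    fs = int(inlet.info().nominal_srate())         # sampling rate reported by the device
    window = []
    while len(window) < 2 * fs:                    # collect a 2-second window
        sample, _timestamp = inlet.pull_sample()
        window.append(sample)

    eeg = np.asarray(window).T                     # shape: (n_channels, n_samples)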
S102: parse the brain wave signal to obtain the emotion-cognition information corresponding to the brain wave signal.
Specifically, the acquired brain wave signal is parsed; the parsing may include signal filtering, feature extraction, algorithmic pattern recognition, and so on. The parsed brain wave signal is converted into corresponding emotion-cognition information, which can be recognized and responded to by the virtual reality system.
Further, the emotion-cognition information may include emotional-state information, attention-focus information, intent control information, and the like.
Humans produce different brain wave patterns in different emotional states. By extracting the corresponding brain wave pattern from the brain wave signal, the person's emotional state can be identified, and emotional states can be classified by parsing the brain wave patterns.
A human's response to external stimuli is likewise reflected in the brain wave signal, in what is called an event-related potential. Because the brain wave response typically occurs about 300 milliseconds after the stimulus, this event-related potential is also called the P300 potential; by continuously observing the brain wave, the response pattern of the P300 potential can be obtained. For example, if a red lamp blinks with a 0.5-second period and a blue lamp blinks with a 0.7-second period, then when the person concentrates attention on the red lamp the brain wave potential features progressively lock onto the red lamp's blink frequency, from which the direction of the person's attention can be known. Therefore, by parsing the brain wave signal, the current attention target can be extracted.
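The frequency-following behavior described above can be checked by comparing spectral power at the two blink frequencies. The sketch below is one minimal way to do this; the FFT-based comparison, the choice of a single channel, and the assumption of a window several seconds long (so the two frequencies fall in different bins) are illustrative and not prescribed by the patent.

    # Minimal sketch: infer which blinking lamp is attended by comparing EEG power
    # at 2.0 Hz (0.5 s period, red lamp) and ~1.43 Hz (0.7 s period, blue lamp).
    import numpy as np

    def power_at(freqs, psd, target):
        """Power in the spectral bin nearest the target frequency."""
        return psd[np.argmin(np.abs(freqs - target))]

    def attended_lamp(eeg_channel, fs):
        psd = np.abs(np.fft.rfft(eeg_channel)) ** 2
        freqs = np.fft.rfftfreq(len(eeg_channel), d=1.0 / fs)
        red = power_at(freqs, psd, 1.0 / 0.5)      # red lamp blink frequency
        blue = power_at(freqs, psd, 1.0 / 0.7)     # blue lamp blink frequency
        return "red" if red > blue else "blue"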
When a human controls limb movement, the degree of spatial activation of the cerebral cortex differs from case to case. Taking this as a feature, parsing of the brain wave signal can extract the person's mental intent, which can then be converted into a control signal.
S103: perform corresponding content interaction on the virtual environment according to the emotion-cognition information.
Specifically, the virtual reality device responds to the identified emotion-cognition information and performs content interaction on the virtual environment.
The content interaction may be a passive response of the virtual environment, including but not limited to manipulating an object according to the person's intent, such as controlling its displacement and speed. It may also be active feedback from the virtual environment, including but not limited to selective interaction such as choosing a menu item through the attention focus, and actively changing the color, sound, lighting, weather, and so on of the virtual environment according to the person's emotional state. It may furthermore be community interaction between people using virtual reality devices of the same type, such as mutual communication of interests and emotional states.
In this embodiment, the acquired brain wave signal is parsed to obtain the corresponding emotion-cognition information, and content interaction with the virtual environment is carried out according to that information. On the one hand, the virtual environment can be controlled directly from the brain wave signal without any limb movement or manual instruction, so that the person's control of the virtual environment is no longer confined to the expressive range of limb movements or manual instructions. On the other hand, by analyzing the brain wave signal the system actively perceives the person's emotion cognition, so that the virtual environment no longer merely responds passively to the person's control but actively provides interaction feedback according to the emotion-cognition information; interaction is also no longer limited to images and touch, since the emotional state and the attention state can likewise serve as interaction channels.
Embodiment 2:
Fig. 2 is a flowchart of the method for content interaction in virtual reality provided by Embodiment 2 of the present invention. The method comprises steps S201 to S205, described in detail as follows.
S201: acquire a brain wave signal.
A brain wave (electroencephalography, EEG) signal is the externally measurable change in electrical potential produced by the brain's neuro-electrical activity; the recorded brain activity is the overall reflection, at the cerebral cortex or the scalp surface, of the bioelectrical activity of the brain's nerve cells.
Specifically, the brain wave signal is obtained through detection by an EEG detection device.
The EEG detection device may be mounted in the head-fixing part of the virtual reality device, for example in the head strap, or it may be a stand-alone detection device.
S202: filter the brain wave signal to obtain a stable signal from which interference has been filtered out.
The purpose of the filtering is to remove noise and interference from the brain wave signal and obtain a relatively clean and pure stable signal.
Specifically, the filtering includes but is not limited to the following processing means, which may be used individually or in combination (a sketch of two of them follows the list):
removing random interference by time-domain averaging (superposition);
amplifying the main brain wave components and filtering out other interference by principal component analysis;
amplifying the brain wave signal in a particular frequency band and filtering out unrelated parts by low-pass, high-pass, and band-pass filtering;
removing strong narrow-band interference, such as 50 Hz-60 Hz mains-frequency components, with a notch filter.
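A minimal sketch of the band-pass and notch steps using SciPy follows; the 1-45 Hz pass band, the 50 Hz notch, and the 256 Hz sampling rate are illustrative assumptions rather than values prescribed by the patent.

    # Minimal sketch: band-pass plus mains-notch filtering of one EEG channel.
    from scipy.signal import butter, filtfilt, iirnotch

    def clean_eeg(raw, fs=256.0):
        # Band-pass 1-45 Hz: removes slow drift and high-frequency muscle noise.
        b_bp, a_bp = butter(4, [1.0, 45.0], btype="bandpass", fs=fs)
        x = filtfilt(b_bp, a_bp, raw)
        # Notch at 50 Hz mains interference (use 60 Hz where applicable).
        b_n, a_n = iirnotch(w0=50.0, Q=30.0, fs=fs)
        return filtfilt(b_n, a_n, x)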
S203: extract, from the stable signal, brain wave features from which emotion-cognition information can be identified.
Feature extraction from the brain wave signal includes but is not limited to the following approaches, which may be used individually or in combination (a band-power sketch follows the list):
converting the time-domain signal into a frequency-domain signal by the Fourier transform, for analysis of brain wave energy;
using the wavelet transform to serve the same role as the Fourier transform, but with higher computational accuracy in the low-frequency region, so that fine low-frequency features can be extracted;
obtaining the variation characteristics of the brain wave signal by the Hilbert transform;
amplifying the differences between different brain wave signal intervals by common spatial pattern (CSP) spatial filtering, so that subtle differences in the brain wave signal can be identified and classified.
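As one illustration of the Fourier-based approach, the sketch below computes average power in the conventional delta/theta/alpha/beta bands with Welch's method; the band boundaries are the usual textbook values and are assumptions, not figures given in the patent.

    # Minimal sketch: band-power features from one cleaned EEG channel.
    import numpy as np
    from scipy.signal import welch

    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(signal, fs=256.0):
        freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
        features = {}
        for name, (lo, hi) in BANDS.items():
            mask = (freqs >= lo) & (freqs < hi)
            features[name] = float(np.trapz(psd[mask], freqs[mask]))  # integrated power
        return features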
Specifically, brain wave features are extracted from the stable signal by feature extraction; these features carry concrete emotion-cognition characteristics from which the corresponding emotion-cognition information can be identified.
Further, the emotion-cognition information may include emotional-state information, attention-focus information, intent control information, and the like.
S204: analyze the brain wave features using a feature model to obtain the emotion-cognition information corresponding to the brain wave features, the feature model being a model of the correspondence between emotion-cognition information and brain wave features.
The feature model is a model of the correspondence between emotion-cognition information and brain wave features, built by a pattern recognition algorithm from the brain wave features extracted by analyzing brain wave signals. Pattern recognition algorithms include but are not limited to neural network analysis (ANN), support vector machines (SVM), linear regression, logistic regression, and similarity clustering. These algorithms may be used alone, simultaneously, or in succession to improve the recognition speed and recognition accuracy.
Specifically, the feature model is used to analyze the brain wave features extracted in step S203. Further, the model analysis of the brain wave features may use a supervised learning process and/or an unsupervised learning process to obtain the emotion-cognition information corresponding to the brain wave features. The supervised learning process is a model analysis process based on predefined rules and brain wave categories; it requires the feature model to collect a certain number of known brain wave features in advance, for example the brain wave features of 100 people in a happy state, and to establish, through pattern matching, classification and matching rules between brain wave features and emotion-cognition information. When the feature model is then used to analyze brain wave features, the features are analyzed and classified according to the rules established by the model, yielding the emotion-cognition information corresponding to the brain wave features. The unsupervised learning process is a similarity comparison and induction process based on brain wave states; it does not require the feature model to pre-build classification rules, but instead compares the similarity of the brain wave features to be analyzed and clusters them automatically. Because the clustering is automatic, the criteria of the clustered brain wave features are analyzed further, so that valuable emotion-cognition information is obtained from the brain wave features to be analyzed.
It should be noted that the feature model has an adaptive capability: as the correspondences between brain wave features and emotion-cognition information in the feature model keep accumulating, the model keeps improving, so the recognition efficiency and accuracy for brain wave features also improve continuously. In particular, once correspondences accumulated from big data are used in the feature model, its analysis results for brain wave features become more and more accurate.
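A minimal sketch of the two learning paths using scikit-learn follows: an SVM classifier stands in for the supervised process and k-means for the unsupervised similarity clustering. The library choice, the label names, and the placeholder feature data are illustrative assumptions; the patent does not prescribe a specific implementation.

    # Minimal sketch: supervised and unsupervised analysis of band-power features.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.cluster import KMeans
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Supervised path: features labelled in advance (e.g. "happy" vs "calm").
    X_train = np.random.rand(200, 4)               # placeholder band-power vectors
    y_train = np.random.choice(["happy", "calm"], size=200)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_train, y_train)

    def classify(features):
        """Map one feature vector to an emotional-state label."""
        return clf.predict(np.atleast_2d(features))[0]

    # Unsupervised path: group unlabelled features by similarity, then inspect
    # each cluster to decide what cognitive state it represents.
    X_new = np.random.rand(50, 4)
    clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X_new)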
S205: perform corresponding content interaction on the virtual environment according to the emotion-cognition information.
Specifically, the virtual reality device responds to the emotion-cognition information obtained from the analysis in step S204 and performs content interaction on the virtual environment; through this content interaction, both passive response and active feedback of the virtual environment are realized.
Further, the content interaction may include adjusting the content of the virtual environment according to the emotional-state information and the attention-focus information, and controlling an object in the virtual environment according to the intent control information.
Adjusting the content of the virtual environment according to the emotional-state information and the attention-focus information belongs to the active feedback of the virtual environment. For example, menu interaction is carried out through the attention focus; or items of interest to the user are identified and subjected to background big-data analysis, including data mining and psychological analysis; or the weather, environment, background music, and other aspects of the virtual environment are adjusted according to the person's emotional state, to soothe the person's mood and make the person more relaxed and comfortable, or more awake and alert. Controlling an object in the virtual environment according to the intent control information belongs to the passive response of the virtual environment, so that, without any limb movement, control of the virtual environment is realized by perceiving the person's intent, for example controlling the speed and direction of a virtual entity or vehicle in the virtual environment.
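How the emotion-cognition information is mapped to concrete scene changes is left open by the patent. The sketch below shows one way such a dispatch could look; all state names, scene parameters, and the scene/vehicle APIs are hypothetical illustrations.

    # Minimal sketch: dispatch parsed emotion-cognition information to scene updates.
    from dataclasses import dataclass

    @dataclass
    class SceneAdjustment:
        weather: str
        music: str
        light_level: float   # 0.0 (dark) .. 1.0 (bright)

    ACTIVE_FEEDBACK = {
        "stressed": SceneAdjustment(weather="clear", music="slow_ambient", light_level=0.4),
        "drowsy":   SceneAdjustment(weather="windy", music="upbeat",       light_level=0.9),
        "happy":    SceneAdjustment(weather="sunny", music="none",         light_level=0.7),
    }

    def apply_emotion_feedback(emotional_state, scene):
        """Active feedback: adjust the scene from the recognized emotional state."""
        adjustment = ACTIVE_FEEDBACK.get(emotional_state)
        if adjustment is not None:
            scene.set_weather(adjustment.weather)   # scene API assumed, not specified
            scene.play_music(adjustment.music)
            scene.set_light(adjustment.light_level)

    def apply_intent_control(intent_vector, vehicle):
        """Passive response: steer a virtual vehicle from decoded intent."""
        heading_delta, speed_delta = intent_vector
        vehicle.turn(heading_delta)                 # vehicle API assumed, not specified
        vehicle.accelerate(speed_delta)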
Further, the content interaction may also be community interaction between people using virtual reality devices of the same type, such as mutual communication of interests and emotional states, so that interaction between people is no longer limited to images and touch; the emotional state and the attention state can likewise serve as interaction channels.
Further, by performing big-data mining on the collected brain wave data and making big-data predictions about the person's personality, interests, and so on, the feedback interaction behavior can be made more precise and more varied.
In this embodiment, the acquired brain wave signal is filtered and brain wave features are extracted, the features are analyzed using the feature model to obtain the corresponding emotion-cognition information, and content interaction with the virtual environment is carried out according to that information, realizing both passive response and active feedback of the virtual environment. On the one hand, an object in the virtual environment is controlled according to the intent control information, so that control of the virtual environment is realized by perceiving the person's intent without any limb movement, and the person's control of the virtual environment is no longer confined to the expressive range of limb movements or manual instructions. On the other hand, the content of the virtual environment is adjusted according to the emotional-state information and the attention-focus information, so that the virtual environment is no longer limited to passively responding to the person's control but actively provides interaction feedback by perceiving the person's emotional state and attention focus; interaction is also no longer limited to images and touch, since the emotional state and the attention state can likewise serve as interaction channels.
Embodiment 3:
Fig. 3 is a structural diagram of the device for content interaction in virtual reality provided by Embodiment 3 of the present invention; for ease of explanation, only the parts related to this embodiment of the invention are shown. The device illustrated in Fig. 3 may be the executing entity of the method for content interaction in virtual reality provided by Embodiment 1. The device mainly comprises an acquisition module 31, a parsing module 32, and an interaction module 33 (a minimal software skeleton follows the module list). Each functional module is described in detail as follows:
the acquisition module 31 is configured to acquire a brain wave signal;
the parsing module 32 is configured to parse the brain wave signal and obtain the emotion-cognition information corresponding to the brain wave signal;
the interaction module 33 is configured to perform corresponding content interaction on the virtual environment according to the emotion-cognition information.
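A minimal software skeleton of the three-module device, tying the earlier sketches together; the class and method names are illustrative assumptions, not terms defined by the patent.

    # Minimal sketch: the acquisition / parsing / interaction modules as plain classes.
    class AcquisitionModule:
        def acquire(self):
            """Return a window of raw EEG samples (e.g. via the LSL sketch above)."""
            raise NotImplementedError

    class ParsingModule:
        def parse(self, raw_eeg):
            """Filter, extract features, run the feature model; return emotion-cognition info."""
            raise NotImplementedError

    class InteractionModule:
        def interact(self, emotion_cognition, scene):
            """Apply active feedback and passive response to the virtual environment."""
            raise NotImplementedError

    class ContentInteractionDevice:
        def __init__(self, acquisition, parsing, interaction):
            self.acquisition = acquisition
            self.parsing = parsing
            self.interaction = interaction

        def step(self, scene):
            raw = self.acquisition.acquire()
            info = self.parsing.parse(raw)
            self.interaction.interact(info, scene)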
For the process by which each module of the device provided in this embodiment realizes its function, reference is made to the description of the embodiment illustrated in Fig. 1 above, which is not repeated here.
As can be seen from the device for content interaction in virtual reality illustrated in Fig. 3, in this embodiment the acquired brain wave signal is parsed to obtain the corresponding emotion-cognition information, and content interaction with the virtual environment is carried out according to that information. On the one hand, the virtual environment can be controlled directly from the brain wave signal without any limb movement or manual instruction, so that the person's control of the virtual environment is no longer confined to the expressive range of limb movements or manual instructions. On the other hand, by analyzing the brain wave signal the system actively perceives the person's emotion cognition, so that the virtual environment no longer merely responds passively to the person's control but actively provides interaction feedback according to the emotion-cognition information; interaction is also no longer limited to images and touch, since the emotional state and the attention state can likewise serve as interaction channels.
Embodiment 4:
Fig. 4 is a structural diagram of the device for content interaction in virtual reality provided by Embodiment 4 of the present invention; for ease of explanation, only the parts related to this embodiment of the invention are shown. The device illustrated in Fig. 4 may be the executing entity of the method for content interaction in virtual reality provided by Embodiment 2. The device mainly comprises an acquisition module 41, a parsing module 42, and an interaction module 43. Each functional module is described in detail as follows:
the acquisition module 41 is configured to acquire a brain wave signal;
the parsing module 42 is configured to parse the brain wave signal and obtain the emotion-cognition information corresponding to the brain wave signal;
the interaction module 43 is configured to perform corresponding content interaction on the virtual environment according to the emotion-cognition information.
Further, the emotion-cognition information includes emotional-state information, attention-focus information, and intent control information.
Further, the parsing module 42 comprises:
a filtering submodule 421, configured to filter the brain wave signal and obtain a stable signal from which interference has been filtered out;
an extraction submodule 422, configured to extract, from the stable signal, brain wave features from which emotion-cognition information can be identified;
a matching submodule 423, configured to match the brain wave features against the feature model and obtain the emotion-cognition information corresponding to the brain wave features, the feature model comprising a predetermined number of brain wave features with known corresponding emotion-cognition information.
Further, the matching submodule 423 is also configured to:
match the brain wave features against the feature model through a supervised learning process and/or an unsupervised learning process, the supervised learning process being a model analysis process based on predefined rules and brain wave categories, and the unsupervised learning process being a similarity comparison and induction process based on brain wave states.
Further, the interaction module 43 comprises:
an adjustment submodule 431, configured to adjust the content of the virtual environment according to the emotional-state information and the attention-focus information;
a control submodule 432, configured to control an object in the virtual environment according to the intent control information.
For the process by which each module of the device provided in this embodiment realizes its function, reference is made to the description of the embodiment illustrated in Fig. 2 above, which is not repeated here.
As can be seen from the device for content interaction in virtual reality illustrated in Fig. 4, in this embodiment the acquired brain wave signal is filtered and brain wave features are extracted, the features are analyzed using the feature model to obtain the corresponding emotion-cognition information, and content interaction with the virtual environment is carried out according to that information, realizing both passive response and active feedback of the virtual environment. On the one hand, an object in the virtual environment is controlled according to the intent control information, so that control of the virtual environment is realized by perceiving the person's intent without any limb movement, and the person's control of the virtual environment is no longer confined to the expressive range of limb movements or manual instructions. On the other hand, the content of the virtual environment is adjusted according to the emotional-state information and the attention-focus information, so that the virtual environment is no longer limited to passively responding to the person's control but actively provides interaction feedback by perceiving the person's emotional state and attention focus; interaction is also no longer limited to images and touch, since the emotional state and the attention state can likewise serve as interaction channels.
It should be noted that the embodiments in this specification are described progressively: each embodiment emphasizes its differences from the other embodiments, and the same or similar parts of the embodiments can be cross-referenced. Because the device embodiments are substantially similar to the method embodiments, their description is relatively brief, and relevant parts can be found in the corresponding parts of the method embodiments.
It should also be noted that in the above device embodiments, the included modules are divided merely according to functional logic, and the division is not limited thereto as long as the corresponding functions can be realized; in addition, the specific names of the functional modules are only for ease of distinguishing them from one another and do not limit the protection scope of the present invention.
A person skilled in the art will appreciate that all or part of the steps of the methods in the above embodiments may be completed by a program instructing the relevant hardware; the corresponding program may be stored in a computer-readable storage medium such as a ROM/RAM, a magnetic disk, or an optical disc.
The foregoing describes only preferred embodiments of the present invention and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. A method for content interaction in virtual reality, characterized by comprising:
acquiring a brain wave signal;
parsing the brain wave signal to obtain emotion-cognition information corresponding to the brain wave signal; and
performing corresponding content interaction on a virtual environment according to the emotion-cognition information.
2. The method for content interaction in virtual reality according to claim 1, characterized in that the emotion-cognition information comprises emotional-state information, attention-focus information, and intent control information.
3. The method for content interaction in virtual reality according to claim 1 or 2, characterized in that parsing the brain wave signal to obtain the corresponding emotion-cognition information comprises:
filtering the brain wave signal to obtain a stable signal from which interference has been filtered out;
extracting, from the stable signal, brain wave features from which emotion-cognition information can be identified; and
analyzing the brain wave features using a feature model to obtain the emotion-cognition information corresponding to the brain wave features, the feature model being a model of the correspondence between emotion-cognition information and brain wave features.
4. The method for content interaction in virtual reality according to claim 3, characterized in that analyzing the brain wave features using the feature model comprises:
performing model analysis on the brain wave features through a supervised learning process and/or an unsupervised learning process, the supervised learning process being a model analysis process based on predefined rules and brain wave categories, and the unsupervised learning process being a similarity comparison and induction process based on brain wave states.
5. The method for content interaction in virtual reality according to claim 2, characterized in that performing corresponding content interaction on the virtual environment according to the emotion-cognition information comprises:
adjusting the content of the virtual environment according to the emotional-state information and the attention-focus information; and
controlling an object in the virtual environment according to the intent control information.
6. A device for content interaction in virtual reality, characterized by comprising:
an acquisition module, configured to acquire a brain wave signal;
a parsing module, configured to parse the brain wave signal and obtain emotion-cognition information corresponding to the brain wave signal; and
an interaction module, configured to perform corresponding content interaction on a virtual environment according to the emotion-cognition information.
7. The device for content interaction in virtual reality according to claim 6, characterized in that the emotion-cognition information comprises emotional-state information, attention-focus information, and intent control information.
8. The device for content interaction in virtual reality according to claim 6 or 7, characterized in that the parsing module comprises:
a signal filtering submodule, configured to filter the brain wave signal and obtain a stable signal from which interference has been filtered out;
a feature extraction submodule, configured to extract, from the stable signal, brain wave features from which emotion-cognition information can be identified; and
a model analysis submodule, configured to analyze the brain wave features using a feature model and obtain the emotion-cognition information corresponding to the brain wave features, the feature model being a model of the correspondence between emotion-cognition information and brain wave features.
9. The device for content interaction in virtual reality according to claim 8, characterized in that the model analysis submodule is further configured to:
perform model analysis on the brain wave features through a supervised learning process and/or an unsupervised learning process, the supervised learning process being a model analysis process based on predefined rules and brain wave categories, and the unsupervised learning process being a similarity comparison and induction process based on brain wave states.
10. The device for content interaction in virtual reality according to claim 7, characterized in that the interaction module comprises:
an adjustment submodule, configured to adjust the content of the virtual environment according to the emotional-state information and the attention-focus information; and
a control submodule, configured to control an object in the virtual environment according to the intent control information.
CN201610416809.1A 2016-06-14 2016-06-14 Method and device for content interaction in virtual reality Pending CN106560765A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610416809.1A CN106560765A (en) 2016-06-14 2016-06-14 Method and device for content interaction in virtual reality
PCT/CN2016/103849 WO2017215177A1 (en) 2016-06-14 2016-10-28 Method and device for content interaction in virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610416809.1A CN106560765A (en) 2016-06-14 2016-06-14 Method and device for content interaction in virtual reality

Publications (1)

Publication Number Publication Date
CN106560765A true CN106560765A (en) 2017-04-12

Family

ID=58485697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610416809.1A Pending CN106560765A (en) 2016-06-14 2016-06-14 Method and device for content interaction in virtual reality

Country Status (2)

Country Link
CN (1) CN106560765A (en)
WO (1) WO2017215177A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103976733A (en) * 2014-05-21 2014-08-13 蓝江涌 Multi-passage brain wave control glasses
US10120413B2 (en) * 2014-09-11 2018-11-06 Interaxon Inc. System and method for enhanced training using a virtual reality environment and bio-signal data
CN104615243A (en) * 2015-01-15 2015-05-13 深圳市掌网立体时代视讯技术有限公司 Head-wearable type multi-channel interaction system and multi-channel interaction method
CN104820500A (en) * 2015-05-31 2015-08-05 仲佳 Virtual reality helmet with electroencephalogram control function

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101234224A (en) * 2008-01-29 2008-08-06 河海大学 Method for using virtual reality technique to help user executing training rehabilitation
CN103190124A (en) * 2010-08-30 2013-07-03 迪士尼企业公司 Contextual chat based on behavior and usage
CN104750241A (en) * 2013-12-26 2015-07-01 财团法人工业技术研究院 Head-mounted device and related simulation system and simulation method thereof
CN204990187U (en) * 2015-09-16 2016-01-20 陈包容 Take brain wave control function's virtual reality helmet
CN105302297A (en) * 2015-09-16 2016-02-03 国网山东东营市东营区供电公司 Cell-phone interacting method via brain wave Bluetooth earphone

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107193378A (en) * 2017-05-20 2017-09-22 吉林大学 Emotion decision maker and method based on brain wave machine learning
CN107436932A (en) * 2017-07-17 2017-12-05 小草数语(北京)科技有限公司 Data digging method and system with virtual reality interaction
CN107436932B (en) * 2017-07-17 2020-11-13 绿湾网络科技有限公司 Data mining method and system with virtual reality interaction
CN108399006B (en) * 2018-02-11 2020-06-02 Oppo广东移动通信有限公司 Signal processing method and related product
CN108399006A (en) * 2018-02-11 2018-08-14 广东欧珀移动通信有限公司 Signal processing method and related product
CN108363530A (en) * 2018-02-13 2018-08-03 广东欧珀移动通信有限公司 Electronic device, method for playing music and related product
CN109965871A (en) * 2019-03-22 2019-07-05 中国科学院上海高等研究院 Analysis method, system, medium and the equipment of brain-computer interface signal
CN109965871B (en) * 2019-03-22 2022-02-11 中国科学院上海高等研究院 Method, system, medium, and apparatus for analyzing brain-computer interface signal
CN110008874A (en) * 2019-03-25 2019-07-12 联想(北京)有限公司 Data processing method and its device, computer system and readable medium
CN110008874B (en) * 2019-03-25 2021-05-18 联想(北京)有限公司 Data processing method and device, computer system and readable medium
CN110188836A (en) * 2019-06-21 2019-08-30 西安交通大学 A kind of brain function network class method based on variation self-encoding encoder
CN110188836B (en) * 2019-06-21 2021-06-11 西安交通大学 Brain function network classification method based on variational self-encoder
CN110784746A (en) * 2019-11-07 2020-02-11 成都威爱新经济技术研究院有限公司 Brain cognitive competence AI training system based on VR and electroencephalogram biofeedback technology
CN110784746B (en) * 2019-11-07 2021-07-13 成都威爱新经济技术研究院有限公司 Brain cognitive competence AI training system based on VR and electroencephalogram biofeedback technology
CN111984123A (en) * 2020-08-19 2020-11-24 北京鲸世科技有限公司 Electroencephalogram data interaction method and device
CN113619607A (en) * 2021-09-17 2021-11-09 合众新能源汽车有限公司 Control method and control system for automobile running

Also Published As

Publication number Publication date
WO2017215177A1 (en) 2017-12-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170412)