CN115373519A - Electroencephalogram data interactive display method, device and system and computer equipment - Google Patents

Electroencephalogram data interactive display method, device and system and computer equipment

Info

Publication number
CN115373519A
Authority
CN
China
Prior art keywords
target
display
electroencephalogram data
data identification
display element
Prior art date
Legal status
Pending
Application number
CN202211291012.5A
Other languages
Chinese (zh)
Inventor
郭倩
王慧宇
王博
王晓岸
Current Assignee
Beijing Brain Up Technology Co ltd
Original Assignee
Beijing Brain Up Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Brain Up Technology Co., Ltd.
Priority to CN202211291012.5A
Publication of CN115373519A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A61B 5/375 Electroencephalography [EEG] using biofeedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A61B 5/384 Recording apparatus or displays specially adapted therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Informatics (AREA)
  • Psychiatry (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • Psychology (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Technology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • Dermatology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses an electroencephalogram data interactive display method, device, system, computer equipment and storage medium. The method includes: acquiring an electroencephalogram data identification sequence; determining a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence; acquiring scene display information of a target game scene and first display information corresponding to the target display element, and outputting and displaying the target game scene and the target display element according to the scene display information and the first display information; and, when the target electroencephalogram data identification result differs from the next electroencephalogram data identification result, taking the next electroencephalogram data identification result as the new target electroencephalogram data identification result and returning to the step of determining the target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence. The method thereby realizes interactive display of electroencephalogram data identification results.

Description

Electroencephalogram data interactive display method, device and system and computer equipment
Technical Field
The invention relates to the technical field of data processing, in particular to an electroencephalogram data interactive display method, device and system and computer equipment.
Background
A brain-computer interface (BCI) is a connection pathway established directly between the brain (or brain cells) and an external device, and brain-computer interface technology realizes information interaction between the human brain and external devices on the basis of this interface.
In current brain-computer interface technology, after electroencephalogram data are generated by the human brain (or brain cells) of a target object, a brain-computer device collects the data and transmits them, in the form of a signal, to an external device (e.g., a computer device) through a brain-computer interface. The external device then analyzes and identifies the electroencephalogram data contained in the signal, determines a corresponding electroencephalogram data identification result (for example, an emotion identification result or a concentration identification result), and feeds the identification result back to the target object, thereby realizing information exchange between the brain and the external device.
However, because an electroencephalogram data identification result is abstract and not directly perceivable, when the external device feeds the result back to the target object, a technician is usually required to adjust the emotional state of the target object based on the abstract information that the result represents. The target object cannot interact autonomously with the external device on the basis of the identification result, so current brain-computer interface technology lacks interactivity.
Disclosure of Invention
The technical problem to be solved by the embodiment of the invention is how to realize interactive display of the electroencephalogram data identification result.
In a first aspect, the present application provides an electroencephalogram data interactive display method, including:
acquiring an electroencephalogram data identification sequence; the electroencephalogram data identification sequence comprises a plurality of electroencephalogram data identification results;
determining a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence;
scene display information of a target game scene and first display information corresponding to the target display elements are obtained, and the target game scene and the target display elements are output and displayed according to the scene display information and the first display information; the first display information comprises display tone information and display state information used for reflecting the target display element;
and under the condition that the current target electroencephalogram data identification result in the electroencephalogram data identification sequence is different from the next electroencephalogram data identification result, determining the next electroencephalogram data identification result as a new target electroencephalogram data identification result, and executing the step of determining a target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
In one embodiment, before acquiring the brain electrical data identification sequence, the method further comprises:
acquiring scene display information of a target game scene, and displaying the scene display information on a display page; the scene display information is used for reflecting the game scene of the target game scene.
In one embodiment, the acquiring the electroencephalogram data identification sequence includes:
acquiring an electroencephalogram data stream, and identifying and processing the electroencephalogram data stream to obtain an emotion sequence and a concentration sequence; the emotion sequence comprises a plurality of emotion category data, and the concentration sequence comprises a plurality of concentration values;
carrying out sequence judgment on the emotion sequence, dividing a plurality of emotion category data contained in the emotion sequence into a plurality of emotion category groups, and calculating the average emotion of the emotion category data contained in each emotion category group to obtain an emotion recognition result corresponding to the emotion category group;
dividing the concentration values included in the concentration degree sequence into a plurality of concentration degree calculation units, calculating an average value of the concentration values included in each concentration degree calculation unit, and obtaining a concentration degree identification result corresponding to the concentration degree calculation unit;
and obtaining a plurality of electroencephalogram data identification results according to the emotion identification results and the concentration identification results.
In one embodiment, the electroencephalogram data recognition result includes an emotion recognition result and a concentration recognition result, and the determining a target display element corresponding to a target electroencephalogram data recognition result in the electroencephalogram data recognition sequence includes:
determining a target concentration level of the concentration recognition result, and determining a target display element level in a preset corresponding relation between the concentration level and the display element level based on the target concentration level;
and determining the target display element according to the target display element grade and the emotion recognition result.
In one embodiment, the determining a target display element according to the target display element level and the emotion recognition result includes:
determining a target display state corresponding to the target display element grade based on the corresponding relation between the display element grade and the display state; the target display state represents a display effect of a display element;
determining a target display tone corresponding to the emotion recognition result based on the corresponding relation between the emotion recognition result and the display tone;
and determining a target display element containing the display state information and the display tone information according to the display state information corresponding to the target display state and the display tone information corresponding to the target display tone.
In one embodiment, the first display information includes display tone information and display state information for reflecting the target display element, and the outputting and displaying the scene display information and the first display information includes:
responding to the triggering operation of the target electroencephalogram data identification result, and outputting and displaying the first display information and the scene display information based on a preset display rule, so that the target game scene reflected by the scene display information and the target display element reflected by the first display information are displayed in a display page.
In a second aspect, an electroencephalogram data interactive display device is provided, the device comprising:
the first acquisition module is used for acquiring an electroencephalogram data identification sequence;
the determining module is used for determining a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence;
the second acquisition module is used for acquiring scene display information of a target game scene and first display information corresponding to the target display element, and outputting and displaying the target game scene and the target display element according to the scene display information and the first display information; the first display information comprises display tone information and display state information used for reflecting the target display element;
and the judgment determining module is used for determining the next electroencephalogram data identification result as a new target electroencephalogram data identification result under the condition that the current target electroencephalogram data identification result in the electroencephalogram data identification sequence is different from the next electroencephalogram data identification result, and executing the step of determining a target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
In a third aspect, an electroencephalogram data interactive display system is provided, the system comprising:
the brain-computer equipment is used for acquiring an electroencephalogram data stream of a target object;
computer means for determining, from said stream of brain electrical data, a result of brain electrical data recognition of said target object and for performing the steps of the method of any one of the first aspect;
and the display is used for displaying the scene display information output by the computer equipment and the first display information corresponding to the target display element.
In a fourth aspect, the application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the first aspect when executing the computer program.
In a fifth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps of the first aspect described above.
In a sixth aspect, the present application further provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the steps of the first aspect described above.
According to the electroencephalogram data interactive display method, device, system, computer equipment, storage medium and computer program product described above, the computer device acquires an electroencephalogram data identification sequence containing a plurality of electroencephalogram data identification results; determines a target display element corresponding to a target electroencephalogram data identification result in the sequence; acquires scene display information of a target game scene and first display information corresponding to the target display element, the first display information including display tone information and display state information reflecting the target display element; and outputs and displays the target game scene and the target display element according to the scene display information and the first display information. When the current target electroencephalogram data identification result in the sequence differs from the next electroencephalogram data identification result, the next result is taken as the new target electroencephalogram data identification result and the step of determining the corresponding target display element is executed again. In other words, the current electroencephalogram data identification result determines the target display element currently shown; the first display information of that element and the scene display information of the target game scene are obtained and displayed, and the next identification result is then compared with the current one. If they differ, a second target display element is determined from the next identification result, its second display information is obtained, and the second display information and the scene display information are output and displayed. Target display elements are thus switched as the electroencephalogram data identification results change, realizing display interaction with the user through display elements driven by the user's own electroencephalogram data identification results.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
The invention will be more clearly understood from the following detailed description, taken with reference to the accompanying drawings, in which:
FIG. 1 is a diagram of an interactive display system for electroencephalogram data in one embodiment;
FIG. 2 is a diagram of an application environment of an interactive display method of electroencephalogram data in one embodiment;
FIG. 3 is a schematic flowchart of an electroencephalogram data interactive display method in one embodiment;
FIG. 4 is a display page diagram of the presentation of scene information for a game scene in one embodiment;
FIG. 5 is a flowchart illustrating a step of acquiring a sequence of electroencephalogram data in one embodiment;
FIG. 6 is a flow diagram illustrating the steps of determining a target display element in one embodiment;
FIG. 7 is a flowchart illustrating the steps for determining a target display element containing display state information and display hue information in one embodiment;
FIG. 8 is a diagram of a display page showing a target display element, in one embodiment;
FIG. 9 is a block diagram of an interactive display device for electroencephalogram data in one embodiment;
fig. 10 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
Meanwhile, it should be understood that, for convenience of description, the sizes of the various parts shown in the drawings are not drawn to scale.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be discussed further in subsequent figures.
Embodiments of the invention are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the computer system/server include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
The computer system/server may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
The electroencephalogram data interactive display method provided by the embodiments of the present application can be applied to the electroencephalogram data interactive display system 100 shown in fig. 1. The electroencephalogram data interactive display system 100 comprises a brain-computer device 102, a computer device 104 and a display 106. The brain-computer device 102 communicates with the computer device 104 through a network, Bluetooth or another communication mode, and the computer device 104 is connected with the display 106 in a wireless or wired manner, which is not limited in the embodiments of the present application. Specifically, as shown in fig. 2, the brain-computer device 102 is worn by a target object facing the display 106, and the brain-computer device 102 is used to acquire an electroencephalogram data stream of the target object. The computer device 104 (not shown in fig. 2) is used to acquire the electroencephalogram data identification sequence and then determine the target display element corresponding to the target electroencephalogram data identification result in the sequence. The computer device 104 is further configured to acquire scene display information of the target game scene and first display information corresponding to the target display element, and to output and display the target game scene and the target display element according to the scene display information and the first display information. When the current target electroencephalogram data identification result in the sequence differs from the next electroencephalogram data identification result, the computer device 104 takes the next identification result as the new target electroencephalogram data identification result and repeats the step of determining the target display element corresponding to the target electroencephalogram data identification result. The display 106 is used to display the scene display information output by the computer device and the first display information corresponding to the target display element. Optionally, the computer device 104 may itself include a display, in which case no separate display 106 is needed, and the electroencephalogram data interactive display system 100 may instead consist of the brain-computer device 102 and a computer device 104 with a built-in display. By displaying the game scene corresponding to the scene display information and the target display element corresponding to the first display information, the system reflects, in a figurative way, how the concentration recognition results and emotion recognition results in the electroencephalogram data change over time, realizing display interaction between the electroencephalogram data and the displayed objects.
In one embodiment, as shown in fig. 3, an electroencephalogram data interactive display method is provided, which is described by taking a computer device applied to the electroencephalogram data interactive display system in fig. 1 as an example, and includes the following steps:
step 302, acquiring an electroencephalogram data identification sequence.
The electroencephalogram data identification sequence comprises a plurality of electroencephalogram data identification results.
In implementation, the brain-computer device acquires electroencephalogram data of a wearer (also referred to as a target object) in real time through a brain-computer interface technology (BCI), generates an electroencephalogram data stream, and then transmits the electroencephalogram data stream to the computer device through bluetooth or a network or the like. The computer equipment acquires the electroencephalogram data stream, performs feature extraction on the electroencephalogram data of the target object according to a preset electroencephalogram feature extraction algorithm to obtain the electroencephalogram features of the target object, and then performs recognition processing on the electroencephalogram features according to a preset feature recognition algorithm to obtain an electroencephalogram data recognition result of the target object, so as to obtain an electroencephalogram data recognition sequence. The computer equipment performs characteristic analysis on the current target object based on the electroencephalogram data identification sequence, so that interactive display of the electroencephalogram data is realized.
And 304, determining a target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
In implementation, the computer device stores in advance a plurality of correspondences involving electroencephalogram data identification results, and determines the target display element corresponding to the current target electroencephalogram data identification result according to these correspondences. The target display element may be of various types, for example a flower, a tree, an animal or a building; the type of the display element is not limited in the embodiments of the present application.
Optionally, every electroencephalogram data identification result in the electroencephalogram data identification sequence may serve as the target electroencephalogram data identification result in turn; that is, when the sequence is processed, the identification results are processed one by one in their order in the sequence, each being taken as the target electroencephalogram data identification result when its turn comes.
Step 306, obtaining scene display information of the target game scene and first display information corresponding to the target display element, and outputting and displaying the target game scene and the target display element according to the scene display information and the first display information.
The first display information comprises display tone information and display state information used for reflecting the target display element.
In implementation, different target display elements can be displayed in corresponding different game scenes, so after the target display elements are determined, the computer device obtains scene display information corresponding to the target game scenes and first display information corresponding to the target display elements, and then pushes the scene display information and the first display information to the display so as to output and display the scene display information and the first display information through the display.
And 308, under the condition that the current target electroencephalogram data identification result in the electroencephalogram data identification sequence is different from the next electroencephalogram data identification result, determining the next electroencephalogram data identification result as a new target electroencephalogram data identification result, and executing the step of determining the target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
In implementation, since the electroencephalogram data identification results in the sequence are processed one after another, the identification result following the target electroencephalogram data identification result is processed for display next. If the target electroencephalogram data identification result differs from this next identification result, the computer device takes the next identification result as the new target electroencephalogram data identification result and continues with step 304 and the subsequent steps. By continually switching to new target electroencephalogram data identification results, different target display elements are shown on the display. The specific implementation of step 304 has been described above and is not repeated here.
In the other case, that is, when the target electroencephalogram data identification result is determined to be the same as the next identification result, the game scene shown on the display and the target elements in it do not change: the current display is maintained and no re-display processing is performed for the next identification result.
In the electroencephalogram data interactive display method above, the computer device acquires an electroencephalogram data identification sequence, determines the target display element corresponding to the target electroencephalogram data identification result in the sequence, acquires scene display information of the target game scene and first display information corresponding to the target display element, and outputs and displays the target game scene and the target display element according to the scene display information and the first display information. Then, when the current target electroencephalogram data identification result in the sequence differs from the next electroencephalogram data identification result, the computer device takes the next identification result as the new target electroencephalogram data identification result and executes the step of determining the corresponding target display element again. In this way, the current electroencephalogram data identification result determines which target display element is displayed; whenever the next identification result differs from the current one, it becomes the new target identification result and a new target display element is determined and shown. The displayed element therefore switches as the identification results change, and interactive display of electroencephalogram data identification results through target display elements is realized.
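The switching logic just described can be illustrated with a short sketch. This is only a minimal illustration in Python, not code from the patent: `RecognitionResult`, `get_display_element` and `render` are hypothetical placeholders standing in for the identification result, the element-lookup step (step 304) and the output step (step 306).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RecognitionResult:
    emotion: str          # emotion recognition result, e.g. "happy"
    concentration: float  # concentration recognition result, e.g. an averaged value

def interactive_display(recognition_sequence, scene_info, get_display_element, render):
    """Walk the electroencephalogram data identification sequence and switch the
    displayed target element only when the identification result changes."""
    target = None
    for result in recognition_sequence:
        if result == target:
            continue                                        # same result: keep the current display
        target = result                                     # new target identification result
        first_display_info = get_display_element(target)    # step 304: determine target display element
        render(scene_info, first_display_info)              # step 306: output scene + element
```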
In one embodiment, before step 302, the electroencephalogram data interactive display method further includes:
and acquiring scene display information of the target game scene, and displaying the scene display information on a display page.
The scene display information is used for reflecting the game scene of the target game scene.
In implementation, before interactive display of electroencephalogram data identification results begins, the computer device acquires the scene display information of the target game scene and displays it on the display page corresponding to the computer device. As shown in fig. 4, fig. 4 is a display page in which the computer device presents a target game scene, here a garden soil scene, based on the scene display information. Target display elements (e.g., a bouquet, a lawn, a tree, etc.) can then be added to the garden soil scene so that the target display elements are further presented.
In this embodiment, the scene display information of the target game scene is displayed in advance so that the target object (i.e., the wearer of the brain-computer device) can see the game scene of the target game intuitively. Through the game scene, the target object is guided to influence the electroencephalogram data identification results that are output, the target display element is then determined based on those results, and interactive display is realized.
In one embodiment, as shown in fig. 5, the specific processing procedure of step 302 includes the following steps:
step 502, acquiring an electroencephalogram data stream, and performing identification processing on the electroencephalogram data stream to obtain an emotion sequence and a concentration sequence.
The emotion sequence comprises a plurality of emotion category data, and the concentration sequence includes a plurality of concentration values.
In implementation, the brain-computer device collects the electroencephalogram signals of the target object in real time and generates an electroencephalogram data stream. The computer device then acquires the electroencephalogram data stream collected by the brain-computer device and performs identification processing on it at a per-second granularity to obtain an emotion sequence and a concentration sequence. Specifically, the computer device analyzes the electroencephalogram data and obtains one item of emotion category data and one concentration value every second, thereby obtaining the emotion sequence and concentration sequence corresponding to the electroencephalogram data stream. The emotion category data contained in the emotion sequence may be, but are not limited to, emotion categories such as happiness, sadness, fear and anger, and reflect the emotional characteristics of the target object. Each concentration value in the concentration sequence expresses the concentration of the target object numerically: the larger the value, the higher the concentration of the target object; the smaller the value, the lower the concentration.
Optionally, the specific electroencephalogram feature extraction algorithm and electroencephalogram data identification algorithm used when the computer device performs identification processing on the electroencephalogram data are not limited in the embodiments of the present disclosure. In the feature extraction stage, the electroencephalogram features extracted by the brain-computer device can be divided into three categories: time-domain features, frequency-domain features and time-frequency-domain features. Time-domain features mainly capture the temporal organization of the electroencephalogram signal; Hjorth features, fractal dimension features and higher-order crossing features, for example, are all time-domain features. Frequency-domain features mainly capture the emotion information of the target object from the frequency-domain perspective. Time-frequency-domain features capture time-domain and frequency-domain information simultaneously, i.e., frequency-domain information is extracted from unit-time signals divided by a sliding window. Taking the extraction of frequency-domain features as an example, the brain-computer device may first decompose the original frequency band into several sub-bands and then extract electroencephalogram features for each band; available extraction methods include the Fourier transform (FT), power spectral density (PSD), wavelet transform (WT), differential entropy (DE), and the like.
Optionally, there are many identification algorithms (feature recognition algorithms for short) for electroencephalogram data, for example supervised machine learning algorithms such as support vector machines (SVM), K-nearest neighbours (KNN) and naive Bayes (NB). The identification algorithm used for the electroencephalogram data is not limited in the embodiments of the present application. The identification algorithm used in the present application to identify the electroencephalogram features of the target object is described later and is not detailed here.
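To make the feature-extraction and classification pipeline above concrete, here is a hedged sketch using the power spectral density (PSD) route and an SVM classifier; it is one possible realization under assumed band definitions, sampling rate and labels, not the patent's specific algorithm.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed sub-bands

def psd_band_features(eeg_window, fs=250):
    """Average PSD power per sub-band for one 1-second EEG window (channels x samples)."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in BANDS.values()]
    return np.concatenate(feats)  # one feature vector per window

def train_emotion_classifier(windows, labels, fs=250):
    """Supervised recognition: fit an SVM on labelled windows, then predict per second."""
    X = np.stack([psd_band_features(w, fs) for w in windows])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)            # labels e.g. "happy", "sad", "fear", "anger"
    return clf
```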
And 504, performing sequence judgment on the emotion sequence, dividing a plurality of emotion category data contained in the emotion sequence into a plurality of emotion category groups, and calculating the average emotion of the emotion category data contained in each emotion category group to obtain an emotion recognition result corresponding to each emotion category group.
In implementation, the computer device divides the plurality of emotion category data contained in the emotion sequence into a plurality of emotion category groups according to a preset number of data per group. Then, for each emotion category group, the computer device calculates the average emotion of the emotion category data in the group to obtain the emotion recognition result corresponding to that group. For example, the emotion category data for every 3 seconds in the emotion sequence form one emotion category group, and the average emotion of the 3 items of emotion category data within those 3 seconds is calculated; this average emotion is the emotion recognition result corresponding to the group. The average emotion also falls into one of the emotion categories such as happiness, sadness, fear and anger, so the emotion sequence in the electroencephalogram data stream corresponds to a plurality of emotion recognition results.
Optionally, the computer device may calculate the average emotion of the 3 items of emotion category data in an emotion category group by establishing a correspondence between emotion category data and numerical values, determining the numerical values corresponding to the 3 items of emotion category data in the group, and calculating the average of those 3 values. The emotion category data corresponding to the average value, determined from the same correspondence, is the average emotion, i.e., the emotion recognition result of the emotion category group. The average emotion of the emotion category data in an emotion category group may also be calculated in other ways, which is not limited in the embodiments of the present application.
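A minimal sketch of the averaging scheme in this paragraph; the numeric values assigned to the emotion categories are an assumed example, since the patent only requires that some correspondence between categories and values exists.

```python
# Assumed example correspondence between emotion category data and numerical values.
EMOTION_TO_VALUE = {"sad": 0, "fear": 1, "anger": 2, "happy": 3}
VALUE_TO_EMOTION = {v: k for k, v in EMOTION_TO_VALUE.items()}

def emotion_recognition_results(emotion_sequence, group_size=3):
    """Split the per-second emotion sequence into groups of `group_size` and
    return the average emotion (emotion recognition result) of each group."""
    results = []
    for i in range(0, len(emotion_sequence) - group_size + 1, group_size):
        group = emotion_sequence[i:i + group_size]
        avg = sum(EMOTION_TO_VALUE[e] for e in group) / group_size
        results.append(VALUE_TO_EMOTION[round(avg)])   # map the average back to a category
    return results
```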
Step 506, the concentration values included in the concentration degree sequence are divided into a plurality of concentration degree calculation units, and an average value of the concentration values included in each concentration degree calculation unit is calculated to obtain a concentration degree identification result corresponding to the concentration degree calculation unit.
In an implementation, the computer device divides the concentration values contained in the concentration sequence into a plurality of concentration calculation units based on a preset number of divisions. Then, for each concentration degree calculation unit, the computer device calculates an average value of the concentration degree values included in the concentration degree calculation unit, and obtains a concentration degree identification result corresponding to the concentration degree calculation unit. For example, the concentration value corresponding to every 3 seconds in the concentration degree sequence is determined as one concentration degree calculation unit, and the average value of the 3 concentration degree values in the 3 seconds included in the concentration degree calculation unit is calculated, and the average value is the concentration degree identification result corresponding to the concentration degree calculation unit. Thus, a plurality of concentration degree identification results of the concentration degree sequence in the electroencephalogram data stream can be obtained. The concentration degree recognition result represents the concentration degree of the target object.
And step 508, obtaining a plurality of electroencephalogram data identification results according to each emotion identification result and each concentration degree identification result.
In implementation, the computer device obtains a plurality of electroencephalogram data recognition results from the emotion recognition results and the concentration recognition results. Specifically, the emotion recognition result of each emotion category group is paired with the concentration recognition result of the corresponding concentration calculation unit to form one electroencephalogram data recognition result; in this way, a plurality of electroencephalogram data recognition results, each containing an emotion recognition result and a concentration recognition result, are obtained from the plurality of emotion recognition results and the plurality of concentration recognition results.
In this embodiment, the electroencephalogram data are processed to obtain electroencephalogram data identification results, and the emotional characteristics and concentration characteristics of the target object can then be displayed figuratively based on the electroencephalogram data identification results and the target display elements.
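Under the same assumptions, steps 506 to 508 can be sketched as follows, reusing the `emotion_recognition_results` helper from the previous sketch: concentration values are averaged in units of three seconds and each concentration recognition result is paired with the emotion recognition result of the same window.

```python
def concentration_recognition_results(concentration_sequence, unit_size=3):
    """Average the per-second concentration values within each concentration
    calculation unit to obtain one concentration recognition result per unit."""
    return [
        sum(concentration_sequence[i:i + unit_size]) / unit_size
        for i in range(0, len(concentration_sequence) - unit_size + 1, unit_size)
    ]

def eeg_recognition_sequence(emotion_sequence, concentration_sequence):
    """Pair each emotion recognition result with the concentration recognition
    result of the same window to form the EEG data identification results."""
    emotions = emotion_recognition_results(emotion_sequence)
    concentrations = concentration_recognition_results(concentration_sequence)
    return list(zip(emotions, concentrations))
```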
In one embodiment, the electroencephalogram data recognition result comprises an emotion recognition result and a concentration recognition result. As shown in fig. 6, the specific processing procedure of step 304 includes:
step 602, determining a target concentration level of the concentration recognition result, and determining a target display element level in a preset corresponding relationship between the concentration level and the display element level based on the target concentration level.
In implementation, the computer device stores concentration level information in advance, and for each concentration recognition result contained in the electroencephalogram data identification results it determines the target concentration level of that result. Specifically, for example, using intervals of 20 as one level, concentration values in the range 0 to 100 are divided into 5 levels: [0, 20) is level A, [20, 40) is level B, [40, 60) is level C, [60, 80) is level D and [80, 100) is level E. The target concentration level of the concentration value in the current concentration recognition result is then determined from the boundary thresholds of these levels. In addition, the computer device divides the display elements into display element levels in advance, and the display element levels correspond to the concentration levels, so the computer device can determine the target display element level corresponding to the target concentration level from the concentration level of the concentration recognition result and the one-to-one correspondence between concentration levels and display element levels.
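Using the example boundaries above (intervals of 20 on a 0-100 scale), a concentration value can be mapped to its level and then to a display element level; the identity correspondence between concentration levels and display element levels is an assumption made purely for illustration.

```python
CONCENTRATION_LEVELS = [(0, 20, "A"), (20, 40, "B"), (40, 60, "C"), (60, 80, "D"), (80, 100, "E")]

def target_concentration_level(value):
    """Map a concentration value in [0, 100) to its level using the example boundaries."""
    for low, high, level in CONCENTRATION_LEVELS:
        if low <= value < high:
            return level
    raise ValueError(f"concentration value out of range: {value}")

# Assumed one-to-one correspondence between concentration levels and display element levels.
LEVEL_TO_DISPLAY_ELEMENT_LEVEL = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

def target_display_element_level(concentration_value):
    return LEVEL_TO_DISPLAY_ELEMENT_LEVEL[target_concentration_level(concentration_value)]

# Example: a concentration recognition result of 49 falls in [40, 60), i.e. level C,
# so the target display element level is 3.
assert target_display_element_level(49) == 3
```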
And step 604, determining the target display element according to the target display element grade and the emotion recognition result.
In practice, each display element level stored by the computer device contains display elements at that level. Also, the display states of the display elements at the same level are the same. In addition, the emotion recognition result may be used to determine a display hue of the display element, and therefore, the computer device determines a target display element that satisfies the display state and the display hue based on the determined target display element rank and the emotion recognition result.
In one embodiment, as shown in fig. 7, the specific processing procedure of step 604 includes the following steps:
step 702, determining a target display state corresponding to the target display element level based on the corresponding relationship between the display element level and the display state.
Wherein the target display state characterizes a display effect of the display element.
In implementation, the computer device stores the correspondence between display element levels and display states in advance, that is, each display element level corresponds to one display state of the display element. The computer device can therefore determine the target display state of the display element corresponding to the current target display element level from this correspondence. As shown in fig. 8, taking a bouquet as the display element, the display states corresponding to different display element levels can be expressed through the height of the bouquet and the openness of each flower in it. Assuming the display elements are divided into 5 levels, and the current concentration recognition result has a value of 49 and therefore a concentration level of level 3, the correspondence between concentration levels and display element levels gives a target display element level of level 3 as well; the target display state corresponding to level 3 is that the height of the target flowers is in the range of 1-3 cm and the openness of the target flowers is 2/3.
Optionally, when the display element is a tree, the display states corresponding to different display element levels can be expressed through the height of the tree, whether the tree bears fruit, and so on. For different display elements, the computer device may therefore set different ways of expressing the display element levels according to the actual situation, which is not limited in the embodiments of the present application.
Step 704, determining a target display color tone corresponding to the emotion recognition result based on the corresponding relationship between the emotion recognition result and the display color tone.
In implementation, the computer device determines the target display color tone corresponding to the emotion recognition result in the target electroencephalogram data identification result based on the correspondence between emotion recognition results and display color tones. For example, if the current emotion recognition result is happiness, the corresponding target display color tone determined from this correspondence is a warm tone; if the emotion recognition result is a negative emotion such as sadness, the corresponding target display color tone is a cool tone. Optionally, different emotion categories represented by the emotion recognition result may correspond to different color tones; for example, 4 types of emotion recognition result may correspond to 4 different color tones.
Step 706, determining a target display element including the display state information and the display color tone information according to the display state information corresponding to the target display state and the display color tone information corresponding to the target display color tone.
In implementation, the computer device determines the target display element containing the display state information and the display color tone information from the display state information corresponding to the target display state and the display color tone information corresponding to the target display color tone. Fig. 8 is a display page showing such a target display element: the computer device determines the target display element from display state information such as flower height and flower openness corresponding to the target display state, together with display color tone information indicating that the target display color tone is a warm tone.
Optionally, besides the display dimensions already determined for the target display element (e.g., flower height, flower openness and target display color tone), other display dimensions can distinguish target display elements from one another, for example the display color of each flower (red, white, purple, blue) and the type of each flower (lily, rose, tulip, carnation) under the target display color tone. When the conditions on the target display element are satisfied, matching combinations across these additional display dimensions can be selected: for instance, on the basis of a warm target display color tone, three white flowers and five red flowers with the same openness may be combined, with the flower sizes among the white flowers being the same and the flower sizes among the red flowers being the same. Under a given target display state and target display color tone, the specific target display element can therefore be output and displayed according to a preset default combination, or according to a combination customized in advance by the target object. Then, based on the emotion and concentration characteristics of the target object reflected in subsequent electroencephalogram data identification results, the target object's preference among the combinations can be determined and target display elements with different combinations can be switched in. The specific display form of the target display element is not limited in the embodiments of the present application.
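A hedged sketch of steps 702 to 706 taken together. The level-3 display state (flower height 1-3 cm, openness 2/3) and the warm tone for a happy emotion come from the examples in this section; every other table entry is an assumption added only to make the sketch runnable.

```python
# Display element level -> display state. Level 3 matches the example above
# (flower height 1-3 cm, openness 2/3); the other levels are assumed values.
LEVEL_TO_DISPLAY_STATE = {
    1: {"height_cm": (0, 1), "openness": 1 / 5},
    2: {"height_cm": (0, 2), "openness": 1 / 2},
    3: {"height_cm": (1, 3), "openness": 2 / 3},
    4: {"height_cm": (3, 5), "openness": 5 / 6},
    5: {"height_cm": (5, 8), "openness": 1.0},
}

# Emotion recognition result -> display colour tone. "happy" -> warm tone is from the
# example; the remaining entries are assumptions.
EMOTION_TO_TONE = {"happy": "warm", "sad": "cool", "fear": "cool", "anger": "cool"}

def target_display_element(display_element_level, emotion_result):
    """Combine the target display state (from the level) and the target display
    tone (from the emotion) into the first display information of the element."""
    return {
        "display_state": LEVEL_TO_DISPLAY_STATE[display_element_level],
        "display_tone": EMOTION_TO_TONE[emotion_result],
    }

# Example: display element level 3 and emotion "happy" yield flowers 1-3 cm high,
# 2/3 open, rendered in a warm tone.
```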
In this embodiment, the concentration level of the concentration recognition result corresponds to a display element level. The computer device may therefore determine the target display state from the concentration recognition result, the display element level, and the correspondence between display element levels and display states, determine the target display color tone based on the emotion recognition result, and then determine the final target display element based on the target display state and the target display color tone. With this method, the final target display element is determined from both the concentration recognition result and the emotion recognition result in the electroencephalogram data recognition result, the recognition result is displayed in an image form, and the current concentration and emotion characteristics of the target object are better reflected.
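The overall determination flow of this embodiment can be summarized in a short sketch; the concentration level boundaries, the per-level state values and the emotion-to-tone mapping below are assumptions for illustration, not values taken from the application:

```python
from dataclasses import dataclass

@dataclass
class DisplayElement:
    height: float    # display state information: flower height
    openness: float  # display state information: flower openness
    tone: str        # display color tone information

# Assumed concentration-level boundaries and level-to-state table.
CONCENTRATION_LEVELS = [(0, 40, 1), (40, 70, 2), (70, 101, 3)]  # (low, high, level)
LEVEL_TO_STATE = {1: (0.3, 0.2), 2: (0.6, 0.5), 3: (1.0, 0.9)}  # level -> (height, openness)

def determine_target_display_element(concentration: float, emotion: str) -> DisplayElement:
    # concentration recognition result -> target display element level -> display state
    level = next(lvl for low, high, lvl in CONCENTRATION_LEVELS if low <= concentration < high)
    height, openness = LEVEL_TO_STATE[level]
    # emotion recognition result -> target display color tone (hypothetical mapping)
    tone = {"happy": "warm", "sad": "cool"}.get(emotion, "neutral")
    return DisplayElement(height, openness, tone)
```

Under these assumed tables, determine_target_display_element(82.0, "happy") would yield a tall, fully open flower element in a warm tone.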
In one embodiment, the first display information includes display tone information and display state information for reflecting the target display element, and the specific processing procedure of step 106 includes:
responding to the triggering operation of the target electroencephalogram data identification result, and outputting and displaying the first display information and the scene display information based on a preset display rule, so that a target game scene reflected by the scene display information and a target display element reflected by the first display information are displayed in a display page.
In implementation, the computer device responds to a trigger operation of a target electroencephalogram data identification result, and outputs and displays the first display information and the scene display information based on the acquired scene display information, the first display information and a preset display rule, so that a game scene reflected by the scene display information and a target display element corresponding to the first display information are displayed in a target display area of a display page. As shown in fig. 8, the game scene reflected by the scene display information is a garden soil environment scene, and the target display element is a bouquet.
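A minimal sketch of this output step, assuming the scene display information and first display information are already available as dictionaries (the field names here are invented for illustration):

```python
from typing import Optional

def output_on_trigger(scene_info: dict, first_display_info: dict, triggered: bool) -> Optional[dict]:
    """Combine scene and element information into one payload for the display page."""
    if not triggered:  # only output in response to the trigger operation
        return None
    return {
        "scene": scene_info,            # e.g. {"background": "garden soil environment"}
        "element": first_display_info,  # display state information + display tone information
    }
```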
In one embodiment, as shown in FIG. 9, there is provided an electroencephalogram data display apparatus 900, including: a first obtaining module 901, a determining module 902, a second obtaining module 903 and a judgment determining module 904, wherein,
a first obtaining module 901, configured to obtain an electroencephalogram data identification sequence;
a determining module 902, configured to determine a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence;
a second obtaining module 903, configured to obtain scene display information of the target game scene and first display information corresponding to the target display element, and output and display the target game scene and the target display element according to the scene display information and the first display information; the first display information comprises display tone information and display state information used for reflecting the target display element;
and the judgment determining module 904 is configured to, in a case that it is determined that the current target electroencephalogram data identification result in the electroencephalogram data identification sequence is different from the next electroencephalogram data identification result, determine the next electroencephalogram data identification result as a new target electroencephalogram data identification result, and execute the step of determining a target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
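A structural sketch of apparatus 900, with each module reduced to a stub so the control flow is visible; the module internals are assumptions, only the module roles follow the description above:

```python
class EEGInteractiveDisplayApparatus:
    """Sketch of apparatus 900; module bodies are illustrative stand-ins."""

    def acquire_identification_sequence(self, eeg_stream):
        # first obtaining module 901: assume the stream already yields
        # (emotion recognition result, concentration recognition result) pairs
        return list(eeg_stream)

    def determine_target_display_element(self, result):
        # determining module 902: placeholder for the level/state/tone logic
        emotion, concentration = result
        return {"emotion": emotion, "concentration": concentration}

    def output_display(self, scene_info, element):
        # second obtaining module 903: output the game scene and the element together
        print(scene_info, element)

    def run(self, eeg_stream, scene_info):
        sequence = self.acquire_identification_sequence(eeg_stream)
        previous = None
        for result in sequence:
            # judgment determining module 904: redraw only when the result changes
            if result != previous:
                self.output_display(scene_info, self.determine_target_display_element(result))
                previous = result
```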
In one embodiment, the apparatus 900 further comprises:
the acquisition and display module is used for acquiring scene display information of a target game scene and displaying the scene display information on a display page; the scene display information is used to reflect the game scene of the target game scene.
In one embodiment, the first obtaining module 901 is specifically configured to obtain an electroencephalogram data stream, and perform identification processing on the electroencephalogram data stream to obtain an emotion sequence and a concentration sequence; the emotion sequence comprises a plurality of emotion category data, and the concentration sequence comprises a plurality of concentration values;
performing sequence judgment on the emotion sequence, dividing a plurality of emotion category data contained in the emotion sequence into a plurality of emotion category groups, and calculating the average emotion of the emotion category data contained in each emotion category group to obtain an emotion recognition result corresponding to each emotion category group;
dividing the concentration values contained in the concentration degree sequence into a plurality of concentration degree calculation units, and calculating the average value of the concentration values contained in each concentration degree calculation unit to obtain a concentration degree identification result corresponding to the concentration degree calculation unit;
and obtaining a plurality of electroencephalogram data recognition results according to the emotion recognition results and the concentration recognition results.
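A sketch of this sequence construction follows; the group size and the use of the most frequent category as the "average emotion" of a group are assumptions, since neither choice is fixed by the description here:

```python
from collections import Counter

def emotion_recognition_results(emotion_sequence, group_size=5):
    # divide emotion category data into groups and take a representative per group
    groups = [emotion_sequence[i:i + group_size]
              for i in range(0, len(emotion_sequence), group_size)]
    return [Counter(group).most_common(1)[0][0] for group in groups]

def concentration_recognition_results(concentration_sequence, unit_size=5):
    # divide concentration values into calculation units and average each unit
    units = [concentration_sequence[i:i + unit_size]
             for i in range(0, len(concentration_sequence), unit_size)]
    return [sum(unit) / len(unit) for unit in units]

def eeg_identification_sequence(emotion_sequence, concentration_sequence, size=5):
    # pair each per-group emotion result with the corresponding per-unit concentration result
    return list(zip(emotion_recognition_results(emotion_sequence, size),
                    concentration_recognition_results(concentration_sequence, size)))
```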
In one embodiment, the electroencephalogram data recognition result includes an emotion recognition result and a concentration recognition result, and the determining module 902 is specifically configured to determine a target concentration level at which the concentration recognition result is located, and determine a target display element level in a preset corresponding relationship between the concentration level and the display element level based on the target concentration level;
and determining the target display element according to the target display element grade and the emotion recognition result.
In one embodiment, the determining module 902 is further configured to determine, based on the correspondence between the display element levels and the display statuses, a target display status corresponding to the target display element level; the target display state represents the display effect of the display element;
determining a target display tone corresponding to the emotion recognition result based on the corresponding relation between the emotion recognition result and the display tone;
and determining a target display element containing the display state information and the display tone information according to the display state information corresponding to the target display state and the display tone information corresponding to the target display tone.
In one embodiment, the second obtaining module 903 is specifically configured to, in response to a trigger operation of a target electroencephalogram data identification result, output and display the first display information and the scene display information based on a preset display rule, so that a target game scene reflected by the scene display information and a target display element reflected by the first display information are displayed in a display page.
In one embodiment, the types of target display elements include a flower type, a tree type, an animal type, and a building type.
In the present specification, the embodiments are described in a progressive manner, and each embodiment focuses on differences from other embodiments, and the same or similar parts in each embodiment are referred to each other. For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 10. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to realize an electroencephalogram data interactive display method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on a shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 10 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetic Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the various embodiments provided herein may be, without limitation, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, quantum-computing-based data processing logic devices, or the like.
All possible combinations of the technical features in the above embodiments may not be described for the sake of brevity, but should be considered as being within the scope of the present disclosure as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and although their description is relatively specific and detailed, they should not be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.
The method and system of the present invention may be implemented in a number of ways. For example, the methods and systems of the present invention may be implemented in software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustrative purposes only, and the steps of the method of the present invention are not limited to the order specifically described above unless specifically indicated otherwise. Furthermore, in some embodiments, the present invention may also be embodied as a program recorded in a recording medium, the program including machine-readable instructions for implementing a method according to the present invention. Thus, the present invention also covers a recording medium storing a program for executing the method according to the present invention.
The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims (11)

1. An electroencephalogram data interactive display method is characterized by comprising the following steps:
acquiring an electroencephalogram data identification sequence; the electroencephalogram data identification sequence comprises a plurality of electroencephalogram data identification results;
determining a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence;
scene display information of a target game scene and first display information corresponding to the target display elements are obtained, and the target game scene and the target display elements are output and displayed according to the scene display information and the first display information; the first display information comprises display tone information and display state information used for reflecting the target display element;
and under the condition that the current target electroencephalogram data identification result in the electroencephalogram data identification sequence is different from the next electroencephalogram data identification result, determining the next electroencephalogram data identification result as a new target electroencephalogram data identification result, and executing the step of determining a target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
2. The method of claim 1, wherein prior to said acquiring the brain electrical data identification sequence, the method further comprises:
acquiring scene display information of a target game scene, and displaying the scene display information on a display page; the scene display information is used for reflecting the game scene of the target game scene.
3. The method of claim 1, wherein said acquiring brain electrical data identifying sequences comprises:
acquiring an electroencephalogram data stream, and identifying and processing the electroencephalogram data stream to obtain an emotion sequence and a concentration sequence; the emotion sequence comprises a plurality of emotion category data, and the concentration sequence comprises a plurality of concentration values;
performing sequence judgment on the emotion sequence, dividing a plurality of emotion category data contained in the emotion sequence into a plurality of emotion category groups, and calculating the average emotion of the emotion category data contained in each emotion category group to obtain an emotion recognition result corresponding to each emotion category group;
dividing the concentration values included in the concentration degree sequence into a plurality of concentration degree calculation units, calculating an average value of the concentration values included in each concentration degree calculation unit, and obtaining a concentration degree identification result corresponding to the concentration degree calculation unit;
and obtaining a plurality of electroencephalogram data identification results according to the emotion identification results and the concentration identification results.
4. The method of claim 1, wherein the electroencephalogram data identification result comprises an emotion identification result and a concentration identification result, and the determining a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence comprises:
determining a target concentration degree grade of the concentration degree identification result, and determining a target display element grade in a corresponding relation between a preset concentration degree grade and a display element grade on the basis of the target concentration degree grade;
and determining a target display element according to the target display element grade and the emotion recognition result.
5. The method of claim 4, wherein determining a target display element based on the target display element ranking and the emotion recognition result comprises:
determining a target display state corresponding to the target display element grade based on the corresponding relation between the display element grade and the display state; the target display state represents a display effect of a display element;
determining a target display tone corresponding to the emotion recognition result based on the corresponding relation between the emotion recognition result and the display tone;
and determining a target display element containing the display state information and the display tone information according to the display state information corresponding to the target display state and the display tone information corresponding to the target display tone.
6. The method according to claim 1, wherein the output display of the target game scene and the target display element according to the scene display information and the first display information comprises:
responding to the triggering operation of the target electroencephalogram data identification result, and outputting and displaying the first display information and the scene display information based on a preset display rule, so that the target game scene reflected by the scene display information and the target display element reflected by the first display information are displayed in a display page.
7. The method of claim 1, wherein the types of target display elements include a flower type, a tree type, an animal type, and a building type.
8. An electroencephalogram data interactive display device, characterized in that the device comprises:
the first acquisition module is used for acquiring the electroencephalogram data identification sequence;
the determining module is used for determining a target display element corresponding to a target electroencephalogram data identification result in the electroencephalogram data identification sequence;
the second acquisition module is used for acquiring scene display information of a target game scene and first display information corresponding to the target display element, and outputting and displaying the target game scene and the target display element according to the scene display information and the first display information; the first display information comprises display tone information and display state information used for reflecting the target display element;
and the judgment and determination module is used for determining the next electroencephalogram data identification result as a new target electroencephalogram data identification result under the condition that the current target electroencephalogram data identification result in the electroencephalogram data identification sequence is different from the next electroencephalogram data identification result, and executing the step of determining the target display element corresponding to the target electroencephalogram data identification result in the electroencephalogram data identification sequence.
9. An electroencephalogram data interactive display system, characterized in that the system comprises:
the brain-computer equipment is used for acquiring the electroencephalogram data stream of the target object;
a computer device for determining a brain electrical data recognition result of the target object from the brain electrical data stream and performing the steps of the method of any one of claims 1 to 7;
and the display is used for displaying the scene display information output by the computer equipment and the first display information corresponding to the target display element.
10. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202211291012.5A 2022-10-21 2022-10-21 Electroencephalogram data interactive display method, device and system and computer equipment Pending CN115373519A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211291012.5A CN115373519A (en) 2022-10-21 2022-10-21 Electroencephalogram data interactive display method, device and system and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211291012.5A CN115373519A (en) 2022-10-21 2022-10-21 Electroencephalogram data interactive display method, device and system and computer equipment

Publications (1)

Publication Number Publication Date
CN115373519A true CN115373519A (en) 2022-11-22

Family

ID=84074024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211291012.5A Pending CN115373519A (en) 2022-10-21 2022-10-21 Electroencephalogram data interactive display method, device and system and computer equipment

Country Status (1)

Country Link
CN (1) CN115373519A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109432773A (en) * 2018-08-30 2019-03-08 百度在线网络技术(北京)有限公司 Processing method, device, electronic equipment and the storage medium of scene of game
CN114640699A (en) * 2022-02-17 2022-06-17 华南理工大学 Emotion induction monitoring system based on VR role playing game interaction
CN114756121A (en) * 2022-03-18 2022-07-15 华南理工大学 Virtual reality interactive emotion detection and regulation system based on brain-computer interface
CN114779937A (en) * 2022-04-28 2022-07-22 脑陆(重庆)智能科技研究院有限公司 Multimedia interactive imaging method, apparatus, storage medium and computer program product
CN114847975A (en) * 2022-04-28 2022-08-05 脑陆(重庆)智能科技研究院有限公司 Electroencephalogram data processing method, device, system, computer device and storage medium
CN114625301A (en) * 2022-05-13 2022-06-14 厚德明心(北京)科技有限公司 Display method, display device, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116172560A (en) * 2023-04-20 2023-05-30 浙江强脑科技有限公司 Reaction speed evaluation method for reaction force training, terminal equipment and storage medium
CN116172560B (en) * 2023-04-20 2023-08-29 浙江强脑科技有限公司 Reaction speed evaluation method for reaction force training, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20221122)