CN115357154B - Electroencephalogram data display method, device, system, computer device and storage medium - Google Patents


Info

Publication number
CN115357154B
CN115357154B
Authority
CN
China
Prior art keywords
display
result
concentration degree
determining
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211291114.7A
Other languages
Chinese (zh)
Other versions
CN115357154A (en)
Inventor
郭倩
金铭
王博
王晓岸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Brain Up Technology Co., Ltd.
Original Assignee
Beijing Brain Up Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Brain Up Technology Co ltd filed Critical Beijing Brain Up Technology Co ltd
Priority to CN202211291114.7A priority Critical patent/CN115357154B/en
Publication of CN115357154A publication Critical patent/CN115357154A/en
Application granted granted Critical
Publication of CN115357154B publication Critical patent/CN115357154B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses an electroencephalogram data display method, device, system, computer device, and storage medium. The method comprises the following steps: acquiring an electroencephalogram data identification result, which comprises a plurality of emotion identification results and a plurality of concentration identification results; determining a concentration score for each concentration identification result and accumulating the concentration scores to obtain an integration result; determining the motion information of each display particle according to the integration result, and controlling each display particle to move within a preset motion range; when the integration result is equal to a first integration threshold, determining a comprehensive emotion recognition result based on the emotion recognition results and determining the target display object corresponding to that comprehensive result; and when the integration result is equal to a second integration threshold, determining the target display position and target display information of each display particle according to the display information contained in the target display object and the motion information of each display particle. The method thereby achieves interactive display of the electroencephalogram data identification result.

Description

Electroencephalogram data display method, device, system, computer equipment and storage medium
Technical Field
The invention relates to the technical field of data processing, in particular to an electroencephalogram data display method, device and system, computer equipment and a storage medium.
Background
A Brain-Computer Interface (BCI) is a communication pathway established directly between the brain (or brain cells) and an external device. Brain-computer interface technology acquires electroencephalogram data from a target object through such an interface and analyzes the data to obtain an electroencephalogram data identification result for the target object.
In current brain-computer interface practice, after electroencephalogram data is generated in the brain (or brain cells) of a target object, the data is collected by a brain-computer device and transmitted as a signal, through the brain-computer interface, to an external device (e.g., a computer device). The external device then analyzes and identifies the electroencephalogram data contained in the signal, determines the corresponding identification result (which may be an emotion identification result, a concentration identification result, and so on), and feeds that result back to the brain of the target object, thereby realizing information exchange between the brain and the external device.
Because the electroencephalogram data identification result is abstract and not directly perceptible, technicians are currently required, when the result is applied, to adjust the emotional state of the target object on its basis. The target object cannot interact directly with external equipment based on the identification result, and current brain-computer interface technology therefore lacks interactivity.
Disclosure of Invention
The technical problem to be solved by the embodiment of the invention is how to realize interactive display of the electroencephalogram data identification result.
In a first aspect, the present application provides a method for displaying electroencephalogram data. The method comprises the following steps:
acquiring an electroencephalogram data identification result corresponding to an electroencephalogram data stream, wherein the electroencephalogram data identification result comprises a plurality of emotion identification results and a plurality of concentration identification results;
determining concentration degree scores corresponding to the plurality of concentration degree identification results respectively;
accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain an integral result;
determining current motion information of each display particle according to the integration result, and controlling each display particle to move within a preset motion range based on the motion information; the display particles are initial display objects output in advance in a display interface;
when the integration result is equal to a preset first integration threshold value, determining a comprehensive emotion recognition result based on each emotion recognition result, and determining a target display object corresponding to the comprehensive emotion recognition result;
when the integration result is equal to a preset second integration threshold value, determining a target display position and target display information corresponding to each display particle according to display information contained in the target display object and the current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object.
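The two-threshold flow of the first aspect can be sketched in Python as follows. This is only an illustrative reading of the claim: the class name, the per-result scoring, and the concrete threshold values are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the claimed flow: concentration scores accumulate
# into an integration result; the first threshold fixes the target display
# object by majority emotion, the second triggers rendering.
class EEGDisplayController:
    def __init__(self, first_threshold=50, second_threshold=100):
        self.first_threshold = first_threshold    # triggers emotion aggregation
        self.second_threshold = second_threshold  # triggers final rendering
        self.points = 0
        self.emotions = []
        self.target_object = None

    def on_recognition_result(self, emotion, concentration_score):
        """Consume one (emotion, concentration score) pair from the EEG stream."""
        self.emotions.append(emotion)
        self.points += concentration_score
        if self.target_object is None and self.points >= self.first_threshold:
            # Comprehensive emotion: majority vote over results seen so far.
            self.target_object = max(set(self.emotions), key=self.emotions.count)
        if self.points >= self.second_threshold:
            return ("render", self.target_object)
        return ("animate", self.points)
```

Until the second threshold is reached, the controller only animates the particles; once it is reached, the accumulated majority emotion decides what the particles assemble into.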
In one embodiment, the emotion recognition result comprises happiness, sadness, fear, anger; the concentration degree identification result is a concentration degree numerical value representing the concentration degree.
In one embodiment, the determining the concentration scores corresponding to the plurality of concentration recognition results respectively includes:
determining a concentration degree score corresponding to the concentration degree identification result according to the concentration degree identification result and a preset concentration degree threshold value;
the accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain an integral result includes:
determining whether the concentration degree recognition result triggers scoring processing according to the concentration degree score corresponding to the concentration degree recognition result;
and if the scoring processing is triggered, performing accumulation calculation on the concentration degree score corresponding to the concentration degree identification result and the historical concentration degree score to obtain a point result.
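The scoring-and-accumulation embodiment above can be sketched minimally as follows; the concentration scale, the threshold value, and the binary score are assumptions for illustration, since the patent leaves the concrete scoring policy open.

```python
# Hypothetical scoring policy: a concentration value at or above a preset
# threshold earns one point; otherwise scoring is not triggered.
CONCENTRATION_THRESHOLD = 60  # assumed 0-100 concentration scale

def concentration_score(concentration_value):
    """Map a concentration recognition value to a score via the preset threshold."""
    return 1 if concentration_value >= CONCENTRATION_THRESHOLD else 0

def accumulate(history_points, concentration_value):
    """Add the current score to the historical total only when scoring triggers."""
    score = concentration_score(concentration_value)
    if score > 0:  # scoring processing triggered
        return history_points + score
    return history_points
```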
In one embodiment, the determining a comprehensive emotion recognition result based on each emotion recognition result when the integration result is equal to a preset first integration threshold value includes:
when the integration result is equal to a preset first integration threshold value, counting the number of the emotion recognition results of the same emotion type according to the emotion type of the emotion recognition result in a plurality of emotion recognition results to obtain the number of the emotion recognition results of each emotion type;
and taking the emotion recognition result of the emotion category with the largest number as a comprehensive emotion recognition result corresponding to each emotion recognition result.
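The majority-count step described in this embodiment is a straightforward frequency count; a minimal sketch using the standard library:

```python
from collections import Counter

def composite_emotion(emotion_results):
    """Count results per emotion category and return the most frequent category."""
    counts = Counter(emotion_results)
    return counts.most_common(1)[0][0]
```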
In one embodiment, the target display object includes a plurality of display features, the display features are features for characterizing display contents of the target display object, and the determining the target display object corresponding to the integrated emotion recognition result includes:
according to preset display feature priority and each comprehensive emotion recognition result, sequentially determining target display features corresponding to each comprehensive emotion recognition result in a pre-stored corresponding relation between the comprehensive emotion recognition result and the display features;
and determining a target display object according to each target display characteristic.
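One possible reading of this embodiment, sketched below: each composite emotion result fills one display-feature slot in priority order. The feature names, priority order, and lookup table are entirely hypothetical; the patent only specifies that a pre-stored correspondence and a preset priority exist.

```python
# Hypothetical preset priority and correspondence table.
FEATURE_PRIORITY = ["color", "shape", "texture"]
FEATURE_TABLE = {
    "happy": {"color": "warm", "shape": "round", "texture": "smooth"},
    "sad":   {"color": "cool", "shape": "drooping", "texture": "soft"},
}

def target_display_features(composite_emotions):
    """Fill feature slots in priority order from successive composite results."""
    features = {}
    for slot, emotion in zip(FEATURE_PRIORITY, composite_emotions):
        features[slot] = FEATURE_TABLE[emotion][slot]
    return features
```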
In one embodiment, when the integration result is equal to a preset second integration threshold, determining, according to display information included in the target display object and the current motion information of each display particle, target display information corresponding to each display particle to display the target display object includes:
when the integration result is equal to a preset second integration threshold value, determining target position information of the current movement of each display particle according to the movement information corresponding to each display particle;
and determining target display information corresponding to each display particle according to the target position information and display information contained in the target display object.
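A minimal sketch of this lookup step, under the assumption that the target display object's display information is a pixel grid and each particle shows the pixel nearest its target position (the grid representation and nearest-pixel sampling are illustrative, not specified by the patent):

```python
# Each particle's target display info is sampled from the target object's
# display information at the particle's target position in [0, 1) x [0, 1).
def particle_display_info(particles, image, width, height):
    """For each particle (x, y), look up the image cell it should display."""
    info = []
    for x, y in particles:
        col = min(int(x * width), width - 1)
        row = min(int(y * height), height - 1)
        info.append(((x, y), image[row][col]))
    return info
```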
In a second aspect, there is provided an electroencephalogram data display apparatus, the apparatus comprising:
the acquiring and displaying module is used for acquiring an electroencephalogram data identification result corresponding to the electroencephalogram data stream, and the electroencephalogram data identification result comprises a plurality of emotion identification results and a plurality of concentration identification results;
the first determination module is used for determining concentration degree scores corresponding to the plurality of concentration degree identification results respectively;
the integral module is used for accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain integral results;
the processing module is used for determining the current motion information of each display particle according to the integration result, and controlling each display particle to move within a preset motion range based on the motion information; the display particles are initial display objects output in advance in a display interface;
the second determining module is used for determining a comprehensive emotion recognition result based on each emotion recognition result and determining a target display object corresponding to the comprehensive emotion recognition result when the integral result is equal to a preset first integral threshold;
a third determining module, configured to determine, when the integration result is equal to a preset second integration threshold, a target display position and target display information corresponding to each display particle according to display information included in the target display object and the current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object.
In a third aspect, an electroencephalogram data display system is provided, comprising:
the brain-computer equipment is used for acquiring an electroencephalogram data stream of a target object;
computer means for determining, from said stream of electroencephalographic data, a result of identification of electroencephalographic data of said target object, and for performing the steps of the method of any one of the above first aspects;
and the display is used for displaying the display particles output by the computer equipment and the target display object.
In a fourth aspect, the application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the first aspect when executing the computer program.
In a fifth aspect, the present application further provides a computer-readable storage medium. The computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the first aspect described above.
In a sixth aspect, the present application further provides a computer program product. The computer program product comprising a computer program that when executed by a processor performs the steps of the first aspect described above.
According to the electroencephalogram data display method, the electroencephalogram data display device, the electroencephalogram data display system, the computer equipment, the storage medium and the computer program product, the computer equipment obtains electroencephalogram data identification results corresponding to electroencephalogram data streams, and the electroencephalogram data identification results comprise a plurality of emotion identification results and a plurality of concentration identification results; determining concentration degree scores corresponding to the plurality of concentration degree identification results respectively; accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain integral results; determining motion information corresponding to each currently displayed display particle according to the integration result, controlling each display particle to move within a preset motion range based on the motion information; the display particles are initial display objects output in advance in a display interface; when the integration result is equal to a preset first integration threshold value, determining a comprehensive emotion recognition result based on each emotion recognition result, and determining a target display object corresponding to the comprehensive emotion recognition result; when the integration result is equal to a preset second integration threshold value, determining a target display position and target display information corresponding to each display particle according to display information contained in the target display object and the current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object. 
By adopting the method, the motion information of each display particle is determined from the concentration identification results, which in turn determines the target display position of each particle; the corresponding target display object is determined from the emotion identification results; and the target display object is then presented based on the target display position and target display information of each particle. The concentration and emotion identification results contained in the electroencephalogram data are thereby visualized, and the electroencephalogram data identification result is displayed interactively.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
The invention will be more clearly understood from the following detailed description, taken with reference to the accompanying drawings, in which:
FIG. 1 is an architecture diagram of an electroencephalogram data display system in one embodiment;
FIG. 2 is an application environment diagram of a method for displaying electroencephalogram data in one embodiment;
FIG. 3 is a flow chart illustrating a method for displaying electroencephalogram data in one embodiment;
FIG. 4 is a schematic flow chart of the step of determining the integration result in one embodiment;
FIG. 5 is a schematic flow chart of the step of determining a composite emotion recognition result in one embodiment;
FIG. 6 is a flowchart illustrating the steps of determining a target display object in one embodiment;
FIG. 7 is a flowchart illustrating the steps of determining a target display location and target display information in one embodiment;
FIG. 8 is a flow diagram of an exemplary method for displaying electroencephalogram data in one embodiment;
FIG. 9 is a block diagram of an electroencephalogram data display device in one embodiment;
FIG. 10 is a diagram showing an internal structure of a computer device according to an embodiment.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be discussed further in subsequent figures.
Embodiments of the invention are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the computer system/server include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
The computer system/server may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
The electroencephalogram data display method provided by the embodiments of the application can be applied to the electroencephalogram data display system 100 shown in fig. 1, which comprises a brain-computer device 102, a computer device 104 and a display 106. The brain-computer device 102 communicates with the computer device 104 through a network, Bluetooth or a similar communication mode, and the computer device 104 is connected with the display 106 in a wireless or wired manner, which is not limited in the embodiments of the application. Specifically, as shown in fig. 2, the brain-computer device 102 is worn by the target object, who faces the display 106, and the brain-computer device 102 is used for acquiring the electroencephalogram data stream of the target object. The computer device 104 (not shown in fig. 2) is used to obtain the electroencephalogram data identification result corresponding to the electroencephalogram data stream. The electroencephalogram data identification result comprises a plurality of emotion identification results and a plurality of concentration identification results. Furthermore, the computer device 104 is further configured to determine the concentration score corresponding to each concentration identification result, obtain an integration result from those scores, determine the current motion information of each display particle according to the integration result, and control each display particle to move within a preset motion range based on that motion information.
When the integration result is equal to a preset first integration threshold value, determining a comprehensive emotion recognition result based on each emotion recognition result, and determining a target display object corresponding to the comprehensive emotion recognition result; when the integration result is equal to a preset second integration threshold value, determining a target display position and target display information corresponding to each display particle according to display information contained in the target display object and current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object. The display 106 is used for displaying the display particles output by the computer equipment and the target display object constructed by the target display position and the target display information of each display particle.
In one embodiment, as shown in fig. 3, a method for displaying electroencephalogram data is provided, which is described by applying the method to the electroencephalogram data display system in fig. 1, and comprises the following steps:
step 302, acquiring an electroencephalogram data identification result corresponding to the electroencephalogram data stream.
The electroencephalogram data identification result corresponding to the electroencephalogram data stream comprises a plurality of emotion identification results and a plurality of concentration identification results. For example, the computer device processes the electroencephalogram data on a per-second basis, so one emotion recognition result and one concentration recognition result can be obtained every second. The categories of emotion recognition results may include, but are not limited to, happiness, sadness, fear, and anger. The concentration recognition result can be represented in numerical form: the larger the value, the higher the concentration of the target object; the smaller the value, the lower the concentration.
In implementation, the target object wears the brain-computer device, which acquires the electroencephalogram data stream of the target object through brain-computer interface (BCI) technology and transmits the stream to the computer device via Bluetooth, a network, or the like. The computer device performs feature extraction on the electroencephalogram data according to a preset electroencephalogram feature extraction algorithm to obtain the electroencephalogram features of the target object, and then analyzes those features according to a preset feature recognition algorithm to obtain the electroencephalogram data recognition result of the target object. The computer device may store (e.g., cache) the electroencephalogram data identification result of the target object, and obtains it when the result needs to be interactively displayed.
Display particles are also shown on the display in advance, so that the emotion and concentration of the target object are reflected in the particles' motion state. For example, before any concentration recognition result is received, the display particles move arbitrarily. When the initial concentration recognition result is obtained and is low (that is, the value corresponding to the concentration recognition result is smaller than the preset concentration threshold), each display particle moves within a preset, large motion range, but with a disordered trajectory.
Optionally, the embodiment of the present disclosure does not limit the preset electroencephalogram feature extraction algorithm: various feature extraction methods can be applied to obtain the electroencephalogram features of the target object. In the feature extraction process, the extracted electroencephalogram features fall into three categories: time domain features, frequency domain features, and time-frequency domain features. Time domain features mainly capture the temporal structure of the electroencephalogram signal; Hjorth features, fractal dimension features, and higher-order crossing features, for example, are all time domain features. Frequency domain features mainly capture the emotion information of the target object from the frequency-domain perspective. Time-frequency domain features capture time-domain and frequency-domain information simultaneously, i.e., frequency-domain information is extracted from unit-time signal segments divided by a sliding window. Of the three, this application takes frequency domain feature extraction as an example: the brain-computer device may first decompose the original frequency band into several sub-bands and then extract the electroencephalogram features of each band, using extraction methods such as the Fourier Transform (FT), Power Spectral Density (PSD), Wavelet Transform (WT), and Differential Entropy (DE).
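Frequency-domain feature extraction as described above can be sketched as a band-power computation. The sketch below uses a naive O(n²) DFT for clarity rather than an FFT, and the 8-13 Hz band edges in the test are the conventional alpha band, an assumption not stated in the patent.

```python
import math

# Band power of an EEG sub-band via a direct DFT: sum the squared
# magnitudes of the frequency bins that fall inside [f_lo, f_hi] Hz.
def band_power(signal, fs, f_lo, f_hi):
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power
```

A real implementation would use an FFT library; the point here is only the band-splitting idea.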
Optionally, there are various recognition algorithms for electroencephalogram data (feature recognition algorithms for short), for example, supervised machine learning algorithms such as the Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and Naive Bayes (NB). The embodiment of the application does not limit the choice of recognition algorithm. The algorithm used in this application to recognize the electroencephalogram features of the target object is described in detail later and is therefore not elaborated here.
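Of the supervised algorithms listed, KNN is the simplest to sketch. The toy feature vectors and labels below are illustrative stand-ins for real EEG features; this is not the patent's own recognition algorithm.

```python
# Minimal k-nearest-neighbour classifier: predict the majority label
# among the k training samples closest (squared Euclidean) to the query.
def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; returns predicted label."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)
```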
And step 304, determining concentration degree scores corresponding to the plurality of concentration degree identification results respectively.
In implementation, a concentration degree scoring policy is stored in the computer device in advance, and the concentration degree scoring policy is used for scoring and judging each received concentration degree identification result and determining a concentration degree score corresponding to each concentration degree identification result.
And step 306, accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain an integral result.
In implementation, the computer device performs cumulative calculation based on the concentration degree scores corresponding to the respective concentration degree recognition results to obtain a sum of the concentration degree scores of the current concentration degree recognition result and the historical concentration degree recognition result, that is, an integral result of the respective concentration degree recognition results. The historical concentration degree identification result is each concentration degree identification result in a preset time period before the current concentration degree identification result.
And 308, determining the current motion information of each display particle according to the integration result, controlling each display particle to move in a preset motion range based on the motion information.
The display particles are initial display objects output in the display interface in advance, and the motion information of each display particle comprises a preset motion track or a preset motion range of each display particle.
In an implementation, the computer device determines, according to the obtained integration result, current motion information of each display particle, where the motion information is used to control a motion situation of the display particle displayed in the display, and specifically, the computer device may output the current motion information of each display particle and each display particle to the display, so that each display particle in a target display area of the display may move according to the motion information.
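One plausible reading of this particle motion, sketched under the assumption of simple linear interpolation toward a gathering centre (the application does not specify the exact motion model; the 100-point ceiling follows the example in step 805):

```python
def particle_position(base_pos, center, integral, second_threshold=100.0):
    """Linearly interpolate a particle from its scattered base position
    toward the gathering centre as the integral result approaches the
    second integration threshold."""
    t = min(integral / second_threshold, 1.0)  # 0 = scattered, 1 = gathered
    return tuple(b + (c - b) * t for b, c in zip(base_pos, center))
```

Under this sketch, each increase in the integral result moves every particle a proportional step toward the centre of the target display area, producing the gradual aggregation described above.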
And 310, when the integration result is equal to a preset first integration threshold value, determining a comprehensive emotion recognition result based on each emotion recognition result, and determining a target display object corresponding to the comprehensive emotion recognition result.
In implementation, the integration result is continuously accumulated with the concentration degree score of each concentration degree identification result. When the integration result is equal to a preset first integration threshold, the computer device processes the plurality of emotion recognition results obtained during the integration accumulation and determines a comprehensive emotion recognition result among them. The comprehensive emotion recognition result characterizes an average state of the emotion recognition results, which may include happiness, sadness, fear, anger, and the like. Then, based on the determined comprehensive emotion recognition result, the computer device determines the target display object from the pre-stored correspondence between emotion recognition results and target display objects. Optionally, the target display object may be an art painting, a photograph, a character avatar, a virtual character skin, or the like, which is not limited in the embodiment of the present application.
And step 312, when the integration result is equal to the preset second integration threshold, determining a target display position and target display information corresponding to each display particle according to the display information included in the target display object and the current motion information of each display particle.
In implementation, while the integration result is continuously accumulated with the concentration degree score of each concentration degree identification result, the motion information corresponding to each display particle can be determined according to the integration result obtained after each scoring process; each display particle moves in the target display area of the display according to the motion information, and the display particles gradually gather in the target display area as the integration result increases. Then, when the integration result is equal to a preset second integration threshold, the computer device determines the current position of each display particle according to its current motion information, and determines the target display information corresponding to each display particle according to the display information contained in the determined target display object and the current position of each display particle, so that the target display information is spliced through the display particles to obtain the complete target display object.
The display information included in the target display object may be, but is not limited to, various display parameters required for displaying the target object and display contents of the target display object, such as definition, contrast, color, image shape, and the like, and the output display of the target display object is realized based on the display information.
According to the electroencephalogram data display method, the computer equipment obtains electroencephalogram data identification results corresponding to the electroencephalogram data streams and outputs preset display particles, wherein the electroencephalogram data identification results comprise a plurality of emotion identification results and a plurality of concentration identification results. Then, the computer equipment determines concentration degree scores corresponding to the plurality of concentration degree identification results respectively; and accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain an integral result. Then, the computer device determines current motion information of each display particle currently displayed according to the integration result, so that each display particle moves according to the motion information. And when the integration result is equal to a preset first integration threshold value, determining a comprehensive emotion recognition result based on each emotion recognition result, and determining a target display object corresponding to the comprehensive emotion recognition result. And when the integration result is equal to a preset second integration threshold value, determining target display information corresponding to each display particle according to the display information contained in the target display object and the current motion information of each display particle so as to display the target display object. 
By adopting the method, the motion information of each display particle is determined through the concentration degree identification results, from which the target display position corresponding to each display particle is further determined, and the corresponding target display object is determined through the emotion recognition results. The target display object is then displayed based on the target display position and target display information corresponding to each display particle. In this way, the concentration degree identification results and the emotion recognition results contained in the electroencephalogram data are visualized, and the electroencephalogram data identification results are displayed interactively.
In an exemplary embodiment, the plurality of emotion recognition results include happiness, sadness, fear, and anger; the concentration degree identification result is a numerical value representing the magnitude of the concentration degree.
In an exemplary embodiment, as shown in fig. 4, in step 304, the specific implementation process of determining the concentration score corresponding to the concentration recognition result is as follows:
step 402, determining a concentration degree score corresponding to the concentration degree recognition result according to the concentration degree recognition result and a preset concentration degree threshold value.
In an implementation, at least one concentration degree threshold is preset in the computer device; the concentration degree threshold is used for judging the degree of concentration reflected by the concentration degree recognition result of the target object and for judging whether to perform concentration degree integration on the concentration degree recognition result. Specifically, the computer device compares the current concentration degree recognition result with the preset concentration degree threshold and determines the concentration degree score corresponding to the current concentration degree recognition result. For example, if the current concentration degree recognition result is greater than the preset concentration degree threshold (e.g., greater than 60), the corresponding concentration degree score is 1; if the current concentration degree recognition result is less than or equal to the preset concentration degree threshold (e.g., less than or equal to 60), the corresponding concentration degree score is 0.
Optionally, a concentration degree score of 0 corresponds to a non-scoring operation; the non-zero positive integer score of the concentration degree identification result is not limited to 1, and may also be, but is not limited to, 2 or 3, which can be set according to actual needs and is not limited in the embodiment of the present application. For example, if the preset concentration threshold is 60 and the current concentration recognition result is 75, which is greater than the preset concentration threshold, the concentration degree score corresponding to the current concentration recognition result may be 2.
And step 404, determining whether the concentration degree recognition result triggers scoring processing according to the concentration degree score corresponding to the concentration degree recognition result.
In implementation, the computer device determines whether the concentration degree recognition result triggers scoring processing according to the concentration degree score corresponding to the concentration degree recognition result. Specifically, if the concentration degree score corresponding to the current concentration degree recognition result is 1, the scoring processing is performed on the concentration degree recognition result, and if the concentration degree score corresponding to the current concentration degree recognition result is 0, the scoring processing is not performed.
And step 406, if the scoring processing is triggered, performing accumulation calculation on the concentration degree score corresponding to the concentration degree identification result and the historical concentration degree score to obtain an integration result.
The historical concentration degree identification result is each concentration degree identification result in a preset time period before the current concentration degree identification result. The historical concentration degree score is a concentration degree score corresponding to the historical concentration degree identification result.
In an implementation, after determining that the current concentration degree identification result needs to be scored, the computer device performs an accumulation calculation according to the concentration degree score corresponding to the current concentration degree identification result and the historical concentration degree score (or referred to as a historical integral result, that is, an integral result obtained after the historical concentration degree scores are integrated), so as to obtain an integral result corresponding to the current concentration degree identification result.
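The scoring and accumulation described in steps 402-406 can be sketched as follows, using the 60-point threshold and 1-point award from the example above (both are configurable assumptions, as step 402 notes):

```python
def concentration_score(value, threshold=60, award=1):
    """Score a single concentration recognition value: `award` points above
    the threshold, 0 (no scoring operation) otherwise."""
    return award if value > threshold else 0

def accumulate_integral(history_total, value, threshold=60, award=1):
    """Add the current concentration score to the historical integral result
    to obtain the integral result for the current recognition result."""
    return history_total + concentration_score(value, threshold, award)
```

For example, a historical integral result of 29 and a current recognition value of 75 would yield an integral result of 30, which in the later example is exactly the first integration threshold.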
In one embodiment, as shown in fig. 5, in step 310, when the integration result is equal to the preset first integration threshold, the specific implementation process for determining the integrated emotion recognition result based on each emotion recognition result is as follows:
and 502, when the integral result is equal to a preset first integral threshold value, classifying and counting the emotion recognition results of the same emotion category according to the emotion category of the emotion recognition results in a plurality of emotion recognition results to obtain the number of emotion recognition results of each emotion category.
In an implementation, the computer device obtains a plurality of emotion recognition results, covering a plurality of emotion categories such as happy, sad, frightened, and angry, in the process of accumulating the integration result to the first integration threshold. When the integration result is equal to the preset first integration threshold, the computer device classifies and counts the emotion recognition results of the same emotion category according to the emotion category of each emotion recognition result, obtaining the number of emotion recognition results of each emotion category. For example, if the first integration threshold is 30, then when the integration result is equal to 30, the computer device has correspondingly obtained a number of emotion recognition results (for example, 50). According to the emotion categories, the computer device counts the emotion recognition results of each category; for example, the emotion recognition results include 3 emotion categories in total (happy, relaxed, and sad), and the numbers corresponding to the emotion recognition results of these categories are 25, 15, and 10, respectively.
And step 504, taking the emotion recognition result of the emotion category with the largest number as a comprehensive emotion recognition result corresponding to each emotion recognition result.
In an implementation, the computer device uses the emotion recognition result of the emotion category with the largest number as the comprehensive emotion recognition result corresponding to the emotion recognition results. For example, for the emotion recognition results of 3 emotion categories (happy, relaxed, and sad) with counts of 25, 15, and 10, respectively, the computer device takes the most frequent emotion recognition result, "happy", as the comprehensive emotion recognition result corresponding to the plurality of emotion recognition results.
Optionally, the number of the integrated emotion recognition results may be 1, or may be multiple, and the embodiment of the present application is not limited. For example, when a plurality of integrated emotion recognition results need to be determined, the computer device may sort the emotion recognition results of each emotion category from large to small, select the emotion recognition results of the first N (N is greater than or equal to 2) emotion categories after sorting as the integrated emotion recognition results, and at this time, may obtain a plurality of integrated emotion recognition results.
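The classify-count-and-select logic of steps 502-504, including the optional top-N variant, might be sketched as follows (the function name is illustrative):

```python
from collections import Counter

def composite_emotions(results, n=1):
    """Count emotion recognition results per category and return the n
    categories with the largest counts, in descending order of count."""
    counts = Counter(results)
    return [emotion for emotion, _ in counts.most_common(n)]
```

With the worked example above (25 "happy", 15 "relaxed", 10 "sad"), `n=1` yields "happy" as the comprehensive result, and `n=2` yields the top two categories.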
In an embodiment, the display content of the target display object may be an integral whole (that is, the target display object includes one display feature) or may be composed of multiple parts (that is, the target display object includes multiple display features), where a display feature is a feature for characterizing the display content of the target display object. If the target display object includes only one display feature (for example, a single shape such as a circle or a square), the display content of the target display object is an integral whole; the computer device stores the correspondence between comprehensive emotion recognition results and target display objects in advance, and can therefore determine the target display object in that correspondence based on the comprehensive emotion recognition result.
For the case where the display content of the target display object includes a plurality of display features, the correspondence between comprehensive emotion recognition results and display features can be stored in the computer device in advance. After determining the comprehensive emotion recognition results, the computer device determines the target display features in this correspondence based on the comprehensive emotion recognition results, so that the target display object is composed of the display features, improving the diversity of target display objects. As shown in fig. 6, in step 310, the specific implementation process of determining the target display object corresponding to the integrated emotion recognition result includes:
step 602, according to the preset display feature priority and each integrated emotion recognition result, determining the target display feature corresponding to each integrated emotion recognition result in the pre-stored corresponding relationship between the integrated emotion recognition result and the display feature in sequence.
In implementation, for the case where the display content of the target display object includes a plurality of types of display features, the priorities of the display feature types are set in the computer device in advance. Also, since each integrated emotion recognition result is determined based on the number of emotion recognition results of the corresponding emotion category, there is also a priority order among the integrated emotion recognition results. The computer device then, following the priority order of the display features, determines for each integrated emotion recognition result the target display feature in the correspondence between the display features of the corresponding type and the emotion recognition results. For example, suppose the display content of the target display object includes display feature A, display feature B, and display feature C, where display feature A has priority over display feature B and display feature B has priority over display feature C, and the integrated emotion recognition results determined by the computer device include a first, a second, and a third integrated emotion recognition result in descending priority. The computer device then determines the display feature of type A according to the first integrated emotion recognition result, that is, determines the target display feature A in the correspondence between the display features A and the integrated emotion recognition results; similarly, it determines the target display feature B in the correspondence between the display features B and the integrated emotion recognition results according to the second integrated emotion recognition result, and determines the target display feature C according to the third integrated emotion recognition result.
Step 604, determining the target display object according to the target display characteristics.
In implementation, the computer device determines the target display object according to the target display characteristics. For example, when the target display object is a virtual avatar, the virtual avatar is composed of three display features of skin color, hair color, and facial expression. Therefore, when the target display characteristics determined by the computer device are yellow skin, black hair and smile expression, the final virtual head portrait (i.e. the target display object) can be determined according to the target display characteristics.
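Under the assumption that both the display-feature slots and the comprehensive emotion recognition results are already in priority order, the pairing described above could be sketched as follows (the tables, slot names, and labels are hypothetical):

```python
def assemble_display_object(ordered_emotions, feature_tables):
    """Pair priority-ordered comprehensive emotion results with
    priority-ordered feature tables and pick one display feature per slot.

    `feature_tables` is an ordered mapping slot -> {emotion: feature};
    the i-th emotion selects the feature for the i-th slot."""
    return {
        slot: table[emotion]
        for (slot, table), emotion in zip(feature_tables.items(), ordered_emotions)
    }
```

With the avatar example above, the first result would select the skin colour, the second the hair colour, and the third the facial expression.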
In one embodiment, as shown in fig. 7, the specific implementation process of step 312 includes:
and step 702, when the integration result is equal to a preset second integration threshold, determining target position information of the current movement of each display particle according to the movement information corresponding to each display particle.
In implementation, when the integration result is equal to the preset second integration threshold, the computer device determines the target position information where each display particle currently moves according to the motion information corresponding to each display particle. For example, the second integration threshold is 100, in the process of accumulating the integration result to 100, each display particle moves according to the motion information (including the motion track and/or the motion range) determined by each integration result, and the motion trend shows a gradual aggregation state, when the integration result is equal to 100, the computer device determines the current motion information of each display particle, and further determines the target position information where each display particle currently moves according to the current motion information.
Step 704, determining target display information corresponding to each display particle according to the target position information and the display information included in the target display object.
In an implementation, the target display object includes display information; for example, the display information may be pixel information, gray scale information, and the like. The display information may be divided into a plurality of pieces of target display information to match the display particles, and the computer device matches the corresponding target display information to each display particle according to its position information, so that the entire target display object is displayed by outputting the spliced target display information of the display particles.
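A minimal sketch of matching target display information to particles by position, assuming the display information is a pixel grid and particle positions are (row, column) coordinates (the application does not fix these representations):

```python
def assign_particle_pixels(particle_positions, image):
    """Map each particle's current (row, col) position to the pixel of the
    target image it should display, so the particles jointly tile the image."""
    rows, cols = len(image), len(image[0])
    assignments = {}
    for pid, (r, c) in particle_positions.items():
        # Clamp to the image bounds in case a particle drifted slightly outside.
        rr = min(max(int(r), 0), rows - 1)
        cc = min(max(int(c), 0), cols - 1)
        assignments[pid] = image[rr][cc]
    return assignments
```

Splicing the per-particle assignments back together in position order then reproduces the whole target display object.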
In one embodiment, an example of a brain electrical data display method is provided, as shown in fig. 8, the example method comprising the steps of:
step 801, acquiring an electroencephalogram data identification result and outputting preset display particles, wherein the electroencephalogram data identification result comprises a plurality of emotion identification results and a plurality of concentration identification results.
And step 802, score and judge the concentration degree identification result: if the concentration degree identification result is greater than 60, count a concentration degree score of 1 point, and if it is less than or equal to 60, do not score. Accumulate the concentration degree scores corresponding to the concentration degree identification results to obtain an integration result, and determine the current motion information of each display particle according to the integration result, so that each display particle moves according to the motion information, which is represented as the display particles gradually gathering as the integration result increases.
When the integration result reaches 30, a comprehensive emotion recognition result (i.e., an average state of emotion results) is determined among the emotion recognition results acquired in the integration process, and the particles are further aggregated as the integration result increases, step 803.
And step 804, determining a target display object (image drawing) according to the comprehensive emotion recognition result.
And step 805, when the integration result reaches 100, determining the target display information of each display particle according to the determined target display object and the motion information of each display particle, and outputting a complete target display object (image drawing) according to the target display information of each display particle.
In one embodiment, as shown in fig. 9, there is provided an electroencephalogram data display apparatus including: an acquisition display module 910, a first determination module 920, an integration module 930, a processing module 940, a second determination module 950, and a third determination module 960, wherein,
the acquiring and displaying module 910 is configured to acquire an electroencephalogram data identification result corresponding to an electroencephalogram data stream, where the electroencephalogram data identification result includes a plurality of emotion identification results and a plurality of concentration identification results;
a first determining module 920, configured to determine concentration scores corresponding to the plurality of concentration recognition results, respectively;
an integration module 930, configured to accumulate concentration degree scores corresponding to the plurality of concentration degree identification results to obtain an integration result;
a processing module 940, configured to determine current motion information of each display particle according to the integration result, control each display particle to move within a preset motion range based on the motion information; the display particles are initial display objects output in advance in a display interface;
a second determining module 950, configured to determine a comprehensive emotion recognition result based on each emotion recognition result and determine a target display object corresponding to the comprehensive emotion recognition result when the integration result is equal to a preset first integration threshold;
a third determining module 960, configured to determine, when the integration result is equal to a preset second integration threshold, a target display position and target display information corresponding to each display particle according to display information included in the target display object and the current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object.
In an exemplary embodiment, the emotion recognition results include happiness, sadness, fear, and anger; the concentration degree identification result is a numerical value representing the magnitude of the concentration degree.
In one embodiment, the first determining module 920 is specifically configured to determine, according to the concentration recognition result and a preset concentration threshold, a concentration score corresponding to the concentration recognition result;
the integral module 930, configured to determine whether the concentration degree recognition result triggers scoring processing according to the concentration degree score corresponding to the concentration degree recognition result;
and if the scoring processing is triggered, performing accumulation calculation on the concentration degree score corresponding to the concentration degree identification result and the historical concentration degree score to obtain an integration result.
In one embodiment, the second determining module 950 is specifically configured to determine a comprehensive emotion recognition result based on each emotion recognition result and determine a target display object corresponding to the comprehensive emotion recognition result when the integration result is equal to a preset first integration threshold;
and when the integration result is equal to a preset second integration threshold value, determining a target display position and target display information corresponding to each display particle according to display information contained in the target display object and the current motion information of each display particle.
In one embodiment, the second determining module 950 is specifically configured to, when the integration result is equal to a preset first integration threshold, perform quantity statistics on the emotion recognition results of the same emotion category according to the emotion category of the emotion recognition results in a plurality of emotion recognition results, so as to obtain a quantity of emotion recognition results of each emotion category;
and taking the emotion recognition result of the emotion category with the largest number as a comprehensive emotion recognition result corresponding to each emotion recognition result.
In one embodiment, the target display object includes a plurality of display features, where the display features are features used for representing display contents of the target display object, and the second determining module 950 is further configured to determine, according to a preset display feature priority and each of the integrated emotion recognition results, target display features corresponding to each of the integrated emotion recognition results in a correspondence between a pre-stored integrated emotion recognition result and a display feature in sequence;
and determining a target display object according to each target display characteristic.
In one embodiment, the third determining module 960 is further configured to determine, according to the motion information corresponding to each display particle, the target position information where each display particle is currently moving when the integration result is equal to a preset second integration threshold;
and determining target display information corresponding to each display particle according to the target position information and the display information contained in the target display object.
By adopting the device, the motion information of the display particles is determined through the concentration degree identification results, and the corresponding target display object is determined through the emotion recognition results; then, the display of the target display object by the display particles is realized according to the display information of the target display object and the motion information of each display particle. In this way, the concentration degree identification results and the emotion recognition results contained in the electroencephalogram data are visualized, and interactive control of the display effect through electroencephalogram data is realized.
All or part of the modules in the electroencephalogram data display device can be realized through software, hardware and a combination of the software and the hardware. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 10. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The communication interface of the computer device is used for communicating with an external terminal in a wired or wireless manner, and the wireless manner can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a method of displaying electroencephalogram data. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 10 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program; the processor implements the steps of the electroencephalogram data display method described above when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the steps of the electroencephalogram data display method described above.
In one embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the electroencephalogram data display method described above.
It should be noted that the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase-change memory (PCM), graphene memory, and the like. Volatile memory can include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases; the non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the various embodiments provided herein may be, without limitation, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, or the like.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of the present specification.
The above-mentioned embodiments only express several embodiments of the present application, and their description is relatively specific and detailed, but this should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts in the embodiments are referred to each other. For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The method and system of the present invention may be implemented in a number of ways. For example, the methods and systems of the present invention may be implemented in software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustrative purposes only, and the steps of the method of the present invention are not limited to the order specifically described above unless specifically indicated otherwise. Furthermore, in some embodiments, the present invention may also be embodied as a program recorded in a recording medium, the program including machine-readable instructions for implementing a method according to the present invention. Thus, the present invention also covers a recording medium storing a program for executing the method according to the present invention.
The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the invention to the form disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, and to enable others of ordinary skill in the art to understand the invention in its various embodiments and with the various modifications suited to the particular use contemplated.

Claims (10)

1. A method for displaying electroencephalogram data, the method comprising:
acquiring an electroencephalogram data identification result corresponding to an electroencephalogram data stream, wherein the electroencephalogram data identification result comprises a plurality of emotion identification results and a plurality of concentration identification results;
determining concentration degree scores corresponding to the plurality of concentration degree identification results respectively;
accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain an integral result;
determining current motion information of each display particle according to the integral result, and controlling each display particle to move within a preset motion range based on the motion information; the display particles are initial display objects output in advance in a display interface;
when the integration result is equal to a preset first integration threshold value, determining a comprehensive emotion recognition result based on each emotion recognition result, and determining a target display object corresponding to the comprehensive emotion recognition result;
when the integration result is equal to a preset second integration threshold value, determining a target display position and target display information corresponding to each display particle according to display information contained in the target display object and the current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object.
2. The method of claim 1, wherein the emotion recognition result includes happiness, sadness, fear, and anger; the concentration degree identification result is a concentration degree numerical value representing the concentration degree.
3. The method of claim 1, wherein determining the concentration scores corresponding to the respective concentration recognition results comprises:
determining a concentration degree score corresponding to the concentration degree identification result according to the concentration degree identification result and a preset concentration degree threshold value;
the said concentration degree score to a plurality of the corresponding of the identification result of the said concentration degree is accumulated, get the integral result, include:
determining whether the concentration degree recognition result triggers scoring processing according to the concentration degree score corresponding to the concentration degree recognition result;
and if the scoring processing is triggered, accumulating and calculating the concentration degree score corresponding to the concentration degree identification result and the historical concentration degree score to obtain an integral result.
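By way of illustration only (not part of the claims), the scoring and accumulation of claim 3 can be sketched as follows; the threshold value and the one-point-per-trigger rule are hypothetical, since the claim leaves the concrete scoring rule open.

```python
def concentration_score(concentration_value: float, threshold: float = 60.0) -> int:
    """Hypothetical scoring rule: one point when the concentration value
    reaches the preset concentration threshold, zero otherwise."""
    return 1 if concentration_value >= threshold else 0

def accumulate(history_score: int, concentration_value: float,
               threshold: float = 60.0) -> int:
    """Add a new concentration score onto the historical concentration
    score only when the score triggers scoring processing."""
    score = concentration_score(concentration_value, threshold)
    if score > 0:            # scoring processing is triggered
        return history_score + score
    return history_score     # not triggered: integral result unchanged
```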
4. The method of claim 1, wherein determining a composite emotion recognition result based on each emotion recognition result when the integration result is equal to a preset first integration threshold comprises:
when the integration result is equal to a preset first integration threshold value, counting the number of the emotion recognition results of the same emotion type according to the emotion type of the emotion recognition result in a plurality of emotion recognition results to obtain the number of the emotion recognition results of each emotion type;
and taking the emotion recognition result of the emotion category with the largest number as a comprehensive emotion recognition result corresponding to each emotion recognition result.
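The category counting of claim 4 amounts to a majority vote over the emotion recognition results. A minimal illustrative sketch (not part of the claims; tie-breaking by first occurrence is one possible choice the claim does not mandate):

```python
from collections import Counter

def comprehensive_emotion(emotion_results: list[str]) -> str:
    """Count the recognition results per emotion category and return the
    category with the largest count (ties resolved by first occurrence)."""
    return Counter(emotion_results).most_common(1)[0][0]
```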
5. The method of claim 1, wherein the target display object comprises a plurality of display features, and the display features are features for characterizing display contents of the target display object, and the determining the target display object corresponding to the integrated emotion recognition result comprises:
according to preset display feature priority and each comprehensive emotion recognition result, sequentially determining target display features corresponding to each comprehensive emotion recognition result in a pre-stored corresponding relation between the comprehensive emotion recognition result and the display features;
and determining a target display object according to each target display characteristic.
6. The method according to claim 1, wherein when the integration result is equal to a preset second integration threshold, determining a target display position and target display information corresponding to each display particle according to display information included in the target display object and the current motion information of each display particle comprises:
when the integration result is equal to a preset second integration threshold value, determining target position information of the current movement of each display particle according to the movement information corresponding to each display particle;
and determining target display information corresponding to each display particle according to the target position information and the display information contained in the target display object.
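One way to realize the per-particle target positions of claim 6 is a nearest-point assignment between the moving particles and the points of the target display object. This is an illustrative sketch only; the claim does not prescribe an assignment rule.

```python
def assign_particle_targets(particles, object_points):
    """Map each display particle (given by its current position) to the
    nearest still-unassigned point of the target display object, yielding
    the target display position for each particle."""
    assignments = []
    remaining = list(object_points)
    for px, py in particles:
        # Pick the closest unassigned object point (squared distance).
        best = min(remaining, key=lambda q: (q[0] - px) ** 2 + (q[1] - py) ** 2)
        remaining.remove(best)
        assignments.append(((px, py), best))
    return assignments
```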
7. An electroencephalogram data display device, characterized in that said device comprises:
the acquiring and displaying module is used for acquiring an electroencephalogram data identification result corresponding to the electroencephalogram data stream, and the electroencephalogram data identification result comprises a plurality of emotion identification results and a plurality of concentration identification results;
the first determination module is used for determining concentration degree scores corresponding to the plurality of concentration degree identification results respectively;
the integral module is used for accumulating the concentration degree scores corresponding to the plurality of concentration degree identification results to obtain integral results;
the processing module is used for determining the current motion information of each display particle according to the integration result, and controlling each display particle to move within a preset motion range based on the motion information; the display particles are initial display objects output in advance in a display interface;
the second determination module is used for determining a comprehensive emotion recognition result based on each emotion recognition result and determining a target display object corresponding to the comprehensive emotion recognition result when the integration result is equal to a preset first integration threshold;
a third determining module, configured to determine, when the integration result is equal to a preset second integration threshold, a target display position and target display information corresponding to each display particle according to display information included in the target display object and the current motion information of each display particle; and the target display position and the target display information corresponding to each display particle are used for presenting a target display object.
8. An electroencephalogram data display system, comprising:
the brain-computer equipment is used for acquiring an electroencephalogram data stream of a target object;
a computer device for determining, from said stream of brain electrical data, a result of brain electrical data recognition of said target object and for performing the steps of the method of any one of claims 1 to 6;
and the display is used for displaying the display particles output by the computer equipment and the target display object.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
CN202211291114.7A 2022-10-21 2022-10-21 Electroencephalogram data display method, device, system, computer device and storage medium Active CN115357154B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211291114.7A CN115357154B (en) 2022-10-21 2022-10-21 Electroencephalogram data display method, device, system, computer device and storage medium


Publications (2)

Publication Number Publication Date
CN115357154A CN115357154A (en) 2022-11-18
CN115357154B true CN115357154B (en) 2023-01-03

Family

ID=84008366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211291114.7A Active CN115357154B (en) 2022-10-21 2022-10-21 Electroencephalogram data display method, device, system, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN115357154B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012115294A1 (en) * 2011-02-24 2012-08-30 주식회사 메디오피아테크 Ubiquitous-learning middleware device for generating study emotion index related to study concentration level from bio-signal emotion index and context information
WO2019082687A1 (en) * 2017-10-27 2019-05-02 ソニー株式会社 Information processing device, information processing method, program, and information processing system
CN110169770A (en) * 2019-05-24 2019-08-27 西安电子科技大学 The fine granularity visualization system and method for mood brain electricity
CN113080998A (en) * 2021-03-16 2021-07-09 北京交通大学 Electroencephalogram-based concentration state grade assessment method and system
CN114847975A (en) * 2022-04-28 2022-08-05 脑陆(重庆)智能科技研究院有限公司 Electroencephalogram data processing method, device, system, computer device and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Affect representation and recognition in 3D continuous valence-arousal-dominance space; Gyanendra K Verma et al.; Multimedia Tools and Applications; 2016-01-08; pp. 2159-2183 *


Similar Documents

Publication Publication Date Title
Tiwari et al. An Efficient Classification Technique For Automatic Identification of Emotions Leading To Stress
Abreu et al. Evaluating sign language recognition using the myo armband
Xu et al. Affective states classification using EEG and semi-supervised deep learning approaches
Zen et al. Learning personalized models for facial expression analysis and gesture recognition
Zhao et al. Mobile user authentication using statistical touch dynamics images
US20220277596A1 (en) Face anti-spoofing recognition method and apparatus, device, and storage medium
Agarwal et al. Anubhav: recognizing emotions through facial expression
CN107633203A (en) Facial emotions recognition methods, device and storage medium
Cruz et al. Vision and attention theory based sampling for continuous facial emotion recognition
CN105917305B (en) Filtering and shutter shooting based on image emotion content
US11837061B2 (en) Techniques to provide and process video data of automatic teller machine video streams to perform suspicious activity detection
Saini et al. Don’t just sign use brain too: A novel multimodal approach for user identification and verification
CN110298212B (en) Model training method, emotion recognition method, expression display method and related equipment
Yang et al. Real-time facial expression recognition based on edge computing
Zhang et al. Representation of facial expression categories in continuous arousal–valence space: feature and correlation
Wolfe et al. High precision screening for Android malware with dimensionality reduction
Sharifnejad et al. Facial expression recognition using a combination of enhanced local binary pattern and pyramid histogram of oriented gradients features extraction
Miah et al. Movie Oriented Positive Negative Emotion Classification from EEG Signal using Wavelet transformation and Machine learning Approaches
de Mijolla et al. Human-interpretable model explainability on high-dimensional data
CN115357154B (en) Electroencephalogram data display method, device, system, computer device and storage medium
KR20200010061A (en) Learning method of electronic apparatus and electronic apparatus
Fonnegra et al. Deep learning based video spatio-temporal modeling for emotion recognition
Khadatkar et al. Occlusion invariant face recognition system
CN115373519A (en) Electroencephalogram data interactive display method, device and system and computer equipment
Mahmoodi et al. SDD: A skin detection dataset for training and assessment of human skin classifiers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant