CN115268747B - Brain-computer interface data processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115268747B
Authority
CN
China
Prior art keywords
frequency
target
stimulation
candidate
stimulation target
Prior art date
Legal status
Active
Application number
CN202210882050.1A
Other languages
Chinese (zh)
Other versions
CN115268747A (en
Inventor
陈小刚
崔红岩
李萌
Current Assignee
Boruikang Technology Beijing Co ltd
Original Assignee
Institute of Biomedical Engineering of CAMS and PUMC
Priority date
Filing date
Publication date
Application filed by Institute of Biomedical Engineering of CAMS and PUMC
Priority to CN202210882050.1A
Publication of CN115268747A
Application granted
Publication of CN115268747B

Classifications

    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/013 Eye tracking input arrangements
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The application provides a brain-computer interface data processing method and apparatus, an electronic device, and a storage medium. The method includes: displaying candidate stimulation targets on a display screen at a first luminance conversion frequency and a first motion frequency; in response to the user gazing at a candidate stimulation target, acquiring the electroencephalogram signals recorded while the user gazes at the candidate stimulation target; and determining, based on the electroencephalogram signals, the target stimulation target at which the user is gazing. In this way, the number of stimulation targets that can be encoded in brain-computer interface data processing can be increased.

Description

Brain-computer interface data processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to brain-computer interface technologies, and in particular, to a method and an apparatus for processing data of a brain-computer interface, an electronic device, and a storage medium.
Background
With the rapid development and broad adoption of brain-computer interface technology, brain-computer interface data processing has become a mainstream application of the technology. However, existing brain-computer interface data processing methods are limited by the number of available luminance conversion frequencies, so only a small number of stimulation targets can be encoded; it is therefore desirable to increase the number of encodable stimulation targets.
Encoding stimulation targets intelligently, so as to increase the number of stimulation targets that can be encoded in brain-computer interface data processing, therefore remains an ongoing goal.
Disclosure of Invention
The embodiment of the application provides a brain-computer interface data processing method and device, electronic equipment and a storage medium.
According to a first aspect of the present application, there is provided a brain-computer interface data processing method, the method comprising: displaying candidate stimulation targets on a display screen at a first luminance conversion frequency and a first motion frequency; in response to the user gazing at a candidate stimulation target, acquiring the electroencephalogram signals recorded while the user gazes at the candidate stimulation target; and determining, based on the electroencephalogram signals, the target stimulation target at which the user is gazing.
According to an embodiment of the present application, the candidate stimulation target comprises a rectangular frame containing a rectangular stimulation block, and displaying the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency comprises: displaying the flickering stimulation block on the display screen at the first luminance conversion frequency; and displaying the stimulation block moving within the rectangular frame on the display screen at the first motion frequency.
According to an embodiment of the present application, displaying the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency further comprises: simultaneously displaying N candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency, where N is a positive integer.
According to an embodiment of the present application, displaying the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency comprises: configuring a plurality of different first motion frequencies, one for each of the candidate stimulation targets, while keeping the first luminance conversion frequency of the candidate stimulation targets unchanged; determining the first luminance conversion frequency and the first motion frequency of each of the candidate stimulation targets; and displaying the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequencies.
According to an embodiment of the application, acquiring, in response to the user gazing at a candidate stimulation target, the electroencephalogram signals recorded while the user gazes at the candidate stimulation target comprises: in response to the user gazing at the candidate stimulation target, generating the electroencephalogram signal, the electroencephalogram signal comprising a steady-state visual evoked potential with specific intermodulation frequency components; determining the specific intermodulation frequency components corresponding to the candidate stimulation target based on the first luminance conversion frequency and the first motion frequency, wherein among the N candidate stimulation targets the specific intermodulation frequency components corresponding to each candidate stimulation target are different, N being a positive integer; and acquiring the electroencephalogram signals generated while the user gazes at the candidate stimulation target.
According to an embodiment of the present application, determining the specific intermodulation frequency components corresponding to the candidate stimulation target based on the first luminance conversion frequency and the first motion frequency comprises: multiplying the first luminance conversion frequency by its harmonic number to obtain integer-multiple harmonics of the first luminance conversion frequency; multiplying the first motion frequency by its harmonic number to obtain integer-multiple harmonics of the first motion frequency, the harmonic numbers of the first luminance conversion frequency and of the first motion frequency being positive integers; and adding or subtracting the integer-multiple harmonics of the first luminance conversion frequency and the integer-multiple harmonics of the first motion frequency to determine the specific intermodulation frequency components.
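As a concrete illustration of the harmonic sum-and-difference rule above (a minimal sketch, not part of the patent text; the function name, harmonic limit, and example frequencies are assumptions):

```python
def intermodulation_components(f_lum, f_mot, max_harmonic=2):
    """Intermodulation frequencies n*f_lum +/- m*f_mot for harmonic
    numbers n, m = 1..max_harmonic (both positive integers)."""
    components = set()
    for n in range(1, max_harmonic + 1):      # integer-multiple harmonics of the luminance frequency
        for m in range(1, max_harmonic + 1):  # integer-multiple harmonics of the motion frequency
            components.add(round(n * f_lum + m * f_mot, 6))
            components.add(round(abs(n * f_lum - m * f_mot), 6))
    return components

# e.g. a 15 Hz luminance conversion frequency and a 0.2 Hz motion frequency
print(sorted(intermodulation_components(15.0, 0.2, max_harmonic=1)))  # -> [14.8, 15.2]
```

Because each candidate target uses a different motion frequency, each target's set of intermodulation components is different even though the luminance conversion frequency is shared.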
According to an embodiment of the application, determining, based on the electroencephalogram signals, the target stimulation target at which the user is gazing comprises: preprocessing the electroencephalogram signals and determining the steady-state visual evoked potential electroencephalogram signal; performing feature extraction on the steady-state visual evoked potential electroencephalogram signal and determining the specific intermodulation frequency components corresponding to the target stimulation target therein; and determining the target stimulation target at which the user is gazing based on the specific intermodulation frequency components corresponding to the target stimulation target and those corresponding to the candidate stimulation targets.
According to an embodiment of the application, determining the target stimulation target at which the user is gazing based on the specific intermodulation frequency components corresponding to the target stimulation target and those corresponding to the candidate stimulation targets comprises: comparing the specific intermodulation frequency components corresponding to the target stimulation target with the specific intermodulation frequency components corresponding to each candidate stimulation target, and determining the candidate stimulation target whose specific intermodulation frequency components have the greatest correlation; and taking that candidate stimulation target as the target stimulation target at which the user is gazing.
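The correlation comparison can be sketched as follows. This is a simplified stand-in, not the patent's method: a single-channel EEG segment is projected onto sine/cosine references at each candidate's intermodulation frequencies via least squares, in place of a full canonical correlation analysis; all names and parameters are assumptions.

```python
import numpy as np

def im_reference(freqs, fs, n_samples):
    """Sine/cosine reference matrix at the given intermodulation frequencies."""
    t = np.arange(n_samples) / fs
    cols = []
    for f in freqs:
        cols.append(np.sin(2 * np.pi * f * t))
        cols.append(np.cos(2 * np.pi * f * t))
    return np.column_stack(cols)

def target_score(eeg, freqs, fs):
    """Correlation between the EEG segment and its least-squares projection
    onto the reference subspace for one candidate's intermodulation set."""
    Y = im_reference(freqs, fs, len(eeg))
    x = np.asarray(eeg, dtype=float)
    x = x - x.mean()
    coef, *_ = np.linalg.lstsq(Y, x, rcond=None)
    x_hat = Y @ coef
    return float(np.corrcoef(x, x_hat)[0, 1])

def classify(eeg, candidate_freq_sets, fs):
    """Index of the candidate whose intermodulation components correlate most."""
    scores = [target_score(eeg, freqs, fs) for freqs in candidate_freq_sets]
    return int(np.argmax(scores))
```

The candidate with the largest score plays the role of "the candidate stimulation target with the maximum correlation of the specific intermodulation frequency components" described above.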
According to a second aspect of the present application, there is provided a brain-computer interface data processing apparatus comprising: a display module configured to display candidate stimulation targets on a display screen at a first luminance conversion frequency and a first motion frequency; an acquisition module configured to acquire, in response to the user gazing at a candidate stimulation target, the electroencephalogram signals recorded while the user gazes at the candidate stimulation target; and a determining module configured to determine, based on the electroencephalogram signals, the target stimulation target at which the user is gazing.
According to an embodiment of the present application, the candidate stimulation target includes a rectangular frame including a rectangular stimulation block therein; the display module is used for: displaying the flickering stimulus block on a display screen at the first luminance conversion frequency; displaying the stimulation block moving within the rectangular frame on a display screen at the first motion frequency.
According to an embodiment of the present application, the display module is further configured to: simultaneously display N candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency, where N is a positive integer.
According to an embodiment of the present application, the display module is configured to: configure a plurality of different first motion frequencies, one for each of the candidate stimulation targets, while keeping the first luminance conversion frequency of the candidate stimulation targets unchanged; determine the first luminance conversion frequency and the first motion frequency of each of the candidate stimulation targets; and display the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequencies.
According to an embodiment of the application, the electroencephalogram signal is generated in response to the user gazing at the candidate stimulation target and comprises a steady-state visual evoked potential with specific intermodulation frequency components; the acquisition module is configured to: determine the specific intermodulation frequency components corresponding to the candidate stimulation target based on the first luminance conversion frequency and the first motion frequency, wherein among the N candidate stimulation targets the specific intermodulation frequency components corresponding to each candidate stimulation target are different, N being a positive integer; and acquire the electroencephalogram signals generated while the user gazes at the candidate stimulation target.
According to an embodiment of the present application, the acquisition module is configured to: multiply the first luminance conversion frequency by its harmonic number to obtain integer-multiple harmonics of the first luminance conversion frequency; multiply the first motion frequency by its harmonic number to obtain integer-multiple harmonics of the first motion frequency, the harmonic numbers of the first luminance conversion frequency and of the first motion frequency being positive integers; and add or subtract the integer-multiple harmonics of the first luminance conversion frequency and the integer-multiple harmonics of the first motion frequency to determine the specific intermodulation frequency components.
According to an embodiment of the application, the determining module is configured to: preprocess the electroencephalogram signals and determine the steady-state visual evoked potential electroencephalogram signal; perform feature extraction on the steady-state visual evoked potential electroencephalogram signal and determine the specific intermodulation frequency components corresponding to the target stimulation target therein; and determine the target stimulation target at which the user is gazing based on the specific intermodulation frequency components corresponding to the target stimulation target and those corresponding to the candidate stimulation targets.
According to an embodiment of the application, the determining module is configured to: compare the specific intermodulation frequency components corresponding to the target stimulation target with the specific intermodulation frequency components corresponding to each candidate stimulation target, determine the candidate stimulation target whose specific intermodulation frequency components have the greatest correlation, and take that candidate stimulation target as the target stimulation target at which the user is gazing.
According to a third aspect of the present application, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method described herein.
According to a fourth aspect of the present application, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method described herein.
According to the present application, candidate stimulation targets are displayed on a display screen at a first luminance conversion frequency and a first motion frequency; in response to the user gazing at a candidate stimulation target, the electroencephalogram signals recorded while the user gazes at the candidate stimulation target are acquired; and the target stimulation target at which the user is gazing is determined based on the electroencephalogram signals. In this way, the stimulation targets can be encoded intelligently, increasing the number of stimulation targets that can be encoded in brain-computer interface data processing.
It is to be understood that the teachings of this application do not require all of the above benefits to be achieved; rather, particular technical solutions may achieve particular technical benefits, and other embodiments of the application may achieve benefits not mentioned above.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description, which proceeds with reference to the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
in the drawings, like or corresponding reference characters designate like or corresponding parts.
Fig. 1 is a first process flow diagram illustrating a brain-computer interface data processing method provided by an embodiment of the present application;
fig. 2 is a schematic diagram illustrating a process flow of determining a specific intermodulation frequency component corresponding to a candidate stimulation target based on a first luminance transformation frequency and a first motion frequency provided by an embodiment of the present application;
fig. 3 is a second process flow diagram illustrating the brain-computer interface data processing method provided by an embodiment of the present application;
fig. 4 is a third process flow diagram illustrating the brain-computer interface data processing method provided by an embodiment of the present application;
fig. 5 is a fourth process flow diagram illustrating the brain-computer interface data processing method provided by an embodiment of the present application;
fig. 6 is a fifth process flow diagram illustrating the brain-computer interface data processing method provided by an embodiment of the present application;
fig. 7 is a sixth process flow diagram illustrating the brain-computer interface data processing method provided by an embodiment of the present application;
fig. 8 is a diagram illustrating an application scenario of the brain-computer interface data processing method according to the embodiment of the present application;
fig. 9 is a diagram illustrating another application scenario of a brain-computer interface data processing method provided by an embodiment of the present application;
fig. 10 is a diagram illustrating another application scenario of a brain-computer interface data processing method provided by an embodiment of the present application;
fig. 11 is a diagram illustrating another application scenario of the brain-computer interface data processing method according to the embodiment of the present application;
fig. 12 is an alternative schematic diagram of a brain-computer interface data processing apparatus provided in an embodiment of the present application;
fig. 13 shows a schematic structural diagram of a component of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, features and advantages of the present application clearer and easier to understand, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. It is apparent that the described embodiments are only a part, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
A brain-computer interface is a novel communication and control channel, independent of peripheral nerves and muscle tissue, established between the brain of a human or animal and a computer or other electronic device.
In the related art, brain-computer interface data processing is limited by the number of available luminance conversion frequencies, so only a small number of stimulation targets can be encoded during brain-computer interface data processing.
To address this problem, the method provided by the embodiments of the present application displays candidate stimulation targets on a display screen at a first luminance conversion frequency and a first motion frequency; in response to the user gazing at a candidate stimulation target, acquires the electroencephalogram signals recorded while the user gazes at the candidate stimulation target; and determines, based on the electroencephalogram signals, the target stimulation target at which the user is gazing. Unlike related-art approaches, which are limited by the number of luminance conversion frequencies, this method encodes stimulation targets with intermodulation frequencies: by introducing a plurality of different motion frequencies, different stimulation targets can be encoded using only one luminance conversion frequency combined with different motion frequencies, thereby increasing the number of stimulation targets that can be encoded in brain-computer interface data processing.
A processing flow of the brain-computer interface data processing method provided in the embodiments of the present application is explained below. Referring to fig. 1, fig. 1 is a first process flow diagram of the brain-computer interface data processing method according to an embodiment of the present application, described with reference to steps S101 to S104 shown in fig. 1.
Step S101, displaying candidate stimulation targets on a display screen at a first luminance conversion frequency and a first motion frequency.
In some embodiments, the candidate stimulation target may comprise a rectangular frame containing a rectangular stimulation block. The first luminance conversion frequency may be the frequency at which the gray value of the candidate stimulation target changes, the gray value varying over time as a sine wave at the first luminance conversion frequency; in particular, it may be the frequency at which the gray value of the stimulation block changes. The first motion frequency may be the frequency at which the position of the candidate stimulation target changes, the position varying over time as a sine wave at the first motion frequency; in particular, it may be the frequency at which the position of the stimulation block changes within the rectangular frame, the rectangular frame itself remaining stationary. The first motion frequency is preferably 0.1 Hz to 0.9 Hz, with the rectangular frame kept static and the position of the stimulation block within the frame preferably oscillating left and right as a sine wave at the first motion frequency. The first luminance conversion frequency is preferably 15 Hz, with the gray value of the stimulation block preferably varying as a sine wave over the gray-value interval 0 to 255 at the first luminance conversion frequency.
In specific implementation, a rectangular frame is displayed on a display screen, a rectangular stimulation block is included in the rectangular frame, and the rectangular frame is used for limiting the position transformation range of the stimulation block. The rectangular frame is kept static, the position of the stimulation block in the rectangular frame changes along with time at a first motion frequency according to a sine wave rule, and meanwhile the gray value of the stimulation block changes along with time at a first brightness conversion frequency according to the sine wave rule.
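The per-frame modulation described above can be sketched as follows (an illustrative assumption, not the patent's implementation; the frame rate, pixel amplitude, and function name are invented for the example):

```python
import numpy as np

def stimulus_frames(f_lum, f_mot, fs_screen=60.0, duration=1.0, amplitude_px=40):
    """Per-frame gray value (0-255, sine wave at the luminance conversion
    frequency f_lum) and horizontal offset in pixels (sine wave at the
    motion frequency f_mot) of the stimulation block; the surrounding
    rectangular frame stays static."""
    t = np.arange(int(fs_screen * duration)) / fs_screen
    gray = 255 * (1 + np.sin(2 * np.pi * f_lum * t)) / 2
    x_offset = amplitude_px * np.sin(2 * np.pi * f_mot * t)
    return gray, x_offset
```

On each frame, the stimulation block would be drawn at its base position plus `x_offset` with gray level `gray`, while the enclosing rectangular frame is drawn at a fixed position.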
In some embodiments, displaying the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency may further include: simultaneously displaying N candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency, where N is a positive integer; the embodiments of the present application do not limit the specific number of candidate stimulation targets.
In a specific implementation, N rectangular frames are displayed on the display screen, each containing a rectangular stimulation block. The N rectangular frames are kept static; the positions of the N stimulation blocks within their rectangular frames vary over time as sine waves at the first motion frequency, while the gray values of the N stimulation blocks vary over time as sine waves at the first luminance conversion frequency.
In some embodiments, displaying the candidate stimulation targets on the display screen at the first luminance conversion frequency and the first motion frequency may include: configuring a plurality of different first motion frequencies, one for each candidate stimulation target, while keeping the first luminance conversion frequency of the candidate stimulation targets unchanged; determining the first luminance conversion frequency and the first motion frequency of each candidate stimulation target; and displaying the candidate stimulation targets on the display screen at those frequencies.
By way of example, candidate stimulation targets include: a stimulus target 1, a stimulus target 2, and a stimulus target 3. The stimulation target 1 comprises a rectangular frame 1, and a stimulation block 1 is arranged in the rectangular frame; the stimulation target 2 comprises a rectangular frame 2, and a stimulation block 2 is arranged in the rectangular frame; the stimulation target 3 comprises a rectangular frame 3 within which the stimulation block 3 is contained. While the 15Hz first luminance conversion frequency of each of the stimulation target 1, the stimulation target 2, and the stimulation target 3 is kept constant, the first movement frequency of 0.1Hz is assigned to the stimulation target 1, the first movement frequency of 0.2Hz is assigned to the stimulation target 2, and the first movement frequency of 0.3Hz is assigned to the stimulation target 3, respectively.
The first luminance conversion frequency and the first motion frequency of stimulation target 1, stimulation target 2 and stimulation target 3 are thus determined. Rectangular frame 1, rectangular frame 2 and rectangular frame 3 are displayed on the display screen, containing stimulation block 1, stimulation block 2 and stimulation block 3 respectively. The three rectangular frames are kept static; the position of stimulation block 1 within rectangular frame 1 varies over time as a sine wave at 0.1 Hz, the position of stimulation block 2 within rectangular frame 2 at 0.2 Hz, and the position of stimulation block 3 within rectangular frame 3 at 0.3 Hz. Meanwhile, the gray values of stimulation blocks 1, 2 and 3 vary over time as sine waves at 15 Hz.
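To illustrate why one shared luminance conversion frequency plus distinct motion frequencies suffices, the first-order intermodulation pairs of the three example targets can be computed (a sketch; the dictionary names are assumptions):

```python
# One shared 15 Hz luminance conversion frequency, distinct motion frequencies.
f_lum = 15.0
motion = {"stimulation_target_1": 0.1,
          "stimulation_target_2": 0.2,
          "stimulation_target_3": 0.3}

# First-order intermodulation pair (f_lum - f_mot, f_lum + f_mot) per target:
# target 1 -> (14.9, 15.1), target 2 -> (14.8, 15.2), target 3 -> (14.7, 15.3)
im_pairs = {name: (round(f_lum - f_mot, 1), round(f_lum + f_mot, 1))
            for name, f_mot in motion.items()}

# Each pair is unique, so each target remains distinguishable.
assert len(set(im_pairs.values())) == len(im_pairs)
```

With only one luminance conversion frequency, the three targets would be indistinguishable; the distinct motion frequencies shift each target's intermodulation components apart.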
It should be emphasized that these first luminance transformation frequencies and first motion frequencies are illustrative examples rather than required values; the specific frequencies should be chosen according to the experimental requirements, and the embodiments of the present application are not limited in this respect.
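The configuration described above can be sketched as follows. This is an illustrative sketch only; the names and the 15 Hz / 0.1–0.3 Hz values mirror the example above and are not required by the embodiments.

```python
# All candidate stimulation targets share one first luminance transformation
# frequency; each receives its own first motion frequency.
SHARED_LUMINANCE_FREQ_HZ = 15.0

candidate_targets = {
    "stimulation_target_1": {"luminance_freq_hz": SHARED_LUMINANCE_FREQ_HZ, "motion_freq_hz": 0.1},
    "stimulation_target_2": {"luminance_freq_hz": SHARED_LUMINANCE_FREQ_HZ, "motion_freq_hz": 0.2},
    "stimulation_target_3": {"luminance_freq_hz": SHARED_LUMINANCE_FREQ_HZ, "motion_freq_hz": 0.3},
}

# One shared luminance transformation frequency; three distinct motion frequencies.
assert len({t["luminance_freq_hz"] for t in candidate_targets.values()}) == 1
assert len({t["motion_freq_hz"] for t in candidate_targets.values()}) == 3
```

Because the luminance frequency is shared, the targets can only be told apart through their motion frequencies, which is what motivates the intermodulation-component encoding discussed below.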
In some embodiments, prior to step S101, a prompt block is displayed on the display screen at a position where the candidate stimulation target is to be displayed, for a preset prompt duration, prompting the user to gaze at the candidate stimulation target to be displayed.
As an example, candidate stimulation targets to be displayed include: a stimulus target 1, a stimulus target 2, and a stimulus target 3. If the stimulation target 1 needs to be gazed by the user, displaying a prompt block at a position where the stimulation target 1 is to be displayed on the display screen within a preset prompt duration, and prompting the user to gaze at the stimulation target 1 to be displayed.
In some embodiments, step S101 may include: displaying a prompt block at the position on the display screen where a candidate stimulation target is to be displayed, prompting the user to gaze at that candidate stimulation target; starting a stimulation task; displaying the candidate stimulation targets on the display screen at the first luminance transformation frequency and the first motion frequency within a preset task duration; and ending the stimulation task in response to the display duration of the candidate stimulation targets reaching the preset task duration.
Step S102, in response to the user gazing at a candidate stimulation target, acquiring the electroencephalogram signal generated while the user gazes at that candidate stimulation target.
In some embodiments, the user gazing at a candidate stimulation target may include: the user gazes at one of N candidate stimulation targets, where the N rectangular frames of the N candidate stimulation targets remain stationary, the positions of the N stimulation blocks within the N rectangular frames vary as sine waves at a plurality of different first motion frequencies, and meanwhile the gray values of the stimulation blocks vary as a sine wave at the same first luminance transformation frequency, N being a positive integer. The electroencephalogram signal may include steady-state visual evoked potentials at specific intermodulation frequency components. The electroencephalogram signal may also include other signals or potentials; the embodiments of the present application are not limited in this respect.
In some embodiments, in response to the user gazing at the candidate stimulation target, acquiring the brain electrical signal while the user gazed at the candidate stimulation target may include: generating an electroencephalogram signal in response to a user gazing at the candidate stimulation target; determining specific intermodulation frequency components corresponding to candidate stimulation targets based on the first brightness transformation frequency and the first motion frequency, wherein the specific intermodulation frequency components corresponding to each candidate stimulation target are different in N candidate stimulation targets, and N is a positive integer; and acquiring an electroencephalogram signal generated when the user gazes at the candidate stimulation target.
Regarding generating the electroencephalogram signal in response to the user gazing at a candidate stimulation target: in a specific implementation, the user gazes at one of N candidate stimulation targets, where the N rectangular frames remain stationary, the positions of the N stimulation blocks within them vary as sine waves at a plurality of different first motion frequencies, and meanwhile the gray values of the stimulation blocks vary as a sine wave at the same first luminance transformation frequency, N being a positive integer. Gazing at one of the N candidate stimulation targets, the user perceives the change of the stimulation block's position relative to the rectangular frame and the change of its gray value, generating an electroencephalogram signal corresponding to the gazed candidate stimulation target, which may include steady-state visual evoked potentials at the specific intermodulation frequency components corresponding to that target.
In an implementation, as shown in fig. 2, the process of determining the specific intermodulation frequency components corresponding to the candidate stimulation targets based on the first luminance transformation frequency and the first motion frequency may include: step S1, multiplying the first luminance transformation frequency by its harmonic number to obtain integer-multiple harmonics of the first luminance transformation frequency; step S2, multiplying the first motion frequency by its harmonic number to obtain integer-multiple harmonics of the first motion frequency, where the harmonic number of the first luminance transformation frequency and the harmonic number of the first motion frequency are positive integers; and step S3, adding or subtracting the integer-multiple harmonics of the first luminance transformation frequency and the integer-multiple harmonics of the first motion frequency to determine the specific intermodulation frequency components.
Through steps S1 to S3, a specific intermodulation frequency component is obtained, which can be expressed by the following formula (1):
m × F0 ± n × Fi (i = 1, 2, 3, ..., j)    (1)

where m is the harmonic number of the first luminance transformation frequency; F0 is the first luminance transformation frequency, which may also be called the fundamental frequency of the first luminance transformation frequency; m × F0 is an integer-multiple harmonic of the first luminance transformation frequency; n is the harmonic number of the first motion frequency; Fi is the first motion frequency, which may also be called the fundamental frequency of the first motion frequency; n × Fi is an integer-multiple harmonic of the first motion frequency; j is the number of candidate stimulation targets; and m, n, and i are positive integers.
For example, take j = 2 and i = 1: the first luminance transformation frequency of candidate stimulation target 1 is 15 Hz with harmonic number 2, and its first motion frequency is 0.4 Hz with harmonic number 2. Since both harmonic numbers are 2, when computing the specific intermodulation frequency components according to formula (1), m takes 1 or 2 and n takes 1 or 2, giving four combinations (m = 1, n = 1; m = 1, n = 2; m = 2, n = 1; m = 2, n = 2) and hence 8 specific intermodulation frequency components: 14.6, 14.2, 15.4, 15.8, 29.6, 29.2, 30.4, and 30.8 Hz.
For i = 2, the first luminance transformation frequency of candidate stimulation target 2 is 15 Hz with harmonic number 2, and its first motion frequency is 0.5 Hz with harmonic number 2. Likewise, m takes 1 or 2 and n takes 1 or 2 across the same four combinations, and formula (1) yields 8 specific intermodulation frequency components: 14.5, 14.0, 15.5, 16.0, 29.5, 29.0, 30.5, and 31.0 Hz.
In this way, the specific intermodulation frequency components corresponding to each of the candidate stimulus target 1 and the candidate stimulus target 2 are different.
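The two worked examples of formula (1) can be reproduced with a short sketch. The function and parameter names here are illustrative, not from the text.

```python
def intermodulation_frequencies(f_lum, f_motion, m_max=2, n_max=2):
    """Sorted components m*f_lum ± n*f_motion for 1 <= m <= m_max, 1 <= n <= n_max."""
    freqs = set()
    for m in range(1, m_max + 1):
        for n in range(1, n_max + 1):
            freqs.add(round(m * f_lum + n * f_motion, 6))
            freqs.add(round(m * f_lum - n * f_motion, 6))
    return sorted(freqs)

# Candidate stimulation target 1: F0 = 15 Hz, Fi = 0.4 Hz, both harmonic numbers 2
print(intermodulation_frequencies(15.0, 0.4))
# → [14.2, 14.6, 15.4, 15.8, 29.2, 29.6, 30.4, 30.8]
# Candidate stimulation target 2: F0 = 15 Hz, Fi = 0.5 Hz, both harmonic numbers 2
print(intermodulation_frequencies(15.0, 0.5))
# → [14.0, 14.5, 15.5, 16.0, 29.0, 29.5, 30.5, 31.0]
```

The two component sets match the values listed above and do not coincide, which is what allows the two candidate targets to be distinguished.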
Regarding acquiring the electroencephalogram signal generated while the user gazes at a candidate stimulation target: in a specific implementation, the signal is acquired in real time by an electroencephalogram acquisition device, and may include steady-state visual evoked potentials at the specific intermodulation frequency components corresponding to the gazed candidate stimulation target.
Step S103, determining the target stimulation target gazed at by the user based on the electroencephalogram signal.
In some embodiments, the target stimulation target may include the candidate stimulation target with the largest correlation of specific intermodulation frequency components. The target stimulation target may be the same as the candidate stimulation target gazed at by the user, or it may differ from it.
In some embodiments, determining the target stimulation target gazed at by the user based on the electroencephalogram signal may include: preprocessing the electroencephalogram signal to determine the steady-state visual evoked potential electroencephalogram signal; performing feature extraction on the steady-state visual evoked potential electroencephalogram signal to determine the specific intermodulation frequency components corresponding to the target stimulation target in that signal; and determining the target stimulation target gazed at by the user based on the specific intermodulation frequency components corresponding to the target stimulation target and those corresponding to the candidate stimulation targets.
For preprocessing an electroencephalogram signal and determining a steady-state visual evoked potential electroencephalogram signal, in specific implementation, firstly, time trigger signals for starting a stimulation task and ending the stimulation task are obtained, and the electroencephalogram signal is divided into the steady-state visual evoked potential electroencephalogram signals corresponding to a plurality of candidate stimulation targets according to the time trigger signals for starting the stimulation task and ending the stimulation task.
Regarding feature extraction on the steady-state visual evoked potential electroencephalogram signal and determining the specific intermodulation frequency components corresponding to the target stimulation target: in a specific implementation, feature extraction is performed on the steady-state visual evoked potential based on filter bank canonical correlation analysis, and the specific intermodulation frequency components corresponding to the target stimulation target in the steady-state visual evoked potential electroencephalogram signal are determined.
Regarding determining the target stimulation target gazed at by the user based on the specific intermodulation frequency components corresponding to the target stimulation target and those corresponding to the candidate stimulation targets: in a specific implementation, the specific intermodulation frequency components corresponding to the target stimulation target are compared with those corresponding to each candidate stimulation target to determine the candidate stimulation target with the largest correlation of specific intermodulation frequency components; that candidate stimulation target is then determined to be the target stimulation target gazed at by the user.
In some embodiments, a second processing flow diagram of the brain-computer interface data processing method is shown in fig. 3, and includes:
step S201, while keeping the first luminance transformation frequency of the candidate stimulus target unchanged, arranges a plurality of different first motion frequencies for each candidate stimulus target.
In step S202, a first luminance transformation frequency and a first motion frequency of each candidate stimulation target are determined.
Step S203, displaying the flickering stimulus block on the display screen at the determined first luminance conversion frequency.
Step S204, displaying the stimulation block moving in the rectangular frame on the display screen at the determined first motion frequency.
In some embodiments, before step S201, a prompt block is displayed on the display screen at a position where the candidate stimulation target is to be displayed for a preset prompt duration, prompting the user to gaze at the stimulation block to be displayed.
In a specific implementation, a prompt block is displayed at the position on the display screen where a candidate stimulation target is to be displayed, prompting the user to gaze at that candidate stimulation target, after which the stimulation task starts. Within a preset task duration, the first luminance transformation frequency of the stimulation blocks is kept unchanged and a plurality of different first motion frequencies are configured for the stimulation blocks. The first luminance transformation frequency and first motion frequency of each stimulation block are determined, and a rectangular frame containing a rectangular stimulation block is displayed on the display screen, with the rectangular frame delimiting the left-right range of the stimulation block's position. The rectangular frame remains stationary while the position of the stimulation block within it varies with time as a sine wave at the determined first motion frequency; meanwhile, the gray value of the stimulation block varies with time as a sine wave at the determined first luminance transformation frequency. The stimulation task ends in response to the display duration of the candidate stimulation target reaching the preset task duration.
In some embodiments, a processing flow diagram of the brain-computer interface data processing method three is shown in fig. 4, and includes:
in step S301, a plurality of different first motion frequencies are allocated to each candidate stimulus target, while keeping the first luminance transformation frequency of the candidate stimulus target unchanged.
In step S302, a first luminance transformation frequency and a first motion frequency of each candidate stimulation target are determined.
Step S303, displaying N candidate stimulation targets on the display screen at the determined first luminance transformation frequency and the determined first motion frequency simultaneously, where N is a positive integer.
In some embodiments, prior to step S301, a prompt block is displayed on the display screen at a location where a candidate stimulation target is to be displayed for a preset prompt duration, prompting the user to gaze at the stimulation block to be displayed.
In a specific implementation of steps S301 to S303, the candidate stimulation targets include stimulation target 1, stimulation target 2, and stimulation target 3. Stimulation target 1 comprises rectangular frame 1 containing stimulation block 1; stimulation target 2 comprises rectangular frame 2 containing stimulation block 2; stimulation target 3 comprises rectangular frame 3 containing stimulation block 3.
Displaying a prompt block at a position on a display screen where a candidate stimulation target is to be displayed, prompting a user to start a stimulation task after staring at the candidate stimulation target to be displayed, keeping first brightness conversion frequencies of 15Hz of a stimulation block 1, a stimulation block 2 and a stimulation block 3 unchanged within a preset task time length, configuring a first motion frequency of 0.1Hz for the stimulation block 1, configuring a first motion frequency of 0.2Hz for the stimulation block 2 and configuring a first motion frequency of 0.3Hz for the stimulation block 3.
The first luminance transformation frequency and first motion frequency of stimulation blocks 1, 2, and 3 are determined. Rectangular frames 1, 2, and 3 are displayed on the display screen, containing stimulation blocks 1, 2, and 3 respectively. The rectangular frames remain stationary, while the position of stimulation block 1 within rectangular frame 1 moves left and right with time as a sine wave at 0.1 Hz, that of stimulation block 2 within rectangular frame 2 at 0.2 Hz, and that of stimulation block 3 within rectangular frame 3 at 0.3 Hz. Meanwhile, the gray values of stimulation blocks 1, 2, and 3 vary with time from 0 to 255 as a sine wave at 15 Hz, and the stimulation task ends in response to the display duration of the candidate stimulation targets reaching the preset task duration.
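The per-frame update just described can be sketched as below. The 60 Hz refresh rate matches the application scenario described later; the amplitude and phase conventions (block starting at the frame center, mid-gray at time zero) are illustrative assumptions rather than values fixed by the text.

```python
import math

REFRESH_HZ = 60.0  # assumed screen refresh rate

def stimulus_frame(frame_idx, motion_freq, lum_freq, max_offset_px):
    """Horizontal offset (pixels) and gray value (0-255) for one video frame."""
    t = frame_idx / REFRESH_HZ
    # Position within the rectangular frame: sine wave at the first motion frequency.
    offset = max_offset_px * math.sin(2 * math.pi * motion_freq * t)
    # Gray value sweeping 0-255: sine wave at the first luminance transformation frequency.
    gray = 127.5 * (1.0 + math.sin(2 * math.pi * lum_freq * t))
    return offset, gray

# At frame 0 the block sits at the center of its rectangular frame with a mid-gray value.
offset0, gray0 = stimulus_frame(0, motion_freq=0.1, lum_freq=15.0, max_offset_px=100)
```

A rendering loop would call `stimulus_frame` once per refresh for each stimulation block, passing that block's own first motion frequency and the shared luminance frequency.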
In some embodiments, a processing flow diagram of the brain-computer interface data processing method is four, as shown in fig. 5, including:
step S401, determining a specific intermodulation frequency component corresponding to the candidate stimulation target based on the first luminance transformation frequency and the first motion frequency.
And step S402, acquiring electroencephalogram signals generated when the user stares at the candidate stimulation target.
In some embodiments, first, the user gazes at one of the N candidate stimulation targets, where N rectangular frames of the N candidate stimulation targets remain stationary, the positions of the N stimulation blocks within the N rectangular frames vary according to a sine wave law at a plurality of different first motion frequencies, while the grayscale values of the stimulation blocks vary according to a sine wave law at the same first luminance transform frequency, N being a positive integer. And (3) staring at one candidate stimulation target in the N candidate stimulation targets by the user, sensing the change of the position of the stimulation block relative to the rectangular frame and the change of the gray value of the stimulation block, and generating an electroencephalogram signal corresponding to the stared candidate stimulation target.
And determining a specific intermodulation frequency component corresponding to the candidate stimulating target based on the first brightness conversion frequency and the first motion frequency, wherein the specific intermodulation frequency component corresponding to each candidate stimulating target is different in the N candidate stimulating targets, and N is a positive integer.
Based on an electroencephalogram acquisition device, the electroencephalogram signal generated while the user gazes at the candidate stimulation target is acquired in real time; the signal may include steady-state visual evoked potentials at the specific intermodulation frequency components corresponding to the gazed candidate stimulation target. The electroencephalogram acquisition device may include measuring electrodes placed at positions Pz, PO5, PO3, POz, PO4, PO6, O1, Oz, and O2 over the occipital region of the user's scalp, with the electrode placement following the international 10-20 system.
In some embodiments, a processing flow diagram of the brain-computer interface data processing method is five, as shown in fig. 6, including:
step S501, multiplies the first luminance conversion frequency by the harmonic number of the first luminance conversion frequency to obtain an integral multiple harmonic of the first luminance change frequency.
Step S502, multiplying the first motion frequency by the harmonic number of the first motion frequency to obtain integral multiple harmonic of the first motion frequency; the harmonic number of the first luminance transform frequency and the harmonic number of the first motion frequency are positive integers.
Step S503 is to add or subtract the integral multiple harmonics of the first luminance change frequency and the integral multiple harmonics of the first motion frequency to determine the specific intermodulation frequency component.
In some embodiments, the specific intermodulation frequency components corresponding to the candidate stimulus target are determined based on the first luminance transformation frequency and the first motion frequency, via steps S501-S503. Candidate stimulus targets are encoded by specific intermodulation frequency components generated in the steady-state visual evoked potentials by the first luminance transformation frequency and the first motion frequency.
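The encoding property above can be checked with a small sketch: with a shared 15 Hz first luminance transformation frequency and motion frequencies 0.1–0.9 Hz, as in the 9-target scenario described later, every candidate stimulation target maps to a distinct set of specific intermodulation frequency components. The function name and the harmonic-number choice of 2 are illustrative assumptions.

```python
def im_set(f_lum, f_motion, m_max=2, n_max=2):
    """Frozenset of m*f_lum ± n*f_motion components per formula (1)."""
    return frozenset(round(m * f_lum + s * n * f_motion, 6)
                     for m in range(1, m_max + 1)
                     for n in range(1, n_max + 1)
                     for s in (1, -1))

motion_freqs = [round(0.1 * i, 1) for i in range(1, 10)]   # 0.1 .. 0.9 Hz
component_sets = [im_set(15.0, f) for f in motion_freqs]

# No two candidate targets share the same component set, so the decoder
# can tell them apart even though the luminance frequency is identical.
assert len(set(component_sets)) == len(component_sets)
```

Individual components may still coincide between targets (e.g. 15 + 0.2 appears for both the 0.1 Hz target at n = 2 and the 0.2 Hz target at n = 1); it is the set as a whole that is unique.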
In some embodiments, a processing flow diagram of the brain-computer interface data processing method is shown as six, as shown in fig. 7, and includes:
step S601, preprocessing the electroencephalogram signals and determining steady-state visual evoked potential electroencephalogram signals.
As an example, the number of candidate stimulation targets is 9; in each stimulation task the user gazes at one of the 9 candidate stimulation targets, until all 9 have been gazed at, i.e., 9 stimulation tasks are performed. First, the time trigger signals for starting and ending each stimulation task are obtained, and according to these trigger signals the electroencephalogram signal is divided into steady-state visual evoked potential electroencephalogram signals corresponding to the 9 candidate stimulation targets.
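The segmentation step above can be sketched as follows. The array shapes, the 250 Hz sampling rate, and the trigger format are assumptions for illustration, not details fixed by the text.

```python
import numpy as np

def segment_trials(eeg, triggers):
    """Cut a continuous recording into one epoch per stimulation task.

    eeg: array of shape (n_channels, n_samples);
    triggers: list of (start, end) sample indices, one pair per task.
    """
    return [eeg[:, start:end] for start, end in triggers]

fs = 250                                    # assumed sampling rate in Hz
eeg = np.zeros((9, fs * 60))                # 9 channels, 60 s continuous recording
# 9 stimulation tasks: each task here occupies 6 s, of which 5 s are stimulation.
triggers = [(i * fs * 6, i * fs * 6 + fs * 5) for i in range(9)]
epochs = segment_trials(eeg, triggers)
assert len(epochs) == 9 and epochs[0].shape == (9, fs * 5)
```

Each resulting epoch is the steady-state visual evoked potential electroencephalogram signal of one candidate stimulation target, ready for feature extraction in step S602.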
Step S602, feature extraction is carried out on the steady-state visual evoked potential electroencephalogram signal, and specific intermodulation frequency components corresponding to a target stimulation target in the steady-state visual evoked potential electroencephalogram signal are determined.
Step S603, comparing the specific intermodulation frequency component corresponding to the target stimulus target with the specific intermodulation frequency component corresponding to the candidate stimulus target, and determining the candidate stimulus target having the largest correlation of the specific intermodulation frequency components.
And step S604, determining the candidate stimulation target with the maximum correlation of the specific intermodulation frequency components as the target stimulation target stared by the user.
In steps S602 to S604, in a specific implementation, first, feature extraction is performed on the steady-state visual evoked potential based on the FBCCA (Filter Bank Canonical Correlation Analysis) algorithm: the steady-state visual evoked potential electroencephalogram signal is decomposed into a plurality of subband signals using band-pass filters with different frequency ranges, and the specific intermodulation frequency components corresponding to the subband signals are determined.
Next, for each subband signal, the canonical correlation coefficient between the specific intermodulation frequency components of that subband signal and those of each candidate stimulation target is calculated, together with the weighting coefficient of that subband signal.
Finally, the square of the canonical correlation coefficient of each subband signal is multiplied by the corresponding weighting coefficient and the products are summed, yielding a feature value ρ used for target identification. The largest of the 9 ρ values is called the maximum correlation coefficient, and the candidate stimulation target corresponding to it is the candidate stimulation target with the largest correlation of specific intermodulation frequency components.
And determining the candidate stimulation target with the maximum correlation of the specific intermodulation frequency components as the target stimulation target of the gaze of the user.
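A rough numpy-only sketch of this FBCCA-style decision follows. The band edges, the subband weight formula w(k) = k^-1.25 + 0.25, and the use of sine/cosine reference signals at each candidate's specific intermodulation frequencies follow the common FBCCA formulation; they are assumptions for illustration, not values fixed by this text. An FFT-mask filter stands in for the band-pass filter bank.

```python
import numpy as np

def bandpass_fft(x, fs, lo, hi):
    """Zero all FFT bins outside [lo, hi] Hz (crude band-pass filter)."""
    X = np.fft.rfft(x, axis=1)
    f = np.fft.rfftfreq(x.shape[1], 1.0 / fs)
    X[:, (f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, n=x.shape[1], axis=1)

def cca_max_corr(X, Y):
    """Largest canonical correlation between row-signal matrices X and Y."""
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    Qx, _ = np.linalg.qr(X.T)
    Qy, _ = np.linalg.qr(Y.T)
    return float(np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0])

def references(freqs, fs, n_samples):
    """Sine/cosine reference signals at the candidate's intermodulation frequencies."""
    t = np.arange(n_samples) / fs
    rows = []
    for f in freqs:
        rows += [np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)]
    return np.array(rows)

def fbcca_score(eeg, freqs, fs, bands=((6, 40), (14, 40), (22, 40))):
    """Weighted sum of squared canonical correlations over the filter bank."""
    score = 0.0
    for k, (lo, hi) in enumerate(bands, start=1):
        sub = bandpass_fft(eeg, fs, lo, hi)
        rho_k = cca_max_corr(sub, references(freqs, fs, eeg.shape[1]))
        score += (k ** -1.25 + 0.25) * rho_k ** 2
    return score

# The target stimulation target is then the candidate whose
# intermodulation-frequency set yields the largest fbcca_score.
```

Scoring each of the 9 candidates' component sets this way and taking the argmax implements the "largest correlation of specific intermodulation frequency components" rule of steps S603 and S604.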
Fig. 8 is a diagram illustrating an application scenario of the brain-computer interface data processing method according to the embodiment of the present application.
Referring to fig. 8, an application scenario of the brain-computer interface data processing method provided in the embodiment of the present application is applied to a stimulation interface for displaying candidate stimulation targets in a display screen, where a screen refresh rate of the display screen is 60Hz, a screen resolution of the display screen is 1920 pixels by 1080 pixels, a background color of the stimulation interface is black, the stimulation interface includes 9 candidate stimulation targets, and 9 rectangular frames corresponding to the 9 candidate stimulation targets are arranged in a matrix form of 3 × 3. The specific arrangement mode of the rectangular frame is as follows: uniformly distributing and displaying 9 rectangular frames in the display screen, wherein the rectangular frames are separated by a preset distance, and the rectangular frames positioned on the periphery of the matrix are separated from the edge of the display screen by a preset blank distance; the left and right position variation range of the moving stimulation block is defined by a rectangular frame, the height of the rectangular frame is the same as the height of the stimulation block, and the width of the rectangular frame is significantly larger than the width of the stimulation block, so that a user can fully sense the movement of the stimulation block relative to the rectangular frame.
It can be understood that the application scenario of the brain-computer interface data processing method in fig. 8 is only a partial exemplary implementation manner in the embodiment of the present application, and the application scenario of the brain-computer interface data processing method in the embodiment of the present application includes, but is not limited to, the application scenario of the brain-computer interface data processing method shown in fig. 8.
Fig. 9 is a diagram illustrating another application scenario of the brain-computer interface data processing method according to the embodiment of the present application.
Referring to fig. 9, another application scenario of the brain-computer interface data processing method provided in the embodiment of the present application is applied to a stimulation interface in a display screen within a preset prompt duration. Displaying a prompt block at the center position of a random one of the 9 rectangular frames within a preset prompt duration, prompting a user to gaze at a stimulation block to be displayed at the center position, wherein the preset prompt duration is preferably 1s, and the specific prompt duration is not limited in the embodiment of the application. The height and the width of the prompt block are preferably the same as those of the stimulation block, the prompt block is highlighted in highlight color for highlighting the prompt block in the stimulation interface, and the embodiment of the application does not limit the height, the width and the highlight color of the specific prompt block. Fig. 9 shows that within the preset prompt duration, the rectangular boxes are arranged in a 3 × 3 matrix form, wherein in the center position of the rectangular box in the third row and the third column, the prompt block is highlighted in a highlight color.
It can be understood that the application scenario of the brain-computer interface data processing method in fig. 9 is only a partial exemplary implementation manner in the embodiment of the present application, and the application scenario of the brain-computer interface data processing method in the embodiment of the present application includes, but is not limited to, the application scenario of the brain-computer interface data processing method shown in fig. 9.
Fig. 10 is a diagram illustrating another application scenario of the brain-computer interface data processing method according to the embodiment of the present application.
Referring to fig. 10, another application scenario of the brain-computer interface data processing method provided in the embodiment of the present application is applied to a stimulation interface in a display screen within a preset task duration. The 9 stimulation blocks are assigned different first motion frequencies, and all 9 stimulation blocks are assigned the same first luminance transformation frequency. For example, stimulation block 1 is configured with a first motion frequency of 0.1 Hz, stimulation block 2 with 0.2 Hz, stimulation block 3 with 0.3 Hz, stimulation block 4 with 0.4 Hz, stimulation block 5 with 0.5 Hz, stimulation block 6 with 0.6 Hz, stimulation block 7 with 0.7 Hz, stimulation block 8 with 0.8 Hz, and stimulation block 9 with 0.9 Hz, while stimulation blocks 1-9 are each configured with the same first luminance transformation frequency of 15 Hz. The width of the stimulation block enables the user to fully perceive its flicker and yields a clear first luminance transformation frequency and its harmonic signals in the electroencephalogram signal.
Within the preset task duration, the 9 rectangular frames are kept static, the positions of the 9 stimulation blocks in the 9 rectangular frames respectively change along with the time according to a sine wave rule by using different first motion frequencies configured for the 9 stimulation blocks, and meanwhile, the gray values of the 9 stimulation blocks change along with the time according to the sine wave rule by using a first brightness conversion frequency of 15 Hz. Fig. 10 shows the positions of the stimulation blocks relative to the rectangular frame when the gray-scale values of the 9 stimulation blocks are 0 during the preset task time, wherein the stimulation blocks move left and right within the rectangular frame from the center of the rectangular frame as the starting position.
It is understood that the application scenario of the brain-computer interface data processing method in fig. 10 is only a partial exemplary implementation manner in the embodiment of the present application, and the application scenario of the brain-computer interface data processing method in the embodiment of the present application includes, but is not limited to, the application scenario of the brain-computer interface data processing method shown in fig. 10.
Fig. 11 shows a further application scenario diagram of the brain-computer interface data processing method according to the embodiment of the present application.
Referring to fig. 11, a further application scenario of the brain-computer interface data processing method provided in the embodiment of the present application is applied to the presentation of a stimulation target based on a steady-state visual evoked potential with luminance and motion joint coding. F in fig. 11 denotes the first luminance transformation frequency; the sinusoidal waveform corresponding to F indicates that the gray value of the stimulation block varies with time at the first luminance transformation frequency according to a sinusoidal law. f1 denotes the first motion frequency; the sinusoidal waveform corresponding to f1 indicates that the position of the stimulation block within the rectangular frame varies with time at the first motion frequency according to a sine-wave rule. The six rectangular frames arranged from top to bottom in time on the left side of fig. 11, together with the stimulation blocks inside them, show that the position of the same stimulation block within its rectangular frame shifts left and right over time at the first motion frequency according to a sine-wave rule, while the gray value of the stimulation block varies with time at the first luminance transformation frequency according to the same rule.
In some embodiments, the procedure of the stimulation paradigm for steady-state visual evoked potential presentation based on luminance and motion joint coding is as follows: after the stimulation program starts running, 9 rectangular frames are first displayed in the stimulation interface of the display screen; within a preset prompt duration, a red prompt block is displayed at the center of a randomly selected one of the 9 rectangular frames, prompting the user to gaze at the stimulation block to be displayed at that center; the preset prompt duration is preferably 1 s.
After the prompt ends, the stimulation task starts. Within the preset task duration, the 9 rectangular frames remain static, the positions of the 9 stimulation blocks within their rectangular frames vary with time according to a sine-wave rule at the different first motion frequencies configured for them, and the gray values of the 9 stimulation blocks vary with time according to a sine-wave rule at the first luminance transformation frequency of 15 Hz; in response to the display duration of the candidate stimulation target reaching the preset task duration, the stimulation task ends. The preset task duration is preferably 4 s. At this point, the stimulation test of one candidate stimulation target is complete; by analogy, the 9 candidate stimulation targets complete their stimulation tests in random order, forming one stimulation test group.
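The trial structure just described can be sketched as follows. The helper name `run_block` is hypothetical, and the display calls themselves are placeholders; only the cue/task timing, the frequency assignments, and the random ordering follow the text.

```python
import random

CUE_S, TASK_S = 1.0, 4.0  # preferred prompt and task durations from the paradigm
MOTION_FREQS = {block: round(0.1 * block, 1) for block in range(1, 10)}  # 0.1-0.9 Hz
LUM_FREQ = 15.0           # shared first luminance transformation frequency, Hz

def run_block(order=None):
    """One stimulation test group: each of the 9 candidate targets is cued
    and presented once, in random order; returns the per-trial parameters."""
    order = order or random.sample(sorted(MOTION_FREQS), k=9)
    trials = []
    for target in order:
        # cue phase: red prompt block at the target's frame for CUE_S seconds
        # task phase: all 9 blocks move and flicker for TASK_S seconds
        trials.append((target, MOTION_FREQS[target], LUM_FREQ))
    return trials
```

Each returned tuple pairs a target index with its distinct motion frequency and the shared 15 Hz luminance frequency, which is what makes the targets separable downstream.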
It is understood that the application scenario of the brain-computer interface data processing method in fig. 11 is only a partial exemplary implementation manner in the embodiment of the present application, and the application scenario of the brain-computer interface data processing method in the embodiment of the present application includes, but is not limited to, the application scenario of the brain-computer interface data processing method shown in fig. 11.
According to the method of the embodiment of the present application, the first luminance transformation frequency of the candidate stimulation targets is kept unchanged, and a plurality of different first motion frequencies are configured, one for each candidate stimulation target; the first luminance transformation frequency and the first motion frequency of each candidate stimulation target are determined, and the candidate stimulation targets are displayed on the display screen at the first luminance transformation frequency and the first motion frequency. In this way, luminance and motion joint coding realizes the coding of different stimulation targets with only a few luminance transformation frequencies, increasing the number of stimulation targets that can be coded in brain-computer interface data processing. The method further determines a specific intermodulation frequency component corresponding to each candidate stimulation target based on the first luminance transformation frequency and the first motion frequency; among the N candidate stimulation targets (N a positive integer), the specific intermodulation frequency component corresponding to each candidate stimulation target is different, so that the intermodulation frequency components generated by the luminance transformation frequency and the motion frequency in the steady-state visual evoked potential can be used to code the stimulation targets, with a distinct code for each target. The method preprocesses the electroencephalogram signal to determine the steady-state visual evoked potential electroencephalogram signal, performs feature extraction on the steady-state visual evoked potential electroencephalogram signal to determine the specific intermodulation frequency component corresponding to the target stimulation target, and determines the target stimulation target gazed at by the user based on the specific intermodulation frequency component corresponding to the target stimulation target and those corresponding to the candidate stimulation targets. This increases the number of stimulation targets that can be coded in brain-computer interface data processing, optimizes the performance of the brain-computer interface, and raises the brain-computer interface toward a practical level.
Therefore, compared with the related art, in which brain-computer interface data processing is limited by the number of available luminance transformation frequencies and can therefore encode only a small number of stimulation targets, the brain-computer interface data processing method of the present application encodes stimulation targets with intermodulation frequencies: by introducing a plurality of different motion frequencies, different stimulation targets can be encoded using only one luminance transformation frequency combined with a plurality of different motion frequencies, thereby increasing the number of stimulation targets that can be encoded in brain-computer interface data processing.
Continuing with the exemplary structure of the brain-computer interface data processing apparatus 90 provided in the embodiment of the present application as software modules, in some embodiments, as shown in fig. 12, the software modules in the brain-computer interface data processing apparatus 90 may include: a display module 901, configured to display candidate stimulation targets on a display screen at a first luminance transformation frequency and a first motion frequency; an obtaining module 902, configured to, in response to a user gazing at a candidate stimulation target, obtain an electroencephalogram signal when the user gazes at the candidate stimulation target; and the determining module 903 is used for determining the target stimulation target of the user gazing based on the electroencephalogram signals.
In some embodiments, the candidate stimulation targets comprise rectangular boxes, including rectangular stimulation blocks within the rectangular boxes; the display module 901 is specifically configured to, during the displaying of the candidate stimulation targets on the display screen at the first luminance transformation frequency and the first motion frequency: displaying the flickering stimulus block on a display screen at a first luminance conversion frequency; the stimulation block moving within the rectangular frame is displayed on the display screen at a first motion frequency.
In some embodiments, the display module 901 is further specifically configured, in displaying the candidate stimulation targets on the display screen at the first luminance transformation frequency and the first motion frequency, to: and simultaneously displaying N candidate stimulation targets on a display screen at a first brightness conversion frequency and a first motion frequency, wherein N is a positive integer.
In some embodiments, the display module 901 is specifically configured to, during the displaying of the candidate stimulation targets on the display screen at the first luminance transformation frequency and the first motion frequency: configure a plurality of different first motion frequencies, one for each candidate stimulation target, while keeping the first luminance transformation frequency of the candidate stimulation targets unchanged; determine the first luminance transformation frequency and the first motion frequency of each candidate stimulation target, and display the candidate stimulation targets on the display screen at the first luminance transformation frequency and the first motion frequency.
In some embodiments, responsive to a user gazing at the candidate stimulation target, generating an electroencephalogram signal, the electroencephalogram signal including a steady-state visual evoked potential for a particular intermodulation frequency component; the obtaining module 902 is specifically configured to, in response to the user gazing at the candidate stimulation target, obtain the electroencephalogram signal when the user gazes at the candidate stimulation target: determining a specific intermodulation frequency component corresponding to the candidate stimulating target based on the first brightness conversion frequency and the first motion frequency; in the N candidate stimulation targets, the specific intermodulation frequency components corresponding to each candidate stimulation target are different, wherein N is a positive integer; and acquiring an electroencephalogram signal generated when the user stares at the candidate stimulation target.
In some embodiments, the obtaining module 902 is specifically configured to, in determining the specific intermodulation frequency components corresponding to the candidate stimulation target based on the first luminance transformation frequency and the first motion frequency: multiply the first luminance transformation frequency by the harmonic number of the first luminance transformation frequency to obtain integer-multiple harmonics of the first luminance transformation frequency; multiply the first motion frequency by the harmonic number of the first motion frequency to obtain integer-multiple harmonics of the first motion frequency, the harmonic number of the first luminance transformation frequency and the harmonic number of the first motion frequency being positive integers; and add or subtract the integer-multiple harmonics of the first luminance transformation frequency and the integer-multiple harmonics of the first motion frequency to determine the specific intermodulation frequency components.
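The harmonic-and-intermodulation rule in this module (integer-multiple harmonics of the luminance transformation frequency added to or subtracted from integer-multiple harmonics of the motion frequency) can be sketched as follows; the harmonic-order cutoffs `n_max` and `m_max` are illustrative assumptions, since the text only requires the harmonic numbers to be positive integers.

```python
def intermodulation_components(f_lum, f_motion, n_max=3, m_max=3):
    """Enumerate candidate intermodulation frequencies n*F +/- m*f1,
    with harmonic numbers n, m positive integers, keeping only
    positive resulting frequencies."""
    comps = set()
    for n in range(1, n_max + 1):
        for m in range(1, m_max + 1):
            comps.add(round(n * f_lum + m * f_motion, 6))   # sum component
            diff = n * f_lum - m * f_motion
            if diff > 0:
                comps.add(round(diff, 6))                   # difference component
    return sorted(comps)
```

With the scenario's 15 Hz luminance frequency, a block moving at 0.5 Hz yields, at first order, the sidebands 14.5 Hz and 15.5 Hz around the flicker frequency; because each candidate has a distinct motion frequency, each candidate's sideband set is distinct.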
In some embodiments, the determining module 903 is specifically configured to, in determining the target stimulation target of the user's gaze based on the brain electrical signal: preprocessing the electroencephalogram signals, and determining steady-state visual evoked potential electroencephalogram signals; performing feature extraction on the steady-state visual evoked potential electroencephalogram signal, and determining a specific intermodulation frequency component corresponding to a target stimulus in the steady-state visual evoked potential electroencephalogram signal; and determining the target stimulation target gazed by the user based on the specific intermodulation frequency component corresponding to the target stimulation target and the specific intermodulation frequency component corresponding to the candidate stimulation target.
In some embodiments, the determining module 903 is specifically configured to, in determining the target stimulation target at which the user gazes based on the specific intermodulation frequency components corresponding to the target stimulation target and the specific intermodulation frequency components corresponding to the candidate stimulation targets: compare the specific intermodulation frequency component corresponding to the target stimulation target with the specific intermodulation frequency components corresponding to the candidate stimulation targets, and determine the candidate stimulation target with the maximum correlation of the specific intermodulation frequency components; and determine that candidate stimulation target as the target stimulation target gazed at by the user.
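Under the assumption that feature extraction has already produced one feature vector from the steady-state visual evoked potential electroencephalogram signal and one template per candidate (the embodiment does not fix the feature type here, so the vectors below are placeholders), the maximum-correlation decision can be sketched as:

```python
import numpy as np

def pick_target(feature_vec, candidate_templates):
    """Return the candidate whose intermodulation-frequency template is most
    correlated (Pearson) with the features extracted from the EEG signal."""
    best, best_r = None, -np.inf
    for cand_id, template in candidate_templates.items():
        r = np.corrcoef(feature_vec, template)[0, 1]  # off-diagonal entry is r
        if r > best_r:
            best, best_r = cand_id, r
    return best, best_r
```

The candidate returned is taken as the target stimulation target gazed at by the user, mirroring the argmax-over-correlation rule stated above.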
It should be noted that the description of the apparatus in the embodiment of the present application is similar to that of the method embodiments above and has similar beneficial effects, so it is not repeated. Technical details not exhausted in the description of the brain-computer interface data processing apparatus provided in the embodiment of the present application can be understood from the description of any one of fig. 1 to 12.
The present application also provides an electronic device and a non-transitory computer readable storage medium according to embodiments of the present application.
FIG. 13 shows a schematic block diagram of an example electronic device 800 that may be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular telephones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 13, the electronic apparatus 800 includes a computing unit 801 that can perform various appropriate actions and processes in accordance with a computer program stored in a Read Only Memory (ROM) 802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the electronic apparatus 800 can also be stored. The calculation unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
A number of components in the electronic device 800 are connected to the I/O interface 805, including: an input unit 806 such as a keyboard, a mouse, or the like; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, or the like; and a communication unit 809 such as a network card, modem, wireless communication transceiver, etc. The communication unit 809 allows the electronic device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Computing unit 801 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and the like. The computing unit 801 executes the respective methods and processes described above, such as the brain-computer interface data processing method. For example, in some embodiments, the brain-computer interface data processing method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program can be loaded and/or installed onto the electronic device 800 via the ROM 802 and/or the communication unit 809. When the computer program is loaded into the RAM 803 and executed by the computing unit 801, one or more steps of the brain-computer interface data processing method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the brain-computer interface data processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present application may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, without limitation, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A method for processing brain-computer interface data, the method comprising:
configuring a plurality of different first motion frequencies for each candidate stimulation target while keeping a first luminance transformation frequency of the candidate stimulation target unchanged;
determining the first brightness transition frequency and the first motion frequency for each of the candidate stimulation targets, displaying the candidate stimulation targets on a display screen at the first brightness transition frequency and the first motion frequency;
responding to the gaze of the user on the candidate stimulation target, and acquiring electroencephalograms when the user gazes on the candidate stimulation target;
and determining a target stimulation target of the user gazing based on the electroencephalogram signal.
2. The method of claim 1, wherein said displaying the candidate stimulation targets on a display screen at the first luminance transition frequency and the first motion frequency comprises:
the candidate stimulation target comprises a rectangular frame, and a rectangular stimulation block is included in the rectangular frame;
displaying the flickering stimulus block on a display screen at the first luminance conversion frequency;
displaying the stimulation block moving within the rectangular frame on a display screen at the first motion frequency.
3. The method of claim 2, wherein said displaying said candidate stimulation targets on a display screen at said first brightness transition frequency and said first motion frequency further comprises:
displaying N candidate stimulation targets on the display screen simultaneously at the first luminance transformation frequency and the first motion frequency, wherein N is a positive integer.
4. The method of claim 2 or 3, wherein the acquiring, in response to the user gazing at the candidate stimulation target, the electroencephalogram signal when the user gazes at the candidate stimulation target comprises:
responsive to the user gazing at the candidate stimulation target, generating the brain electrical signal, the brain electrical signal comprising steady-state visual evoked potentials for particular intermodulation frequency components;
determining a specific intermodulation frequency component corresponding to the candidate stimulation target based on the first luminance transformation frequency and the first motion frequency; wherein, in the N candidate stimulation targets, the specific intermodulation frequency component corresponding to each candidate stimulation target is different, where N is a positive integer;
and acquiring the electroencephalogram signals generated when the user gazes at the candidate stimulation target.
5. The method of claim 4, wherein the determining the particular intermodulation frequency components corresponding to the candidate stimulation target based on the first luminance transformation frequency and the first motion frequency comprises:
multiplying the first luminance conversion frequency by the harmonic number of the first luminance conversion frequency to obtain an integral multiple harmonic of the first luminance conversion frequency;
multiplying the first motion frequency by the harmonic number of the first motion frequency to obtain integral multiple harmonics of the first motion frequency; the harmonic number of the first luminance transform frequency and the harmonic number of the first motion frequency are positive integers;
adding or subtracting the integral multiple harmonics of the first luminance conversion frequency and the integral multiple harmonics of the first motion frequency to determine the particular intermodulation frequency component.
6. The method of claim 5, wherein determining the target stimulation target for the user's gaze based on the brain electrical signal comprises:
preprocessing the electroencephalogram signals, and determining steady-state visual evoked potential electroencephalogram signals;
performing feature extraction on the steady-state visual evoked potential electroencephalogram signal, and determining a specific intermodulation frequency component corresponding to the target stimulation target in the steady-state visual evoked potential electroencephalogram signal;
determining the target stimulation target gazed by the user based on a specific intermodulation frequency component corresponding to the target stimulation target and a specific intermodulation frequency component corresponding to the candidate stimulation target.
7. The method of claim 6, wherein the determining the target stimulation target at which the user is gazing based on a particular intermodulation frequency component to which the target stimulation target corresponds and a particular intermodulation frequency component to which the candidate stimulation target corresponds comprises:
comparing a specific intermodulation frequency component corresponding to the target stimulation target with a specific intermodulation frequency component corresponding to the candidate stimulation target, and determining the candidate stimulation target with the maximum correlation of the specific intermodulation frequency component;
determining the candidate stimulation target with the largest correlation of the specific intermodulation frequency components as the target stimulation target gazed at by the user.
8. A brain-computer interface data processing apparatus, comprising:
the display module is used for keeping a first brightness conversion frequency of candidate stimulation targets unchanged and configuring a plurality of different first motion frequencies for each candidate stimulation target;
determining the first brightness transition frequency and the first motion frequency for each of the candidate stimulation targets, displaying the candidate stimulation targets on a display screen at the first brightness transition frequency and the first motion frequency;
the acquisition module is used for responding to the gaze of the user on the candidate stimulation target and acquiring the electroencephalogram signals when the user gazes on the candidate stimulation target;
and the determining module is used for determining the target stimulation target stared by the user based on the electroencephalogram signals.
9. The apparatus of claim 8, wherein the candidate stimulation target comprises a rectangular box including a rectangular stimulation block therein; the display module is used for:
displaying the flickering stimulus block on a display screen at the first luminance conversion frequency;
displaying the stimulation block moving within the rectangular frame on a display screen at the first motion frequency.
10. The apparatus of claim 9, wherein the display module is further configured to:
displaying N candidate stimulation targets on the display screen simultaneously at the first luminance transformation frequency and the first motion frequency, wherein N is a positive integer.
11. The apparatus of claim 9 or 10, wherein the brain electrical signal is generated in response to the user gazing at the candidate stimulation target, the brain electrical signal comprising a steady state visual evoked potential for a particular intermodulation frequency component; the acquisition module is configured to:
determining a specific intermodulation frequency component corresponding to the candidate stimulus target based on the first luminance transformation frequency and the first motion frequency; wherein, in the N candidate stimulation targets, the specific intermodulation frequency component corresponding to each candidate stimulation target is different, where N is a positive integer;
and acquiring the electroencephalogram signals generated when the user gazes at the candidate stimulation target.
12. The apparatus of claim 11, wherein the obtaining module is configured to:
multiplying the first luminance conversion frequency by the harmonic number of the first luminance conversion frequency to obtain an integral multiple harmonic of the first luminance conversion frequency;
multiplying the first motion frequency by the harmonic number of the first motion frequency to obtain integral multiple harmonics of the first motion frequency; the harmonic number of the first luminance conversion frequency and the harmonic number of the first motion frequency are positive integers;
and adding or subtracting the integral multiple harmonic of the first luminance conversion frequency and the integral multiple harmonic of the first motion frequency to determine the specific intermodulation frequency component.
13. The apparatus of claim 12, wherein the determining module is configured to:
preprocessing the electroencephalogram signals, and determining steady-state visual evoked potential electroencephalogram signals;
performing feature extraction on the steady-state visual evoked potential electroencephalogram signal, and determining a specific intermodulation frequency component corresponding to the target stimulation target in the steady-state visual evoked potential electroencephalogram signal;
determining the target stimulation target gazed by the user based on a specific intermodulation frequency component corresponding to the target stimulation target and a specific intermodulation frequency component corresponding to the candidate stimulation target.
14. The apparatus of claim 13, wherein the determining module is configured to:
comparing the specific intermodulation frequency component corresponding to the target stimulation target with the specific intermodulation frequency component corresponding to each candidate stimulation target, and determining the candidate stimulation target whose specific intermodulation frequency component has the greatest correlation;
and determining the candidate stimulation target whose specific intermodulation frequency component has the greatest correlation as the target stimulation target gazed at by the user.
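The maximum-correlation selection in claims 13–14 can be sketched as a template match: correlate the features extracted from the steady-state visual evoked potential signal against each candidate's intermodulation-frequency template and pick the best. This is a minimal sketch under assumed names and data shapes, not the patent's classifier.

```python
import numpy as np

def classify_gazed_target(observed_features, candidate_templates):
    """Return the index of the candidate stimulation target whose
    intermodulation-frequency template correlates most strongly with
    the observed EEG feature vector.

    observed_features: 1-D array of spectral features from the SSVEP signal.
    candidate_templates: list of 1-D arrays, one template per candidate.
    (Both are assumed representations; the patent does not fix them.)
    """
    correlations = [np.corrcoef(observed_features, template)[0, 1]
                    for template in candidate_templates]
    return int(np.argmax(correlations))
```

A toy call: with an observed vector close to the first candidate's template, the function returns index 0, i.e. that candidate is taken as the target the user gazed at.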
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-7.
CN202210882050.1A 2022-07-26 2022-07-26 Brain-computer interface data processing method and device, electronic equipment and storage medium Active CN115268747B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210882050.1A CN115268747B (en) 2022-07-26 2022-07-26 Brain-computer interface data processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115268747A CN115268747A (en) 2022-11-01
CN115268747B true CN115268747B (en) 2023-04-14

Family

ID=83769099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210882050.1A Active CN115268747B (en) 2022-07-26 2022-07-26 Brain-computer interface data processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115268747B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970273A (en) * 2014-05-09 2014-08-06 西安交通大学 Steady motion visual evoked potential brain computer interface method based on stochastic resonance enhancement
CN111012335A (en) * 2019-11-28 2020-04-17 燕山大学 Electroencephalogram intention decoding method based on nonnegative CP decomposition model
CN114115547A (en) * 2022-01-27 2022-03-01 中国医学科学院生物医学工程研究所 Target presentation method and device of hybrid brain-computer interface

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107748622A (en) * 2017-11-08 2018-03-02 中国医学科学院生物医学工程研究所 A kind of Steady State Visual Evoked Potential brain-machine interface method based on face perception
CN113515195A (en) * 2021-06-30 2021-10-19 杭州回车电子科技有限公司 Brain-computer interaction method and device based on SSVEP, electronic device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Jiajun Lin; Liyan Liang; Xu Han. Cross-target transfer algorithm based on the Volterra model of SSVEP-BCI. Tsinghua Science and Technology. 2021, pp. 505-522. *
Research progress on multimodal brain-computer interfaces incorporating steady-state visual evoked potentials; Chi Xinyi, Cui Hongyan, Chen Xiaogang; Chinese Journal of Biomedical Engineering; pp. 204-213 *


Similar Documents

Publication Publication Date Title
Chen et al. A high-ITR SSVEP-based BCI speller
Adini et al. Perceptual learning in contrast discrimination: the effect of contrast uncertainty
Zhang et al. A novel convolutional neural network model to remove muscle artifacts from EEG
CN105260025B (en) Steady State Visual Evoked Potential brain machine interface system based on mobile terminal
CN109388448A (en) Image display method, display system and computer readable storage medium
Dugué et al. Distinct perceptual rhythms for feature and conjunction searches
Long et al. Negative emotional state modulates visual working memory in the late consolidation phase
CN109817168B (en) Display control method and device
CN103092340A (en) Brain-computer interface (BCI) visual stimulation method and signal identification method
Naber et al. Tri-stable stimuli reveal interactions among subsequent percepts: Rivalry is biased by perceptual history
Ovalle Fresa et al. Training enhances fidelity of color representations in visual long-term memory
CN112971809A (en) Brain rhythm information detection method and device and electronic equipment
CN114847975A (en) Electroencephalogram data processing method, device, system, computer device and storage medium
CN103914677B (en) A kind of action identification method and device
CN115268747B (en) Brain-computer interface data processing method and device, electronic equipment and storage medium
Nikolaev et al. Intermittent regime of brain activity at the early, bias-guided stage of perceptual learning
CN114947886A (en) Symbol digital conversion testing method and system based on asynchronous brain-computer interface
Kreutzer et al. Attention modulates visual size adaptation
Bruhn et al. Degree of certainty modulates anticipatory processes in real time.
Harel et al. Early electrophysiological markers of navigational affordances in scenes
CN112767508B (en) Pseudo-color display method, system, equipment and medium for respiratory gating image
Blickhan et al. 1/fp Characteristics of the Fourier power spectrum affects ERP correlates of face learning and recognition
Niu et al. A dynamically optimized time-window length for SSVEP based hybrid BCI-VR system
CN113509188A (en) Method and device for amplifying electroencephalogram signal, electronic device and storage medium
CN112329834B (en) Method and device for distributing video memory space during training of cyclic network model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240412

Address after: Room 4-3, 4th Floor, No. 25, Landianchang South Road, Haidian District, Beijing, 100000

Patentee after: Boruikang Technology (Beijing) Co.,Ltd.

Country or region after: China

Address before: No. 236, Baidi Road, Nankai District, Tianjin, 300192

Patentee before: CHINESE ACADEMY OF MEDICAL SCIENCES INSTITUTE OF BIOMEDICAL ENGINEERING

Country or region before: China