CN116382481A - Man-machine interaction glasses based on brain-computer interface technology - Google Patents
- Publication number: CN116382481A
- Application number: CN202310361579.3A
- Authority: CN (China)
- Prior art keywords: module, brain, stimulation, glasses, electroencephalogram signal
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- A61B5/378 — Electroencephalography [EEG] using evoked responses; visual stimuli
- A61B5/374 — Detecting the frequency distribution of EEG signals, e.g. delta, theta, alpha, beta or gamma waves
- G06F3/013 — Eye tracking input arrangements
- G06F3/015 — Input arrangements based on nervous system activity detection, e.g. brain waves [EEG], electromyograms [EMG], electrodermal response
- G06F2203/011 — Emotion or mood input determined on the basis of sensed human body parameters such as brain activity patterns
Abstract
The invention discloses man-machine interaction glasses based on brain-computer interface technology. The man-machine interaction glasses comprise an interactive glasses main body, an electroencephalogram signal analysis module and a controlled module; the interactive glasses main body comprises a stimulation module and an electroencephalogram signal acquisition module. The stimulation module provides visual stimulation flickering at different frequencies. The electroencephalogram signal analysis module analyses the characteristic information in the multichannel electroencephalogram signals collected by the acquisition module, locates the two-dimensional coordinates of the fixation point relative to the stimulation coordinate system, converts them into a control instruction, transmits the instruction to the corresponding controlled end for execution, and feeds the result back to the user in visual form. By moving the stimulation module onto wearable equipment, the invention improves the simplicity and portability of the SSVEP-BCI system; by adopting a spatial coding mode, it relieves the user's visual discomfort and better fits the man-machine interaction habits of daily life.
Description
Technical Field
The invention relates to man-machine interaction glasses based on brain-computer interface technology, and belongs to the technical field of brain-computer interfaces.
Background
A brain-computer interface is a novel man-machine interaction mode: by detecting electroencephalogram signals over the occipital visual areas, it decodes brain activity and intention into control instructions for interacting with devices in the surrounding environment. It transfers a person's intention to external equipment through a direct pathway between the brain and the equipment, without the peripheral nervous system having to control the relevant muscle groups. Common electroencephalogram control signals include the P300, motor imagery (MI) and the steady-state visual evoked potential (SSVEP). Compared with other types of electroencephalogram signals, the SSVEP has a high signal-to-noise ratio and a high information transfer rate, approaches natural human reaction time, requires no training and has a low entry threshold, so SSVEP-based brain-computer interfaces are developing rapidly. As SSVEP research moves from theory to application in real daily environments, the demand for simple, portable equipment, a good user experience and rich application scenarios becomes increasingly urgent. Compared with mobile intelligent terminals such as notebook computers and smart tablets, interactive glasses offer a stronger sense of space and a multi-level user experience, with richer, more portable content and a flexible, three-dimensional form of expression when collecting and feeding back information.
In addition, present SSVEP-based brain-computer interface research mostly focuses on frequency and phase coding in the medium-low frequency band, where the evoked response is most obvious: the user must directly gaze at visual stimuli flickering at different frequencies and phases, and target classification is performed by collecting the quasi-sinusoidal response generated by the visual cortex and extracting frequency and phase features. Long-term direct viewing of flickering stimulation easily causes visual fatigue and discomfort, and even carries a certain risk of inducing epileptic seizures. Unlike conventional coding schemes, spatial coding encodes targets by their different relative spatial positions with respect to the stimuli: the user gazes at the target rather than the stimulus itself, and positional information is extracted by analysing the resulting differences in the electroencephalogram signals. Spatial coding needs no one-to-one mapping between a visual stimulus and a gazed target as a medium; it identifies the target directly from the correspondence between position and neural response and outputs different instructions.
Disclosure of Invention
Technical problems: the invention aims to provide a man-machine interaction eye based on brain-computer interface technology, which is characterized in that a stimulation module is used for providing stimulation with different frequencies, characteristic information in acquired multichannel brain-electrical signals is analyzed to position two-dimensional coordinate information of a fixation point relative to a stimulation coordinate system, the two-dimensional coordinate information is converted into a control instruction and transmitted to a corresponding controlled end, and information is fed back to a user in a visual form.
The technical scheme is as follows: the invention provides man-machine interaction glasses based on brain-computer interface technology. Comprising the following steps: the device comprises an interaction glasses main body, an electroencephalogram signal analysis module and a controlled module; the connection modes among the modules can be connected by using a lead wire or are data transmission in a wireless communication mode of WiFi, bluetooth and infrared rays;
the interactive glasses main body comprises a stimulation module and an electroencephalogram signal acquisition module; the stimulation module provides stimulation flickering at different frequencies around the eye sockets, while the main body also acquires electroencephalogram signals and receives feedback from the controlled module after instructions are executed; the electroencephalogram signal acquisition module comprises multichannel dry electrodes and two reference electrodes for acquiring electroencephalogram signals, giving the acquisition equipment portability and scene universality;
the electroencephalogram signal analysis module comprises preprocessing of electroencephalogram signals, extraction of characteristic information, coordinate recognition and output of control instructions;
the controlled module comprises a PC end and intelligent mobile equipment and is used for receiving various instructions output by the electroencephalogram analysis module, executing corresponding operations and giving visual and auditory feedback to a user.
The lens part of the interactive glasses main body comprises a display device, either a non-see-through display or a transparent display screen; when the interactive glasses main body has no lens display screen, the stimulation is provided by an LED array embedded around the glasses frame; otherwise a graphic stimulus source or a pattern-reversal stimulus source is provided by the display device on the lens.
The stimulus shape of the interactive glasses main body is rectangular or elliptical; the coding mode is based on frequency, phase, or a combination of the two; the waveform is a sine wave, square wave or triangular wave; the flicker frequency of the stimulation may be chosen from the low frequency band below 15 Hz, the middle frequency band between 15 and 30 Hz, or the high frequency band above 30 Hz.
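The waveform options above can be sketched as drive signals for the LED or on-screen stimuli. The function name `stimulus_waveform` and its default sample rate are illustrative choices, not part of the patent:

```python
import numpy as np
from scipy import signal

def stimulus_waveform(freq_hz, duration_s, fs=1000, shape="sine"):
    """Brightness drive signal in [0, 1] for one flickering stimulus.

    shape: "sine", "square" or "triangle", matching the waveform
    options named in the patent text.
    """
    t = np.arange(int(duration_s * fs)) / fs
    if shape == "sine":
        w = np.sin(2 * np.pi * freq_hz * t)
    elif shape == "square":
        w = signal.square(2 * np.pi * freq_hz * t)
    elif shape == "triangle":
        w = signal.sawtooth(2 * np.pi * freq_hz * t, width=0.5)
    else:
        raise ValueError(f"unknown shape: {shape!r}")
    return 0.5 * (w + 1.0)  # map [-1, 1] onto a [0, 1] brightness level
```

A 16 Hz sine drive, for example, would be `stimulus_waveform(16, 1.0)`; frequencies from any of the three bands named above can be substituted.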
The channels of the electroencephalogram signal acquisition module can be selected in various combinations: thirty-two channels covering the whole cerebral cortex, eight channels over the occipital region of the brain, or an even further reduced number of channels.
The electroencephalogram signal preprocessing in the electroencephalogram signal analysis module comprises baseline drift removal, notch filtering, band-pass filtering, blind source separation by ICA (Independent Component Analysis) and data downsampling; the center frequency of the band-pass filter corresponds to the flicker frequencies of the stimulation module and their harmonics.
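The preprocessing chain can be sketched with scipy as follows. This is a minimal sketch under assumed parameters (1 kHz raw sampling, 50 Hz mains, 15–18 Hz stimuli); the patent's ICA step is noted but omitted here, since it needs calibration data:

```python
import numpy as np
from scipy import signal

def preprocess(eeg, fs=1000, stim_freqs=(15, 16, 17, 18), fs_out=250):
    """Preprocess a (n_channels, n_samples) EEG block.

    Steps mirror the patent's list: baseline removal, notch filtering,
    band-pass filtering covering the stimulation frequencies and their
    second harmonics, then downsampling. Blind source separation (ICA,
    e.g. via sklearn FastICA) would follow in a full pipeline.
    """
    # 1. per-channel mean removal (a crude surrogate for baseline-drift removal)
    x = eeg - eeg.mean(axis=1, keepdims=True)
    # 2. notch filter at 50 Hz to suppress power-frequency interference
    b, a = signal.iirnotch(50.0, Q=30.0, fs=fs)
    x = signal.filtfilt(b, a, x, axis=1)
    # 3. band-pass covering the stimulation fundamentals and 2nd harmonics
    lo, hi = min(stim_freqs) - 2.0, 2 * max(stim_freqs) + 2.0
    sos = signal.butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    x = signal.sosfiltfilt(sos, x, axis=1)
    # 4. downsample to reduce data volume before feature extraction
    return signal.resample_poly(x, fs_out, fs, axis=1)
```

The pass band is derived from the stimulation frequencies exactly as the text requires: it spans the flicker frequencies and their harmonic components.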
In the electroencephalogram signal analysis module, the component amplitudes, phases and correlation coefficients of the different frequency components, together with the ratios between each pair of them, are extracted as characteristic information, and the two-dimensional coordinates of the fixation point relative to the stimulation coordinate system are calculated by interpolation or fitting.
The electroencephalogram signal acquisition module, the electroencephalogram signal analysis module and the controlled module are all integrally arranged on the main body of the interactive glasses, or the electroencephalogram signal analysis module and the controlled module are used as external equipment of the interactive glasses system; the signal transmission between the modules is wired or wireless.
In the mechanical structure, a wireless electroencephalogram signal processor is arranged on one side of the interactive glasses main body, multichannel non-invasive electrodes are arranged on its rear side, a lens is arranged on its front side, and an LED array embedded in the glasses frame is arranged around the lens; the electroencephalogram signal analysis module and the controlled module are located in the wireless electroencephalogram signal processor.
The electroencephalogram signal processing process of the interaction glasses main body, the electroencephalogram signal analysis module and the controlled module is as follows:
the stimulation of the main body frame of the interactive glasses flickers at different frequencies, when eyes of a user watch different positions, an electroencephalogram signal analysis module performs signal preprocessing and analyzes characteristic information of different frequency components in Steady visual evoked potential (SSVEP) signals, an interpolation method or a fitting method is adopted to find a mapping relation between the characteristic information and a two-dimensional coordinate position of a fixation point, a relative stimulation source coordinate system of the fixation point is positioned by using the mapping relation, and the mapping relation is converted into a control instruction and sent to a controlled module.
The stimulus source provided by the stimulation module can be a light stimulus source embedded in the upper, lower, left and right edges of the glasses frame, or a graphic stimulus source or pattern-reversal stimulus source displayed on the screen of the glasses lens. As an improvement of the invention, the display device embedded in the glasses lens can be a non-see-through screen, which increases the user's immersion, or a see-through screen, which does not block the user's normal field of view, lets the user observe the surrounding environment directly, and better suits daily application requirements.
The beneficial effects are that: the invention can integrate flicker stimulation, signal acquisition and electroencephalogram signal analysis at one end, controls a controlled end through the control module, and the stimulation is not provided by using a large screen any more, but the stimulation module is transplanted to the glasses frame, the stimulation can be placed at the upper, lower, left and right edges of the glasses frame, or on display equipment on the glasses lenses, and the stimulation module is converted to wearable equipment, thereby improving the simplicity, convenience and portability of an SSVEP-BCI system and expanding the application mode of brain-computer interface equipment in complex dynamic scenes.
The invention adopts a spatial coding mode, using a limited number of stimulation frequencies to locate the continuous coordinates of the fixation point relative to the stimulation coordinate system. Without gazing directly at any stimulus, the user only needs to look at a point in the area observed through the glasses to control the corresponding location in real time. This reduces the restriction on the user's activities, relieves visual discomfort, and better matches the man-machine interaction habits of daily life.
Drawings
Fig. 1 is an overall block diagram of a system according to the present invention. The method comprises the following steps: an interactive glasses main body 1, an electroencephalogram signal analysis module 2 and a controlled module 3; the device comprises a stimulation module 1.1, an electroencephalogram signal acquisition module 1.2, an electroencephalogram signal preprocessing module 2.1, a characteristic information extraction module 2.2, a coordinate recognition module 2.3 and an output control instruction 2.4;
fig. 2 is a flow chart of the system of the present invention.
Fig. 3 is a schematic structural diagram of man-machine interaction glasses according to an embodiment of the present invention, where: a wireless brain signal processor A, a multichannel non-invasive electrode B, an LED array C embedded in a glasses frame and a lens D.
Fig. 4 is a schematic diagram of a graphical stimulus or a pattern reversal stimulus.
FIG. 5 is a diagram showing the electrode distribution according to the present invention, which is placed according to the 10-20 International Standard System electrode placement method.
Detailed Description
Exemplary embodiments of the present invention will be described in further detail below with reference to examples and corresponding drawings. The following examples or figures are intended to illustrate the application of the invention but are not intended to limit the scope of the invention.
The invention relates to man-machine interaction glasses based on brain-computer interface technology, comprising an interactive glasses main body 1, an electroencephalogram signal analysis module 2 and a controlled module 3; the modules are connected by lead wires or exchange data wirelessly via WiFi, Bluetooth or infrared;
the interactive glasses main body 1 comprises a stimulation module 1.1 and an electroencephalogram signal acquisition module 1.2, wherein the stimulation module 1.1 is used for providing stimulation of flickering at different frequencies around an eye socket, acquiring electroencephalogram signals and receiving feedback of a controlled module after executing instructions; the electroencephalogram signal acquisition module 1.2 comprises a multi-channel dry electrode and two reference electrodes to acquire electroencephalogram signals, so that portability and scene universality of electroencephalogram acquisition equipment are realized;
the electroencephalogram signal analysis module 2 comprises an electroencephalogram signal preprocessing module 2.1, a characteristic information extracting module 2.2, a coordinate identifying module 2.3 and a control instruction outputting module 2.4;
the controlled module 3 comprises a PC end and intelligent mobile equipment, and is used for receiving various instructions output by the electroencephalogram analysis module, executing corresponding operations and giving visual and auditory feedback to a user.
The lens part of the interactive glasses main body 1 comprises a display device, either a non-see-through display or a transparent display screen; when the interactive glasses main body 1 has no lens display screen, the stimulation module 1.1 provides a light stimulus source through an LED array embedded around the glasses frame; otherwise a graphic stimulus source or pattern-reversal stimulus source is provided by the display device on the lens. The stimulus shape is rectangular or elliptical; the coding mode is based on frequency, phase, or a combination of the two; the waveform is a sine wave, square wave or triangular wave; the flicker frequency of the stimulation may be chosen from the low frequency band below 15 Hz, the middle frequency band between 15 and 30 Hz, or the high frequency band above 30 Hz. The channels of the electroencephalogram signal acquisition module 1.2 can be selected in various combinations: thirty-two channels covering the whole cerebral cortex, eight channels over the occipital region of the brain, or an even further reduced number of channels.
The electroencephalogram signal preprocessing 2.1 in the electroencephalogram signal analysis module 2 comprises baseline drift removal, notch filtering, band-pass filtering, blind source separation by ICA and data downsampling; the center frequency of the band-pass filter corresponds to the flicker frequencies of the stimulation module 1.1 and their harmonics. The characteristic information extraction 2.2 extracts the component amplitudes, phases and correlation coefficients of the different frequency components, together with the ratios between each pair of them, as characteristic information, and calculates the two-dimensional coordinates of the fixation point relative to the stimulation coordinate system by interpolation or fitting.
The electroencephalogram signal acquisition module 1.2, the electroencephalogram signal analysis module 2 and the controlled module 3 are all integrally arranged on the main body of the interactive glasses, or the electroencephalogram signal analysis module 2 and the controlled module 3 are used as external equipment of the interactive glasses system; the signal transmission between the modules is wired or wireless.
In the mechanical structure, a wireless electroencephalogram signal processor A is arranged on one side of an interactive glasses main body 1, a multi-channel non-invasive electrode B is arranged on the rear side of the interactive glasses main body 1, a lens D is arranged on the front side of the interactive glasses main body 1, and an LED array C embedded in a glasses frame is arranged around the lens D; the electroencephalogram signal analysis module 2 and the controlled module 3 are positioned in the wireless electroencephalogram signal processor A.
The electroencephalogram signal processing of the interactive glasses main body 1, the electroencephalogram signal analysis module 2 and the controlled module 3 proceeds as follows: the stimuli on the glasses frame of the interactive glasses main body 1 flicker at different frequencies; when the user's eyes gaze at different positions, the electroencephalogram signal analysis module 2 preprocesses the signals and analyses the characteristic information of the different frequency components in the steady-state visual evoked potential (SSVEP) signals, finds the mapping relation between the characteristic information and the two-dimensional coordinate position of the fixation point by interpolation or fitting, locates the fixation point in the stimulus-source coordinate system using this mapping relation, and converts the result into a control instruction sent to the controlled module 3.
Example 1: an overall block diagram of man-machine interaction glasses based on brain-machine interface technology is shown in fig. 1, and comprises: the device comprises an electroencephalogram signal acquisition module, an electroencephalogram signal analysis module, a controlled module and an interactive glasses main body.
The interactive glasses main body comprises a stimulation module and an electroencephalogram signal acquisition module. And the stimulation module is used for providing stimulation of flickering at different frequencies around the eye socket and receiving feedback after executing instructions, and the electroencephalogram signal acquisition module comprises a multi-channel dry electrode and two reference electrodes for acquiring electroencephalogram signals, so that portability and scene universality of the electroencephalogram signal acquisition equipment are realized.
The electroencephalogram signal analysis module comprises preprocessing of the electroencephalogram signals, extraction of characteristic information, classification and identification, and output of control instructions. The preprocessing comprises baseline drift removal, notch filtering, band-pass filtering, blind source separation (ICA) and data downsampling. These steps remove the direct-current component, power-frequency interference, and artefacts from electrooculogram and electromyogram signals, improving the accuracy of gaze-direction classification and the stability and portability of the system. The center frequency of the band-pass filter corresponds to the flicker frequencies of the stimulation module and their harmonics.
The characteristic information in the electroencephalogram signal analysis module consists of the component amplitudes, phases and correlation coefficients of the different frequency components, together with the ratios between each pair of them; the two-dimensional coordinates of the fixation point relative to the stimulation coordinate system are then calculated by interpolation or fitting.
The controlled module receives the instructions output by the electroencephalogram signal analysis module, executes the corresponding operations and gives the user visual and auditory feedback.
The system flow of embodiment 1 is shown in fig. 2. The man-machine interaction glasses based on brain-computer interface technology operate as follows: the stimuli on the interactive glasses frame flicker at different frequencies; when the user's eyes gaze at different positions, the characteristic information of the different frequency components in the SSVEP signal is extracted after preprocessing, an interpolation or fitting method is used to find the mapping relation between the characteristic information and the two-dimensional coordinate position of the fixation point, and this mapping locates the fixation point in the stimulus-source coordinate system. The spatial information is thereby decoded, converted into a control instruction and sent to the controlled end.
Referring to fig. 3, to use the system the user wears the interactive glasses main body on the head, with the multichannel non-invasive electrodes B placed over the occipital region of the brain and the two reference electrodes placed on the mastoid processes behind the two ears; an impedance check is performed, keeping the impedance of every lead channel below 5 kΩ. The stimuli of the LED array C around the glasses frame flicker simultaneously at different frequencies; the user only needs to gaze at the position to be operated, without looking directly at any stimulus. The collected electroencephalogram signals are processed by the wireless electroencephalogram signal processor A and sent over a WiFi local area network to the electroencephalogram signal processing module, which decodes the spatial position the user is gazing at and converts it into the corresponding control instruction; the controlled module executes the corresponding operation, providing visual feedback.
Example 2: This embodiment is substantially the same as Embodiment 1, except that: the lens portion 4 of the interactive glasses body may comprise a display device, such as a non-see-through display or a transparent display screen. When there is no lens display screen, the stimuli of the glasses body are provided by the LED array 3 embedded around the glasses frame; otherwise they can be provided by the display device on the lens 4. The graphic stimuli or pattern-reversal stimuli displayed on the lens are shown in fig. 4. Four stimulation blocks are placed at the upper, lower, left and right boundaries of the display screen; their flicker frequencies are 15 Hz, 16 Hz, 17 Hz and 18 Hz, low-band frequencies at which the SSVEP response is strong; the middle is the user interface, i.e. the region the user gazes at. The stimuli provided by the spectacle lens 4 may be rectangular or elliptical in shape; the coding mode may be based on frequency, on phase, or on a combination of the two; the waveform may be a sine, square or triangular wave. Moreover, the flicker frequency is not limited to the low band below 30 Hz: different stimulation frequencies from the high band can be combined flexibly.
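The joint frequency-phase coding and waveform choices described above can be sketched as a per-frame luminance generator for one stimulation block. The 60 Hz refresh rate and the π/2 phase step are illustrative assumptions, not values stated in the patent.

```python
import numpy as np

def stimulus_luminance(freq_hz, phase_rad, refresh_hz=60, n_frames=60,
                       waveform="sine"):
    """Per-frame luminance (0..1) of one flickering stimulation block.

    Each block gets its own (frequency, phase) pair; the waveform may be
    a sine, square or triangular wave, as in the embodiment above.
    """
    t = np.arange(n_frames) / refresh_hz
    x = 2 * np.pi * freq_hz * t + phase_rad
    if waveform == "sine":
        s = np.sin(x)
    elif waveform == "square":
        s = np.sign(np.sin(x))
    else:  # triangle
        s = 2 / np.pi * np.arcsin(np.sin(x))
    return 0.5 * (1 + s)   # map [-1, 1] onto the [0, 1] luminance range

# Four boundary stimuli at 15/16/17/18 Hz, phases stepped by pi/2
# (the phase step is a hypothetical choice for illustration):
frames = {f: stimulus_luminance(f, i * np.pi / 2)
          for i, f in enumerate([15, 16, 17, 18])}
```

At a 60 Hz refresh rate these non-integer frame periods rely on the sampled-sinusoidal rendering shown here, which is why the luminance is computed per frame rather than toggled on and off.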
Example 3: This embodiment is substantially the same as Embodiment 1, except that: referring to fig. 4, the channels of the electroencephalogram signal acquisition module can be selected in various combinations: thirty-two channels covering the whole cerebral cortex, eight channels over the occipital region of the brain, or an even smaller number of channels.
Example 4: This embodiment is substantially the same as Embodiment 1, except that: in the connection of the module structures, the electroencephalogram signal acquisition module and the electroencephalogram signal analysis module can be integrated on the interactive glasses main body, while the electroencephalogram signal analysis module and the controlled module can also serve as external equipment of the interactive glasses system, depending on the application. Signal transmission between the modules can be wired or wireless; the wireless mode can be Bluetooth, WiFi, radio frequency or infrared.
The foregoing description of the preferred embodiments merely illustrates the basic principles, features and advantages of the invention and is not intended to limit it. Various modifications and variations will be apparent to those skilled in the art; such changes and modifications, made without departing from the spirit and scope of the invention, fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.
Claims (9)
1. Human-computer interaction glasses based on brain-computer interface technology, characterized by comprising: an interactive glasses main body (1), an electroencephalogram signal analysis module (2) and a controlled module (3); the modules may be connected by lead wires or transmit data wirelessly via WiFi, Bluetooth or infrared;
the interactive glasses main body (1) comprises a stimulation module (1.1) and an electroencephalogram signal acquisition module (1.2); the stimulation module (1.1) is used for providing stimuli flickering at different frequencies around the eye sockets, while the glasses body acquires electroencephalogram signals and receives feedback from the controlled module after instructions are executed; the electroencephalogram signal acquisition module (1.2) comprises multi-channel dry electrodes and two reference electrodes for acquiring electroencephalogram signals, giving the electroencephalogram acquisition equipment portability and applicability across scenarios;
the electroencephalogram signal analysis module (2) comprises electroencephalogram signal preprocessing (2.1), characteristic information extraction (2.2), coordinate recognition (2.3) and control instruction output (2.4);
the controlled module (3) comprises a PC terminal and smart mobile devices, and is used for receiving the instructions output by the electroencephalogram signal analysis module, executing the corresponding operations, and giving visual and auditory feedback to the user.
2. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, characterized in that the lens part of the interactive glasses body (1) comprises a display device, such as a non-see-through display or a transparent display screen; when the interactive glasses main body (1) has no lens display screen, the stimulation of the stimulation module (1.1) is provided by an LED array embedded around the glasses frame, or a graphic stimulus source or pattern-reversal stimulus source is provided by the display device on the lens.
3. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein the stimuli of the interactive glasses body (1) are rectangular or elliptical in shape; the coding mode is based on frequency, on phase, or on a combination of the two; the waveform is a sine, square or triangular wave; the flicker frequencies of the stimulation are chosen from a low band below 15 Hz, a middle band between 15 and 30 Hz, or a high band above 30 Hz.
4. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein the channels of the electroencephalogram signal acquisition module (1.2) are selected from various combinations, including thirty-two channels covering the whole cerebral cortex, eight channels over the occipital region of the brain, or an even smaller number of channels.
5. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein the electroencephalogram signal preprocessing (2.1) in the electroencephalogram signal analysis module (2) comprises baseline drift removal, notch filtering, band-pass filtering, blind source separation (ICA) and data downsampling; the pass band of the band-pass filter corresponds to the flicker frequencies of the stimulation module (1.1) and their harmonics.
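The preprocessing chain of claim 5 can be sketched with standard signal-processing primitives. The sampling rate, notch frequency, band edges and filter orders below are illustrative assumptions; the ICA step is noted but omitted for brevity.

```python
import numpy as np
from scipy import signal

def preprocess(eeg, fs=1000, notch_hz=50.0, band=(14.0, 40.0), fs_out=250):
    """Sketch of the claimed preprocessing chain.

    eeg: array of shape (n_channels, n_samples). Steps: baseline-drift
    removal, power-line notch filtering, band-pass filtering covering the
    stimulation frequencies and their harmonics, then downsampling.
    (Blind source separation via ICA would sit between the filtering and
    the downsampling; it is omitted from this sketch.)
    """
    # Baseline drift: subtract each channel's mean.
    eeg = eeg - eeg.mean(axis=1, keepdims=True)
    # Power-line notch (50 Hz assumed).
    b, a = signal.iirnotch(notch_hz, Q=30.0, fs=fs)
    eeg = signal.filtfilt(b, a, eeg, axis=1)
    # Band-pass around the stimulus fundamentals and harmonics.
    sos = signal.butter(4, band, btype="bandpass", fs=fs, output="sos")
    eeg = signal.sosfiltfilt(sos, eeg, axis=1)
    # Downsample; decimate applies its own anti-aliasing filter.
    return signal.decimate(eeg, fs // fs_out, axis=1, zero_phase=True)
```

Zero-phase filtering (`filtfilt`/`sosfiltfilt`) is used here so the filters do not distort the SSVEP phase, which claim 6 lists as a feature.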
6. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein the electroencephalogram signal analysis module (2) extracts the amplitudes, phases and correlation coefficients of the different frequency components, and the ratios between them, as characteristic information, and calculates the two-dimensional coordinates of the fixation point relative to the stimulation coordinate system using an interpolation or fitting method.
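The amplitude and phase features of claim 6 can be obtained from the DFT of an occipital channel. This is a minimal single-channel sketch; the window length, Hann taper and synthetic test signal are assumptions, and a multi-channel method (e.g. canonical correlation analysis) would be used for the correlation-coefficient features.

```python
import numpy as np

def ssvep_features(x, fs, stim_freqs):
    """Amplitude and phase of each stimulus-frequency component of one
    channel x, via the DFT. Ratios between the component amplitudes can
    then serve as additional characteristic information."""
    n = len(x)
    spec = np.fft.rfft(x * np.hanning(n))    # Hann taper reduces leakage
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    feats = {}
    for f in stim_freqs:
        k = int(np.argmin(np.abs(freqs - f)))  # nearest DFT bin
        feats[f] = (np.abs(spec[k]), np.angle(spec[k]))
    return feats

# Synthetic check: a 16 Hz sinusoid plus noise should dominate the 16 Hz bin.
np.random.seed(0)
fs = 250
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 16 * t) + 0.1 * np.random.randn(len(t))
feats = ssvep_features(x, fs, [15, 16, 17, 18])
```

With a 2 s window the DFT bins fall on a 0.5 Hz grid, so the 15-18 Hz stimulus frequencies land exactly on bins and no interpolation between bins is needed.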
7. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein the electroencephalogram signal acquisition module (1.2), the electroencephalogram signal analysis module (2) and the controlled module (3) are all integrated on the interactive glasses main body, or the electroencephalogram signal analysis module (2) and the controlled module (3) serve as external equipment of the interactive glasses system; the signal transmission between the modules is wired or wireless.
8. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein, in the mechanical structure, a wireless electroencephalogram signal processor (A) is arranged on one side of the interactive glasses main body (1), multi-channel non-invasive electrodes (B) are arranged on its rear side, a lens (D) is arranged on its front side, and an LED array (C) embedded in the glasses frame surrounds the lens (D); the electroencephalogram signal analysis module (2) and the controlled module (3) are located in the wireless electroencephalogram signal processor (A).
9. The human-computer interaction glasses based on brain-computer interface technology according to claim 1, wherein the interactive glasses main body (1), the electroencephalogram signal analysis module (2) and the controlled module (3) process the electroencephalogram signals as follows:
the stimuli on the frame of the interactive glasses main body (1) flicker at different frequencies; when the user's eyes gaze at different positions, the electroencephalogram signal analysis module (2) preprocesses the signals and analyzes the characteristic information of the different frequency components in the steady-state visual evoked potential (SSVEP) signal; an interpolation or fitting method is used to find the mapping between the characteristic information and the two-dimensional coordinates of the fixation point; this mapping locates the fixation point relative to the stimulus-source coordinate system, and the result is converted into a control instruction and sent to the controlled module (3).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310361579.3A CN116382481A (en) | 2023-04-06 | 2023-04-06 | Man-machine interaction glasses based on brain-computer interface technology |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116382481A true CN116382481A (en) | 2023-07-04 |
Family
ID=86978358
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310361579.3A Pending CN116382481A (en) | 2023-04-06 | 2023-04-06 | Man-machine interaction glasses based on brain-computer interface technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116382481A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116880700A (en) * | 2023-09-07 | 2023-10-13 | 华南理工大学 | Raspberry group intelligent trolley control method and system based on wearable brain-computer interface |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Liu et al. | Implementation of SSVEP based BCI with Emotiv EPOC | |
CN102793540B (en) | Method for optimizing audio-visual cognitive event-related potential experimental paradigm | |
CN108294748A (en) | A kind of eeg signal acquisition and sorting technique based on stable state vision inducting | |
KR101389015B1 (en) | Brain wave analysis system using amplitude-modulated steady-state visual evoked potential visual stimulus | |
CN110347242A (en) | Audio visual brain-computer interface spelling system and its method based on space and semantic congruence | |
CN116382481A (en) | Man-machine interaction glasses based on brain-computer interface technology | |
CN110262658B (en) | Brain-computer interface character input system based on enhanced attention and implementation method | |
CN116360600A (en) | Space positioning system based on steady-state visual evoked potential | |
CN106909226A (en) | A kind of polymorphic brain machine interface system | |
CN113180992A (en) | Upper limb rehabilitation exoskeleton closed-loop control system and method based on electroencephalogram interaction and myoelectricity detection | |
CN113101021B (en) | Mechanical arm control method based on MI-SSVEP hybrid brain-computer interface | |
CN101339413B (en) | Switching control method based on brain electric activity human face recognition specific wave | |
CN109567936B (en) | Brain-computer interface system based on auditory attention and multi-focus electrophysiology and implementation method | |
CN113359991B (en) | Intelligent brain-controlled mechanical arm auxiliary feeding system and method for disabled people | |
CN110688013A (en) | English keyboard spelling system and method based on SSVEP | |
Paul et al. | Attention state classification with in-ear EEG | |
CN114003129A (en) | Idea control virtual-real fusion feedback method based on non-invasive brain-computer interface | |
Tello et al. | Comparison between wire and wireless EEG acquisition systems based on SSVEP in an Independent-BCI | |
KR20140129820A (en) | Method and apparatus for object control using steady-state visually evoked potential | |
CN113082448A (en) | Virtual immersion type autism children treatment system based on electroencephalogram signal and eye movement instrument | |
Materka et al. | High-speed noninvasive brain-computer interfaces | |
CN114415833A (en) | Electroencephalogram asynchronous control software design method based on time-space frequency conversion SSVEP | |
Zao et al. | 37‐4: Invited Paper: Intelligent Virtual‐Reality Head‐Mounted Displays with Brain Monitoring and Visual Function Assessment | |
CN113407026B (en) | Brain-computer interface system and method for enhancing hairless zone brain electric response intensity | |
Zhang et al. | FPGA implementation of visual noise optimized online steady-state motion visual evoked potential BCI System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||