CN115153592A - Brain-eye synchronous mechanical arm control system with non-fixed stimulation duration - Google Patents


Info

Publication number
CN115153592A
CN115153592A (application CN202210827770.8A)
Authority
CN
China
Prior art keywords
stimulation
eye movement
eye
module
classification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210827770.8A
Other languages
Chinese (zh)
Inventor
林艳飞
郭嵘骁
谭鹰
罗熙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202210827770.8A priority Critical patent/CN115153592A/en
Publication of CN115153592A publication Critical patent/CN115153592A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A61B5/377 Electroencephalography [EEG] using evoked responses
    • A61B5/378 Visual stimuli
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/398 Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]

Abstract

The invention discloses a brain-eye synchronous mechanical arm control system with non-fixed stimulation duration, comprising a virtual scene end, a hardware end and a signal processing end. The virtual scene end provides real-time rendering of a virtual reality 3D scene to present flicker stimulation and instruction feedback; the hardware end presents the virtual scene to the user and synchronously acquires the electroencephalogram and eye movement signals; the signal processing end computes classification coefficients for the electroencephalogram and eye movement signals separately, fuses them using eye movement data confidence and prior weighting, and outputs a control instruction. The system determines an appropriate stimulation duration for each user by a dynamic window method; the user can also start and stop the stimulation through triple-blink and eye-closure operations. The invention realizes virtual mechanical arm control with brain-eye synchronous fusion and non-fixed stimulation duration, improves the classification accuracy, robustness and friendliness of the system, gives the user more autonomy over system operation, and meets individual needs.

Description

Brain-eye synchronous mechanical arm control system with non-fixed stimulation duration
Technical Field
The invention belongs to the technical field of human-computer interaction and brain-computer interfaces, and particularly relates to a brain-eye synchronous mechanical arm control system with non-fixed stimulation duration.
Background
Brain-computer interface (BCI) technology is a communication system that transmits information between the brain and the outside world, and has been widely used in rehabilitation training, assistance for disabled people, cognitive research, entertainment and other fields. Hybrid brain-computer interfaces that combine other physiological signals improve the accuracy of the brain-computer interface and widen its functions. In addition, adding virtual reality (VR) technology to a BCI can provide richer and more diverse scene feedback to the user than a traditional brain-computer interface. A hybrid brain-computer interface system in a virtual reality environment can therefore provide a more efficient, low-cost and safe implementation path for many training scenarios.
However, current virtual reality hybrid brain-computer interface systems still have several problems. First, most existing systems require visual stimulation to evoke a control signal, and the stimulation duration is usually a fixed value; this does not meet users' individual needs and leaves system flexibility low. Second, most existing systems run continuously at fixed stimulation intervals, so users cannot freely start and stop the stimulation to rest when needed, and system friendliness is poor. Therefore, how to achieve a flexible stimulation duration, a friendly stimulation start-stop mechanism and an effective brain-eye fusion method in a virtual reality hybrid brain-computer interface system is a problem worth studying.
Disclosure of Invention
In view of the above, the invention provides a brain-eye synchronization mechanical arm control system with non-fixed stimulation duration, which can synchronously acquire electroencephalogram signals and eye movement signals, flexibly determine the stimulation duration, freely control the stimulation start and stop, more effectively realize brain-eye fusion, and improve the use experience of users.
A brain-eye synchronous mechanical arm control system with non-fixed stimulation duration comprises a hardware end, a virtual scene end and a signal processing end;
the hardware end comprises a virtual reality module, an electroencephalogram signal acquisition module and an eye movement signal acquisition module;
the virtual scene end renders a virtual reality 3D scene based on a virtual reality module, and comprises a virtual mechanical arm and a visual stimulation module; the visual stimulation module is provided with a plurality of flicker stimulation blocks which represent different actions to be executed by the virtual mechanical arm;
the signal processing end comprises an electroencephalogram signal processing module, an eye movement signal processing module and a fusion classification module;
the electroencephalogram signal processing module processes the electroencephalogram signals collected by the electroencephalogram signal acquisition module to obtain the electroencephalogram classification coefficient of each flicker stimulation block;
the eye movement signal processing module processes the eye movement signals acquired by the eye movement signal acquisition module to obtain the eye movement classification coefficient of each flicker stimulation block; it judges from the eye movement signal whether the user's eyes have been closed for longer than a set time, and if so, sends a control instruction to the virtual scene end to stop the flicker stimulation blocks from flickering; after the stimulation has stopped, if the user blinks 3 times, it sends a control instruction to the virtual scene end to make the flicker stimulation blocks resume flickering for the set stimulation duration; the set stimulation duration is determined as follows: in the calibration stage, the flicker duration of the flicker stimulation blocks is increased gradually by a set step, the brain-eye fusion classification accuracy is obtained for each flicker duration, and the set stimulation duration of the current user is chosen according to whether the classification accuracy is satisfactory;
the fusion classification module fuses the electroencephalogram classification coefficient and the eye movement classification coefficient by adopting a priori weighting method to obtain a classification result, namely, the flicker stimulation block selected by the user is determined, and the virtual mechanical arm is controlled to execute the action represented by the flicker stimulation block.
Preferably, the virtual scene end is realized by a Unity engine; the electroencephalogram signal acquisition module, the eye movement signal acquisition module and the virtual reality module at the hardware end are all connected with the computer through USB ports, and the electroencephalogram signal processing module, the eye movement signal processing module and the fusion classification module are realized on the computer.
Preferably, the electroencephalogram signal acquisition module and the eye movement signal acquisition module finish synchronous acquisition of the brain and eye data by marking the stimulation occurrence time through a Trigger.
Preferably, the fusion classification module obtains the classification result ρ_fuse by the following formula:
ρ_fuse = C_eye · ACC_eye · Norm(ρ_eye) + ACC_eeg · Norm(ρ_eeg)
wherein C_eye is the confidence coefficient of the eye movement signal: if more than half of the eye movement samples of the two eyes are null values, the eye movement data confidence is set to 0, otherwise it is set to 1; ACC_eye is the prior classification accuracy of the eye movement, Norm(·) is the normalization operation, ρ_eye is the eye movement classification coefficient, ACC_eeg is the prior classification accuracy of the EEG, and ρ_eeg is the EEG classification coefficient.
Preferably, before data acquisition, in the process of determining the set stimulation duration, the eye movement and electroencephalogram signals are calibrated separately: the user gazes at every flicker stimulation block, and the eye movement position coordinates of the center of each flicker stimulation block and the brain-eye prior classification accuracies are determined.
Preferably, in the determination of the set stimulation duration, the flicker duration is incremented in steps of 0.1 s.
Further, the virtual scene end further comprises a visual and auditory feedback function for instructing the execution effect: the visual feedback comprises the action of the mechanical arm for executing the corresponding instruction and the color change of the corresponding visual stimulation block, and the auditory feedback is the sound prompt of the corresponding instruction action.
Preferably, the virtual scene end and the signal processing end use a TCP/IP communication protocol to realize interaction.
Preferably, the eye movement signal acquisition module acquires an eye movement signal of a user through an embedded infrared eye movement instrument.
Preferably, the electroencephalogram signal acquisition module acquires electroencephalogram signals of a user through a wireless electroencephalogram amplifier.
The invention has the following beneficial effects:
(1) The invention provides a brain-eye synchronous mechanical arm control system with non-fixed stimulation duration. Eye movement signals offer natural interaction, directly reflect the user's intention, and complement electroencephalogram signals well; fusing the two effectively solves the problem of poor single-modality brain-computer interface performance. The overall system has high execution speed, high classification accuracy and good human-computer interaction performance.
(2) The invention adopts a dynamic window method to realize visual stimulation with non-fixed duration, selecting an appropriate stimulation duration for each user; this meets users' individual needs and improves the information transfer rate and robustness of the system.
(3) The invention adds triple-blink and eye-closure operations for flexible control of stimulus start and stop, giving the user more autonomy over the system and improving system friendliness.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention.
Fig. 2 is a schematic view of a virtual scene of the robot arm according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention, and the described embodiments are only some embodiments of the present invention. All other embodiments obtained by a person skilled in the art without making any inventive step are within the scope of protection of the present invention.
As shown in fig. 1, the invention provides a brain-eye synchronous mechanical arm control system with non-fixed stimulation duration, which comprises a virtual scene end, a hardware end and a signal processing end. The virtual scene end comprises a virtual mechanical arm module and a visual stimulation module, the hardware end comprises a virtual reality module, an electroencephalogram signal acquisition module and an eye movement signal acquisition module, and the signal processing end comprises an electroencephalogram signal processing module, an eye movement signal processing module and a fusion classification module.
The virtual scene end mainly provides real-time rendering of a virtual reality 3D scene; the hardware end presents the virtual scene to the user and acquires signals; the signal processing end mainly completes signal processing and instruction output.
the virtual scene end comprises a virtual mechanical arm module and a visual stimulation module. The virtual mechanical arm module provides a virtual mechanical arm and other 3D scenes, can execute a command which a user wants to execute and provides visual and auditory feedback for the user; the stimulation module is a transparent plane containing a flicker stimulation block, and the stimulation block flickers after a user watches the corresponding stimulation block for a certain time;
the hardware end comprises a virtual reality module, an electroencephalogram signal acquisition module and an eye movement signal acquisition module. The virtual reality module is a head-wearing virtual reality device, and the electroencephalogram signal acquisition module acquires electroencephalograms of a user through a wireless electroencephalogram amplifier; the eye movement signal acquisition module acquires eye movement signals of a user through the embedded infrared eye movement instrument;
the signal processing end comprises an electroencephalogram signal processing module, an eye movement signal processing module and a fusion classification module. The electroencephalogram signal processing module carries out preprocessing operations such as filtering and baseline removal on the acquired electroencephalogram signals, and then carries out real-time processing on the electroencephalogram signals by using a filter bank typical correlation analysis algorithm to obtain electroencephalogram classification coefficients of all targets. The eye movement signal processing module is used for processing the abnormal value of the acquired eye movement signal and then processing the eye movement signal in real time by using a K mean algorithm to obtain eye movement classification coefficients of all targets; and the fusion classification module fuses the electroencephalogram classification coefficient and the eye movement classification coefficient by adopting a priori weighting method and outputs a final control instruction.
Further, the control system is deployed on the same computer. The virtual scene end is realized by a Unity engine. The electroencephalogram signal acquisition module, the eye movement signal acquisition module and the virtual reality module at the hardware end are all connected with a computer through USB ports, and electroencephalogram signal acquisition software and eye movement signal acquisition software are installed on the computer.
Furthermore, the electroencephalogram signal acquisition module and the eye movement signal acquisition module finish synchronous acquisition of brain and eye data by marking stimulation occurrence time through Trigger. The electroencephalogram data are transmitted to a signal processing end by a wireless electroencephalogram amplifier, and the eye movement data are transmitted to the signal processing end by an embedded infrared eye movement instrument.
Further, the fusion classification module determines the classification target by fusing eye movement data confidence with prior weighting. First, the electroencephalogram classification coefficient is obtained from the electroencephalogram signal processing module and the eye movement classification coefficient from the eye movement signal processing module. Second, the validity of the eye movement classification coefficient is judged by the eye movement data confidence: if more than half of the collected binocular eye movement samples are null values, the confidence is set to 0, otherwise to 1. Third, the brain-eye classification accuracies obtained in the calibration stage are used as fusion coefficients for prior weighting to obtain the final classification target, realizing a brain-eye synchronous control system.
Further, the stimulation duration of the system's flicker blocks is not fixed. Before the system is used, the eye movement signals and the stimulation window length are calibrated separately; eye movement calibration means the user gazes at every stimulation block to determine the eye movement position coordinates of each block's center. The flicker stimulation duration is then determined by a dynamic window method according to the user's characteristics: the length of the valid brain-eye data is gradually increased, the brain-eye fusion classification accuracy is calculated at each candidate stimulation duration, and the flicker stimulation duration suited to each user is chosen according to whether the classification accuracy is satisfactory.
Further, the system allows the user to control stimulation start and stop autonomously. If eye closure is detected during the flicker stimulation period, the stimulus blocks pause flickering. After the pause, the stimulus blocks resume flickering if three blinks of the user are detected.
Further, the system provides visual and auditory feedback on the instruction execution effect. The visual feedback comprises the mechanical arm executing the corresponding instruction action and a color change of the corresponding visual stimulation block; the auditory feedback is a sound prompt for the corresponding instruction action.
Further, the virtual scene end and the signal processing end interact through a self-defined protocol over TCP/IP. The instruction content includes: communication state, control commands, stimulation end flag, action count, and the like.
Example (b):
fig. 2 is a schematic view of a virtual scene of a mechanical arm, where the mechanical arm is a virtual control object, a plane where 8 stimulation blocks are located is a visual stimulation module, and the mechanical arm can execute 8 instructions of "up-rotation", "down-rotation", "left-rotation", "right-rotation", "forward", "backward", "grab" and "release".
Before formal use, a system calibration operation is required to estimate the eye movement position coordinates of the centers of the 8 stimulation blocks and the brain-eye classification accuracies, after which the flicker stimulation duration is determined by the dynamic window method. During formal operation, the user gazes at the stimulation blocks; at fixed intervals the system emits a buzzer sound, after which all stimulation blocks start to flicker. When the flicker stimulation ends, the signal processing end fuses the acquired electroencephalogram and eye movement signals and outputs a command, and the mechanical arm then executes the corresponding action; this constitutes one action execution cycle. The system repeats this cycle automatically until the user completes control of the mechanical arm.
The non-fixed stimulation duration is determined by a dynamic window. For the data obtained in the system calibration stage, the data duration is dynamically increased in steps of 0.1 s and the brain-eye classification accuracy is calculated at each duration; once the brain-eye fusion classification accuracy reaches a high level and remains stable, the current data duration is selected as the stimulation duration for that user.
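A minimal sketch of this dynamic-window selection. The target accuracy, the "remains stable" criterion (three consecutive steps) and the 5 s cap are illustrative assumptions; the patent states only that the accuracy should reach a high level and stay stable.

```python
import numpy as np

def select_stimulus_duration(accuracy_fn, max_duration=5.0, step=0.1,
                             target_acc=0.9, stable_runs=3):
    """Grow the data window in 0.1 s steps; stop once the fusion accuracy
    reaches the target and stays there for `stable_runs` consecutive steps."""
    durations = np.arange(step, max_duration + 1e-9, step)
    streak = 0
    for d in durations:
        if accuracy_fn(d) >= target_acc:
            streak += 1
            if streak >= stable_runs:  # accuracy "kept stable"
                return round(float(d), 1)
        else:
            streak = 0
    return round(float(durations[-1]), 1)  # fall back to the longest window

# Toy accuracy curve: rises with window length, saturates near 0.95
acc = lambda d: min(0.95, 0.5 + 0.12 * d)
print(select_stimulus_duration(acc))  # -> 3.6
```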
The system realizes synchronous acquisition of the electroencephalogram and eye movement signals. The electroencephalogram signal acquisition module and the eye movement signal acquisition module mark the stimulus onset time with a Trigger so that the user's electroencephalogram and eye movement signals are acquired synchronously; the data marked by the Trigger within a given stimulation period are taken as valid data.
The electroencephalogram signal processing module processes the read valid electroencephalogram data in real time: first, filtering and baseline-removal preprocessing are performed, then the filter bank canonical correlation analysis algorithm is used to obtain the electroencephalogram classification coefficients.
The eye movement signal processing module processes the read valid eye movement data in real time: first, outliers in the eye movement fixation coordinates are removed, then the K-means algorithm is used to calculate the sum of Euclidean distances between all fixation coordinate points and the 8 template centers, yielding the eye movement classification coefficients.
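The distance-to-template step can be sketched as follows. Representing the coefficients as negated total distances (so that larger means more likely) is an assumption, as is the toy 4x2 grid of block centers; the patent specifies only the Euclidean-distance sums against the 8 calibrated centers.

```python
import numpy as np

def eye_movement_coefficients(gaze_points, template_centers):
    """For each calibrated stimulus-block center, sum the Euclidean distances
    from every valid fixation point; a smaller total distance means the user
    was more likely gazing at that block."""
    gaze = np.asarray(gaze_points, dtype=float)          # (n_points, 2)
    centers = np.asarray(template_centers, dtype=float)  # (8, 2)
    # distance of every gaze point to every center -> (n_points, 8)
    d = np.linalg.norm(gaze[:, None, :] - centers[None, :, :], axis=2)
    totals = d.sum(axis=0)
    return -totals  # higher coefficient = closer block

centers = [(x, y) for y in (0, 1) for x in (0, 1, 2, 3)]  # 8 block centers
gaze = [(2.05, 1.1), (1.95, 0.9), (2.0, 1.0)]            # fixations near (2, 1)
coeffs = eye_movement_coefficients(gaze, centers)
print(int(np.argmax(coeffs)))  # -> 6, the index of center (2, 1)
```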
The fusion classification module adopts prior-weighted fusion to obtain the fused classification value ρ_fuse as follows:
ρ_fuse = C_eye · ACC_eye · Norm(ρ_eye) + ACC_eeg · Norm(ρ_eeg)
wherein C_eye is the eye movement data confidence, ACC_eye is the prior classification accuracy of the eye movement, Norm(·) is the normalization operation, ρ_eye is the eye movement classification coefficient, ACC_eeg is the prior classification accuracy of the EEG, and ρ_eeg is the EEG classification coefficient.
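A sketch of the prior-weighted fusion, assuming the formula ρ_fuse = C_eye · ACC_eye · Norm(ρ_eye) + ACC_eeg · Norm(ρ_eeg) with a min-max normalization; the patent does not specify what Norm(·) is, so that choice is an assumption.

```python
import numpy as np

def norm(x):
    """Min-max normalize classification coefficients to [0, 1] (assumed Norm)."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng else np.zeros_like(x)

def fuse(rho_eye, rho_eeg, acc_eye, acc_eeg, c_eye):
    """Prior-weighted brain-eye fusion. c_eye is 0 when more than half of the
    binocular eye movement samples were null values, otherwise 1."""
    return c_eye * acc_eye * norm(rho_eye) + acc_eeg * norm(rho_eeg)

rho_eye = [0.1, 0.9, 0.3]   # toy coefficients for 3 targets
rho_eeg = [0.2, 0.6, 0.8]
fused = fuse(rho_eye, rho_eeg, acc_eye=0.95, acc_eeg=0.85, c_eye=1)
print(int(np.argmax(fused)))  # -> 1: both modalities favor target 1 on balance
```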
The system provides visual and auditory feedback for the classification instructions: the visual feedback comprises the mechanical arm executing the corresponding action and a color change of the corresponding stimulation block, and the auditory feedback is a Chinese voice prompt announcing the corresponding action instruction.
The information interaction between the signal processing end and the virtual scene end relies on the TCP/IP protocol. A data packet is a hexadecimal code containing a packet header, a function code, a data length, a data address, a data value and a packet tail; information interaction is completed by encoding/decoding and real-time transmission of these packets. The instruction content includes: communication state, control commands, stimulation end flag, action count, and the like.
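A hypothetical encoding of such a packet using `struct`. The patent names the fields but not their widths or values, so the 1-byte header/function/length, 2-byte address, and the 0xAA/0x55 header and tail bytes are invented for illustration.

```python
import struct

HEADER, TAIL = 0xAA, 0x55  # assumed magic bytes, not from the patent

def encode(func_code, address, value: bytes) -> bytes:
    """Pack header, function code, data length, data address, value, tail."""
    return (struct.pack(">BBBH", HEADER, func_code, len(value), address)
            + value + bytes([TAIL]))

def decode(packet: bytes):
    """Unpack a packet and validate its framing bytes."""
    header, func_code, length, address = struct.unpack(">BBBH", packet[:5])
    assert header == HEADER and packet[-1] == TAIL, "malformed packet"
    return func_code, address, packet[5:5 + length]

pkt = encode(0x01, 0x0010, b"\x03")  # e.g. a control command selecting action 3
print(decode(pkt))                   # -> (1, 16, b'\x03')
```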
Finally, if the user's eyes are detected to be closed during the flicker stimulation stage, the flicker stimulation of the whole system is paused so that the user can rest or think; when ready, the user can blink three times to restart the flicker stimulation. Eye-closure detection is realized by calculating the proportion of valid pupil diameter values of the two eyes within a fixed time window, and triple-blink detection is realized by identifying three positive and three negative peaks in the first-order difference of the binocular pupil diameter within a fixed time window.
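The two detectors can be sketched as follows. The 20% valid-sample threshold and the peak-prominence value are assumptions; the patent specifies only the mechanisms (valid-pupil ratio in a window, and three positive plus three negative peaks of the first-order difference).

```python
import numpy as np
from scipy.signal import find_peaks

def eyes_closed(pupil_diameters, valid_ratio_threshold=0.2):
    """Eye closure: the fraction of valid (non-NaN) pupil-diameter samples in
    the window falls below a threshold (threshold value assumed)."""
    d = np.asarray(pupil_diameters, dtype=float)
    return np.mean(~np.isnan(d)) < valid_ratio_threshold

def triple_blink(pupil_diameters, prominence=0.5):
    """Triple blink: the first-order difference of pupil diameter shows at
    least three negative-going and three positive-going peaks in the window."""
    diff = np.diff(np.nan_to_num(np.asarray(pupil_diameters, dtype=float)))
    pos, _ = find_peaks(diff, prominence=prominence)
    neg, _ = find_peaks(-diff, prominence=prominence)
    return len(pos) >= 3 and len(neg) >= 3

window = [4, 4, 0, 4, 4, 0, 4, 4, 0, 4, 4]  # pupil diameter; three dips = blinks
print(triple_blink(window), eyes_closed(window))  # -> True False
```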
It should be understood that the above-mentioned embodiments are only some of the embodiments of the present invention, and are only for the purpose of illustrating the technical concept and features of the present invention, which are intended to enable those skilled in the art to understand the contents of the present invention and implement the present invention accordingly, and thus the protection scope of the present invention is not limited thereby. All equivalent changes and modifications made according to the spirit of the present invention should be covered within the protection scope of the present invention.

Claims (10)

1. A brain-eye synchronous mechanical arm control system with non-fixed stimulation duration is characterized by comprising a hardware end, a virtual scene end and a signal processing end;
the hardware end comprises a virtual reality module, an electroencephalogram signal acquisition module and an eye movement signal acquisition module;
the virtual scene end renders a virtual reality 3D scene based on a virtual reality module, and comprises a virtual mechanical arm and a visual stimulation module; the visual stimulation module is provided with a plurality of flicker stimulation blocks which represent different actions to be executed by the virtual mechanical arm;
the signal processing end comprises an electroencephalogram signal processing module, an eye movement signal processing module and a fusion classification module;
the electroencephalogram signal processing module processes the electroencephalogram signals collected by the electroencephalogram signal acquisition module to obtain the electroencephalogram classification coefficient of each flicker stimulation block;
the eye movement signal processing module processes the eye movement signals acquired by the eye movement signal acquisition module to obtain the eye movement classification coefficient of each flicker stimulation block; it judges from the eye movement signal whether the user's eyes have been closed for longer than a set time, and if so, sends a control instruction to the virtual scene end to stop the flicker stimulation blocks from flickering; after the stimulation has stopped, if the user blinks 3 times, it sends a control instruction to the virtual scene end to make the flicker stimulation blocks resume flickering for the set stimulation duration; the set stimulation duration is determined as follows: in the calibration stage, the flicker duration of the flicker stimulation blocks is increased gradually by a set step, the brain-eye fusion classification accuracy is obtained for each flicker duration, and the set stimulation duration of the current user is chosen according to whether the classification accuracy is satisfactory;
the fusion classification module fuses the electroencephalogram classification coefficient and the eye movement classification coefficient by adopting a priori weighting method to obtain a classification result, namely, the flicker stimulation block selected by the user is determined, and the virtual mechanical arm is controlled to execute the action represented by the flicker stimulation block.
2. The system of claim 1, wherein the virtual scene end is implemented by a Unity engine; the electroencephalogram signal acquisition module, the eye movement signal acquisition module and the virtual reality module at the hardware end are all connected with the computer through USB ports, and the electroencephalogram signal processing module, the eye movement signal processing module and the fusion classification module are realized on the computer.
3. The system as claimed in claim 1, wherein the electroencephalogram signal acquisition module and the eye movement signal acquisition module complete synchronous acquisition of the brain-eye data by marking the stimulation occurrence time through Trigger.
4. The system as claimed in claim 1, wherein the fusion classification module obtains the classification result ρ_fuse by the following formula:
ρ_fuse = C_eye · ACC_eye · Norm(ρ_eye) + ACC_eeg · Norm(ρ_eeg)
wherein C_eye is the confidence coefficient of the eye movement data: if more than half of the eye movement signals of the two eyes are null values, the eye movement data confidence is set to 0, otherwise it is set to 1; ACC_eye is the prior classification accuracy of the eye movement, Norm(·) is the normalization operation, ρ_eye is the eye movement classification coefficient, ACC_eeg is the prior classification accuracy of the EEG, and ρ_eeg is the EEG classification coefficient.
5. The system as claimed in claim 1, wherein before data acquisition, in the process of determining the set stimulation duration, the eye movement and electroencephalogram signals are calibrated separately: the user gazes at every flicker stimulation block to determine the eye movement position coordinates of the center of each flicker stimulation block and the brain-eye prior classification accuracies.
6. The system as claimed in claim 1, wherein in the determination of the set stimulation duration, the flicker duration is incremented in steps of 0.1 s.
7. The system for controlling the brain-eye synchronous mechanical arm with non-fixed stimulation duration according to claim 1, wherein the virtual scene end further comprises a visual-auditory feedback function for instructing the execution effect: the visual feedback comprises the action of the mechanical arm for executing the corresponding instruction and the color change of the corresponding visual stimulation block, and the auditory feedback is the sound prompt of the corresponding instruction action.
8. The system as claimed in claim 1, wherein the virtual scene end and the signal processing end interact with each other using a TCP/IP communication protocol.
9. The system as claimed in claim 1, wherein the eye movement signal collecting module collects the eye movement signal of the user through an embedded infrared eye movement instrument.
10. The system of claim 1, wherein the electroencephalogram signal acquisition module acquires the electroencephalogram signal of the user through a wireless electroencephalogram amplifier.
CN202210827770.8A 2022-07-14 2022-07-14 Brain-eye synchronous mechanical arm control system with non-fixed stimulation duration Pending CN115153592A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210827770.8A CN115153592A (en) 2022-07-14 2022-07-14 Brain-eye synchronous mechanical arm control system with non-fixed stimulation duration

Publications (1)

Publication Number Publication Date
CN115153592A true CN115153592A (en) 2022-10-11

Family

ID=83494356

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210827770.8A Pending CN115153592A (en) 2022-07-14 2022-07-14 Brain-eye synchronous mechanical arm control system with non-fixed stimulation duration

Country Status (1)

Country Link
CN (1) CN115153592A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115590535A (en) * 2022-11-17 2023-01-13 季华实验室(Cn) Time window adjusting method, device and equipment for electroencephalogram signal identification and storage medium


Similar Documents

Publication Publication Date Title
CN110824979B (en) Unmanned equipment control system and method
CN108829245B (en) A kind of virtual sand table intersection control routine based on multi-modal brain-machine interaction technology
CN112990074B (en) VR-based multi-scene autonomous control mixed brain-computer interface online system
CN112905015B (en) Meditation training method based on brain-computer interface
CN107957783B (en) Multi-mode intelligent control system and method based on electroencephalogram and electromyogram information
CN109032384B (en) Music playing control method and device, storage medium and wearable device
CN112114670B (en) Man-machine co-driving system based on hybrid brain-computer interface and control method thereof
CN108646915B (en) Method and system for controlling mechanical arm to grab object by combining three-dimensional sight tracking and brain-computer interface
CN110850987A (en) Specific identification control method and device based on two-dimensional intention expressed by human body
CN115153592A (en) Brain-eye synchronous mechanical arm control system with non-fixed stimulation duration
CN106648068A (en) Method for recognizing three-dimensional dynamic gesture by two hands
CN106362260A (en) VR emotion regulation device
CN110658742A (en) Multi-mode cooperative control wheelchair control system and method
CN106527711A (en) Virtual reality equipment control method and virtual reality equipment
CN113069125A (en) Head-mounted equipment control system, method and medium based on brain wave and eye movement tracking
KR20220116329A (en) Intraoral device control system
CN109446957B (en) EMG signal identification method
CN107452381B (en) Multimedia voice recognition device and method
Asakawa et al. An electric music baton system using a haptic interface for visually disabled persons
CN113082448A (en) Virtual immersion type autism children treatment system based on electroencephalogram signal and eye movement instrument
CN109901711B (en) Asynchronous real-time brain control method driven by weak myoelectricity artifact micro-expression electroencephalogram signals
CN111273578A (en) Real-time brain-controlled robot system based on Alpha wave and SSVEP signal control and control method
CN114613486A (en) Device for psychotherapy
CN114662606A (en) Behavior recognition method and apparatus, computer readable medium and electronic device
CN113552945B (en) Man-machine interaction glove system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination