CN115381465A - Rehabilitation training system based on BCI/VR and AR technologies - Google Patents

Rehabilitation training system based on BCI/VR and AR technologies

Info

Publication number
CN115381465A
Authority
CN
China
Prior art keywords
patient
scene
motor
rehabilitation training
intention
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210899344.5A
Other languages
Chinese (zh)
Inventor
张海峰
张海燕
赵绍晴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Haitian Intelligent Engineering Co ltd
Original Assignee
Shandong Haitian Intelligent Engineering Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Haitian Intelligent Engineering Co ltd filed Critical Shandong Haitian Intelligent Engineering Co ltd
Priority to CN202210899344.5A
Publication of CN115381465A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/372: Analysis of electroencephalograms
    • A61B5/48: Other medical applications
    • A61B5/486: Bio-feedback
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7225: Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation

Abstract

The invention discloses a rehabilitation training system based on BCI/VR and AR technologies, comprising an EEG signal acquisition system, an AR/VR scene system, an offline training system and a motor imagery detection system. The EEG signal acquisition system acquires EEG signals through non-implanted electrodes. The AR/VR scene system modifies and re-renders the scene according to the imagined movement intention decoded from the patient's electroencephalogram, then feeds the scene back to the patient through an AR or VR device, so the patient can complete rehabilitation training by following the AR or VR animation. The offline training system preprocesses the EEG signal, and the motor imagery detection system detects the patient's motor intention. The invention can accurately sense the patient's motor imagery and provide animation feedback, which is of great significance for the patient's rehabilitation.

Description

Rehabilitation training system based on BCI/VR and AR technologies
Technical Field
The invention relates to the technical field of limb rehabilitation, in particular to a rehabilitation training system based on BCI/VR and AR technologies.
Background
Existing virtual reality motor rehabilitation systems based on motor imagery brain-computer interfaces still have shortcomings: the related equipment is complex and not portable, making it difficult to set up the system and carry out rehabilitation training in a home setting; the cost is high; the degree of the patient's motor imagery is not easy to sense accurately; and the feedback mechanism during rehabilitation training is mostly video feedback, so the patient's immersion is low, it is difficult to feed back the motor imagery state efficiently, and targeted subjective adjustment is difficult to make.
Disclosure of Invention
To address the above defects in the prior art, the invention provides a rehabilitation training system based on BCI/VR and AR technologies.
The invention is realized by the following technical scheme:
a rehabilitation training system based on BCI/VR and AR technologies comprises an EEG signal acquisition system, an AR, VR scene system, an off-line training system and a motor imagery detection system, wherein the EEG signal acquisition system acquires EEG signals through a non-implanted electrode; the AR and VR scene system modifies and re-renders the scene according to the imaginal movement intention in the electroencephalogram of the patient, and then feeds the scene back to the patient through AR or VR equipment, and the patient can complete rehabilitation training according to AR or VR animation; the off-line training system can preprocess EEG signals, and the preprocessing work comprises baseline drift removal, power frequency interference removal, ocular artifact removal and band-pass digital filtering, and also comprises an EEG signal characteristic extraction module and a classifier module; after receiving the EEG signal, the motor imagery detection system calculates the motor intention of the patient by using the trained feature extraction and classification model, adds the motor intention into a decision pool, then counts the decision result with the maximum probability in the decision pool, if the probability corresponding to the result is greater than a decision threshold, the system understands the corresponding motor intention, otherwise, does not output the motor intention understanding result.
Preferably, the EEG signal acquisition system uses a set acquisition time, a set rest time and a set motor imagery action prompt. Each acquisition cycle lasts 8 s: during 0-2 s the screen is blank, during 2-4 s the screen prompts that acquisition is about to start, and during 4-8 s the screen displays the motor imagery action prompt.
Preferably, the AR/VR scene system includes a scene rendering module and a feedback module; the scene rendering module obtains the patient's movement intention through analysis and decoding and then generates an animation according to that intention, and the generated animation is presented to the patient through the feedback module, which is an AR or VR device.
Preferably, the EEG signal feature extraction module adopts a common spatial pattern (CSP) feature extraction algorithm, and the classifier adopts a support vector machine to classify the patient's motor imagery EEG signals.
The invention has the following beneficial effects: animation scene feedback produces a better neural activation effect and mobilizes the patient's enthusiasm for motor imagery; through BCI and AR/VR technology the degree of the patient's motor imagery can be sensed accurately and the rehabilitation system is presented in a highly immersive way; the system is light, portable and low-cost, can accurately sense the patient's motor imagery and provide animation feedback, and is very convenient for the patient's use and rehabilitation.
Detailed Description
In order that the above objects, features and advantages of the present invention will be readily understood, numerous specific details are set forth in the following description in order to provide a thorough understanding of the present invention. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The present invention is described in detail below. The rehabilitation training system based on BCI/VR and AR technologies comprises an EEG signal acquisition system, an AR/VR scene system, an offline training system and a motor imagery detection system. The EEG signal acquisition system acquires EEG signals through non-implanted electrodes. The AR/VR scene system modifies and re-renders the scene according to the imagined movement intention decoded from the patient's electroencephalogram, then feeds the scene back to the patient through an AR or VR device, so the patient can complete rehabilitation training by following the AR or VR animation. The offline training system preprocesses the EEG signal; the preprocessing includes baseline drift removal, power frequency interference removal, ocular artifact removal and band-pass digital filtering, and the system further comprises an EEG signal feature extraction module and a classifier module. After receiving the EEG signal, the motor imagery detection system computes the patient's motor intention with the trained feature extraction and classification model and adds it to a decision pool; it then counts the decision result with the highest probability in the pool, and if the probability corresponding to that result is greater than a decision threshold the system accepts the corresponding motor intention, otherwise no motor intention is output.
Preferably, the EEG signal acquisition system uses a set acquisition time, a set rest time and a set motor imagery action prompt. Each acquisition cycle lasts 8 s: during 0-2 s the screen is blank, during 2-4 s the screen prompts that acquisition is about to start, and during 4-8 s the screen displays the motor imagery action prompt.
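This 8 s acquisition cycle can be illustrated with a short sketch; the display calls and cue names below are placeholders rather than part of the patent, and a real paradigm would typically use a dedicated stimulus-presentation library.

    import time

    def run_trial(cue: str) -> None:
        print("")                              # 0-2 s: blank screen
        time.sleep(2)
        print("Acquisition about to start")    # 2-4 s: get-ready prompt
        time.sleep(2)
        print(f"Imagine: {cue}")               # 4-8 s: motor imagery action prompt
        time.sleep(4)

    for cue in ["left hand", "right hand"]:    # illustrative cue sequence
        run_trial(cue)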
Preferably, the AR/VR scene system includes a scene rendering module and a feedback module; the scene rendering module obtains the patient's movement intention through analysis and decoding and then generates an animation according to that intention, and the generated animation is presented to the patient through the feedback module, which is an AR or VR device.
Preferably, the EEG signal feature extraction module adopts a common spatial pattern (CSP) feature extraction algorithm, and the classifier adopts a support vector machine to classify the patient's motor imagery EEG signals.
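Assuming MNE-Python and scikit-learn are available, the CSP feature-extraction and SVM classifier modules could be prototyped roughly as follows; the epoch counts, channel numbers and labels are illustrative, and this is a sketch under those assumptions rather than the patented implementation.

    import numpy as np
    from mne.decoding import CSP
    from sklearn.svm import SVC
    from sklearn.pipeline import Pipeline
    from sklearn.model_selection import cross_val_score

    # Illustrative epoched EEG: 40 trials x 16 channels x 1000 samples, two classes
    rng = np.random.default_rng(0)
    X = rng.standard_normal((40, 16, 1000))
    y = np.repeat([0, 1], 20)            # 0 = left-hand imagery, 1 = right-hand imagery

    clf = Pipeline([
        ("csp", CSP(n_components=4, log=True)),  # spatial filters maximizing class variance ratio
        ("svm", SVC(kernel="linear", C=1.0)),    # linear SVM on log-variance features
    ])
    print(cross_val_score(clf, X, y, cv=5).mean())  # about chance level on this random data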
EEG signal preprocessing:
(1) Removing baseline drift:
Let the data segment sequence be $x(0), x(1), \ldots, x(N-1)$. First, an appropriate window length $n$ ($n < N/2$) is set, the mean value of the data sequence inside the window $x(0) \ldots x(n)$ is calculated and recorded as $x_1(0)$. Then a minimum step length $p$ is set for moving the window; the window is continuously moved to the right by $p$, and each move yields the mean value assigned to the window's centre point. Fitting all the centre points gives the baseline curve $x_1(0), x_1(1), \ldots, x_1(l)$, where the length $l$ of the fitted-curve data is determined by the window motion, i.e. $l = (N-n)/p$, rounded to an integer. Finally, the fitted curve just obtained is upsampled and reconstructed to the original length, giving $\hat{x}_1(i)$. Subtracting the fitted-curve sequence value from the original signal sequence value yields the signal with the baseline drift removed, namely:

$$x'(i) = x(i) - \hat{x}_1(i), \quad i = 0, 1, \ldots, N-1.$$
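A minimal sketch of this moving-window baseline estimate, assuming NumPy; the window length, step size and the synthetic test signal are illustrative choices, not values from the patent.

    import numpy as np

    def remove_baseline_drift(x: np.ndarray, n: int, p: int) -> np.ndarray:
        N = len(x)
        centers, means = [], []
        for start in range(0, N - n + 1, p):          # slide the window right by p each time
            centers.append(start + n // 2)            # index of the window centre
            means.append(x[start:start + n].mean())   # mean of the windowed data
        # "Upsample" the fitted baseline back to the original length by interpolation
        baseline = np.interp(np.arange(N), centers, means)
        return x - baseline                           # drift-free signal x'(i) = x(i) - baseline(i)

    t = np.linspace(0, 4, 1000)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * t        # 10 Hz rhythm plus a slow drift
    clean = remove_baseline_drift(eeg, n=200, p=20)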
(2) Removing power frequency interference:
The original signal containing the noise component is differenced with a reference interference signal, the deviation is calculated according to a preset adaptive algorithm such as least mean squares (LMS) or recursive least squares (RLS), and the weight is adjusted according to the deviation feedback until the deviation is minimal. Let the input sequence be $d(i) = s(i) + v(i)$, where $v(i)$ is the doped power frequency interference signal, and let the artifact reference signal be $x(i) = \cos(\omega i)$. The output signal of the adaptive 50 Hz filter is

$$y(i) = w(i)\,x(i),$$

and the error signal output $e(i)$ is

$$e(i) = d(i) - y(i) = s(i) + v(i) - w(i)\,x(i).$$

The target cost function is chosen as the mean squared error

$$J(i) = E\!\left[e^{2}(i)\right].$$

Considering that a real-time system requires the amount of computation to be as small as possible, the least mean squares (LMS) algorithm is selected as the adaptive rule for adjusting the weight so as to eliminate the power frequency interference. The weight coefficient is updated by the steepest descent method, namely:

$$w(i+1) = w(i) + 2\beta\,e(i)\,x(i),$$

where $\beta$ denotes the adaptation step length, and $e(i)$ represents the electroencephalogram signal after the power frequency interference has been removed.
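A minimal LMS canceller sketch along these lines is shown below; the sampling rate, step size beta and the use of a quadrature (cosine plus sine) reference pair to track the unknown phase of the 50 Hz interference are illustrative assumptions, not specifics from the patent.

    import numpy as np

    fs, f0, beta = 250.0, 50.0, 0.01
    t = np.arange(0, 4, 1 / fs)
    s = np.sin(2 * np.pi * 10 * t)                      # "true" EEG component
    v = 0.8 * np.cos(2 * np.pi * f0 * t + 0.7)          # 50 Hz power-line interference
    d = s + v                                           # contaminated recording

    ref = np.stack([np.cos(2 * np.pi * f0 * t),         # reference x(i) = cos(w i)
                    np.sin(2 * np.pi * f0 * t)])        # quadrature reference for unknown phase
    w = np.zeros(2)
    e = np.zeros_like(d)
    for i in range(len(d)):
        y = w @ ref[:, i]                               # adaptive filter output y(i)
        e[i] = d[i] - y                                 # e(i): EEG after interference removal
        w += 2 * beta * e[i] * ref[:, i]                # LMS weight update (steepest descent)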
(3) Removing ocular artifacts:
First, the source data $s(t)$ passes through the mixing system matrix $A$ to give the observed $n$-dimensional vector

$$x(t) = A\,s(t) = \left[x_1(t), x_2(t), \ldots, x_n(t)\right]^{T},$$

where $x_i(t)$ is one of its components. Then, after passing in turn through the whitening (sphering) matrix $W$ and the orthogonal system matrix $U$, the final output $y(t) = U z(t)$ is obtained, where $z(t) = W x(t)$. The components of $y(t)$ obtained by this decomposition are signal components originating from a number of independent signal sources, from which the ocular-artifact components can be identified and removed.
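A rough sketch of this whitening-plus-rotation decomposition, using scikit-learn's FastICA as a stand-in since the patent text does not name a specific ICA algorithm; the mixing matrix, the synthetic sources and the choice of which component to discard are all illustrative.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    sources = np.stack([np.sin(2 * np.pi * 10 * t),           # EEG rhythm source
                        (rng.random(t.size) < 0.002) * 5.0])  # spiky eye-blink-like source
    A = np.array([[1.0, 0.8], [0.6, 1.0]])                    # mixing matrix A
    x = A @ sources                                           # observed channels x(t) = A s(t)

    ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
    components = ica.fit_transform(x.T)     # y(t): estimated independent components
    components[:, 1] = 0.0                  # zero the artifact component (index chosen by inspection in practice)
    cleaned = ica.inverse_transform(components).T   # remix the remaining components back to channels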
(4) Band-pass digital filtering:
The EEG signal is band-pass filtered from 8 to 30 Hz; the filter used is a 6th-order Butterworth filter, and the stop-band cut-off frequencies are set to 6 Hz and 32 Hz respectively.
VR animation design: the system detects the patient's electroencephalogram signal through BCI technology, analyzes and decodes it to obtain the movement intention, and feeds the movement intention back to the patient as an animation in the VR scene.
The invention adopts a support vector machine to classify the patient's motor imagery electroencephalogram signals. The support vector machine searches for a hyperplane that maximizes the margin between samples of different classes in the feature space, and new data are then mapped into the same space. Let the linearly separable EEG sample set be

$$\{(s_i, l_i)\}, \quad i = 1, 2, \ldots, m,$$

where $s_i$ is the $i$-th sample, $l_i \in \{-1, +1\}$ is the class corresponding to the sample, and $m$ is the total number of samples. The decision plane is the hyperplane

$$k^{T} s + b = 0,$$

where $k$ is the weight vector and $b$ is the classification threshold; the class of a sample is determined by which side of the margin the sample data falls on.
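A brief illustration of this decision rule with scikit-learn: after fitting a linear SVM, the hyperplane weight vector k and threshold b can be read back, and a new feature vector is classified by the side of the hyperplane it falls on. The feature data here are synthetic and purely illustrative.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.5, (20, 2)),   # class -1 feature vectors
                   rng.normal(+1, 0.5, (20, 2))])  # class +1 feature vectors
    y = np.array([-1] * 20 + [+1] * 20)

    svm = SVC(kernel="linear").fit(X, y)
    k, b = svm.coef_[0], svm.intercept_[0]          # hyperplane k.s + b = 0
    sample = np.array([0.8, 1.1])
    print(int(np.sign(k @ sample + b)))             # which side of the hyperplane: +1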
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention, and they should be construed as being included in the following claims and description.

Claims (4)

1. A rehabilitation training system based on BCI/VR and AR technologies, characterized in that: the system comprises an EEG signal acquisition system, an AR/VR scene system, an offline training system and a motor imagery detection system, wherein the EEG signal acquisition system acquires EEG signals through non-implanted electrodes; the AR/VR scene system modifies and re-renders the scene according to the imagined movement intention decoded from the patient's electroencephalogram, then feeds the scene back to the patient through an AR or VR device, and the patient can complete rehabilitation training by following the AR or VR animation; the offline training system preprocesses the EEG signal, the preprocessing comprising baseline drift removal, power frequency interference removal, ocular artifact removal and band-pass digital filtering, and the system further comprises an EEG signal feature extraction module and a classifier module; after receiving the EEG signal, the motor imagery detection system calculates the patient's motor intention using the trained feature extraction and classification model and adds it to a decision pool, then counts the decision result with the highest probability in the decision pool; if the probability corresponding to that result is greater than a decision threshold, the system accepts the corresponding motor intention, otherwise no motor intention result is output.
2. The rehabilitation training system based on BCI/VR and AR technologies of claim 1, wherein: the EEG signal acquisition system uses a set acquisition time, a set rest time and a set motor imagery action prompt; each acquisition cycle lasts 8 s, during 0-2 s the screen is blank, during 2-4 s the screen prompts that acquisition is about to start, and during 4-8 s the screen displays the motor imagery action prompt.
3. The rehabilitation training system based on BCI/VR and AR technologies of claim 1, wherein: the AR/VR scene system comprises a scene rendering module and a feedback module; the scene rendering module obtains the patient's movement intention through analysis and decoding and then generates an animation according to that intention, and the generated animation is presented to the patient through the feedback module, which is an AR or VR device.
4. The rehabilitation training system based on BCI/VR and AR technologies of claim 1, wherein: the EEG signal feature extraction module adopts a common spatial pattern (CSP) feature extraction algorithm, and the classifier adopts a support vector machine to classify the patient's motor imagery EEG signals.
CN202210899344.5A 2022-07-28 2022-07-28 Rehabilitation training system based on BCI/VR and AR technologies Pending CN115381465A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210899344.5A CN115381465A (en) 2022-07-28 2022-07-28 Rehabilitation training system based on BCI/VR and AR technologies

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210899344.5A CN115381465A (en) 2022-07-28 2022-07-28 Rehabilitation training system based on BCI/VR and AR technologies

Publications (1)

Publication Number Publication Date
CN115381465A (en) 2022-11-25

Family

ID=84116621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210899344.5A Pending CN115381465A (en) 2022-07-28 2022-07-28 Rehabilitation training system based on BCI/VR and AR technologies

Country Status (1)

Country Link
CN (1) CN115381465A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102985002A (en) * 2010-03-31 2013-03-20 新加坡科技研究局 Brain-computer interface system and method
WO2018117439A1 (en) * 2016-12-23 2018-06-28 계명대학교 산학협력단 Game type rehabilitation system using brain-computer interface (bci) and control method therefor
CN106621287A (en) * 2017-02-07 2017-05-10 西安交通大学 Upper limb rehabilitation training method based on brain-computer interface and virtual reality technology
CN108417249A (en) * 2018-03-06 2018-08-17 上海大学 The multi-modal healing hand function method of audiovisual tactile based on VR
US20220012489A1 (en) * 2020-07-10 2022-01-13 Korea University Research And Business Foundation Apparatus and method for motor imagery classification using eeg
CN113274032A (en) * 2021-04-29 2021-08-20 上海大学 Cerebral apoplexy rehabilitation training system and method based on SSVEP + MI brain-computer interface
CN113398422A (en) * 2021-07-19 2021-09-17 燕山大学 Rehabilitation training system and method based on motor imagery-brain-computer interface and virtual reality
CN114587391A (en) * 2022-03-10 2022-06-07 山东中科先进技术研究院有限公司 Brain-computer interface-based rehabilitation training device and training method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination