CN105183167A - Mouse movement control method based on movement imagery and mVEP signal control - Google Patents

Mouse movement control method based on movement imagery and mVEP signal control

Info

Publication number
CN105183167A
Authority
CN
China
Prior art keywords
mvep
signal
movement
cursor
mouse
Prior art date
Legal status
Pending
Application number
CN201510589210.3A
Other languages
Chinese (zh)
Inventor
马腾 (Ma Teng)
徐鹏 (Xu Peng)
杨丹 (Yang Dan)
杨浩 (Yang Hao)
李辉 (Li Hui)
刘铁军 (Liu Tiejun)
尧德中 (Yao Dezhong)
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date: 2015-09-16
Filing date: 2015-09-16
Publication date: 2015-12-23
Application filed by University of Electronic Science and Technology of China
Priority to CN201510589210.3A
Publication of CN105183167A


Abstract

The invention discloses a mouse movement control method based on motor imagery and mVEP signal control. The method comprises the following steps: S1, system initialization: the user is connected to a computer brain-computer interface device through electrodes on the scalp, the working interface of the device is opened, and a target and a mouse cursor appear on the working interface; S2, brain signal acquisition: a mixed signal having both mVEP and motor imagery characteristics is collected through the electrodes and transmitted to the computer; S3, the collected mixed signal is processed to obtain the fused MI-mVEP trajectory, which is the movement trajectory of the mouse cursor. According to the method, the mVEP potential signal and motor imagery are combined for control, so the control mode is more flexible; in addition, the mVEP stimulus is mild, so the user can more conveniently perform left- and right-hand motor imagery control while observing the mVEP stimulation, the extraction of motor imagery signal features is not affected, and the method can be widely applied to brain-computer interface systems.

Description

Mouse movement control method based on motor imagery and mVEP signal control
Technical field
The present invention relates to the technical field of biological information and to brain-computer interface systems, and in particular to a mouse movement control method based on motor imagery and mVEP signal control.
Background art
A brain-computer interface (Brain-Computer Interface, BCI) is a channel that allows the human brain to communicate with and control the outside world directly through a computer or other external electronic device (Wolpaw JR, Birbaumer N, McFarland DJ, Pfurtscheller G, Vaughan TM (2002) Brain-computer interfaces for communication and control. Clin Neurophysiol 113:767-791). BCI research involves many disciplines, such as neuroscience, signal processing, pattern recognition, control theory and others, and the cross-development of these disciplines has driven BCI research forward. Basic theoretical and clinical application research on BCI falls within the scope of brain science and neural engineering, and is regarded by many international authoritative institutions as one of the frontiers and focuses of brain and neuroscience research in the 21st century.
The main goal of a brain-computer interface is to establish a direct channel between the brain and a computer, providing people with physical disabilities or language impairments a feasible way to communicate with the outside world. A brain-computer interface realizes control by thought: the brain communicates with the outside world directly, thereby fully or partially compensating for communication functions lost congenitally or later in life, and improving the user's quality of life.
In today's era of information explosion, people obtain information mainly through the Internet, and the Internet terminal is usually the personal computer at hand. For people with physical disabilities, however, operating a computer is almost an impossible task. To change the situation of these people and give them the ability to operate a computer and reintegrate into society, online multi-modal BCI control systems have been designed that realize control of mouse functions: free movement of the mouse, left-clicking and right-clicking. In two-dimensional cursor control, cursor movement can be realized by controlling the up, down, left and right directions. Although this control mode is simple, the resulting motion is rough, because the cursor can only move in the four fixed directions. In real life the natural motion of a cursor is a smooth trajectory, which requires the subject to control the vertical and horizontal directions simultaneously. Most current BCI control is based on a single modality or on time-shared multiple modalities; there are also systems that combine P300 and motor imagery for control. However, the stability of the P300 signal is poor and its stimulus is strong, which easily affects the feature extraction of the motor imagery signal.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the prior art and to provide a mouse movement control method based on motor imagery and mVEP signal control, in which the mVEP potential signal and motor imagery are used jointly for control, so that the control mode is more flexible; moreover, the mVEP stimulus is mild, so the user can more conveniently perform left- and right-hand motor imagery control while gazing at the mVEP stimulation.
The object of the invention is achieved through the following technical solution: a mouse movement control method based on motor imagery and mVEP signal control, comprising the following steps:
S1, system initialization: the user is connected to a computer brain-computer interface device through electrodes on the scalp, the working interface of the device is opened, and a target and a mouse cursor appear on the working interface;
S2, brain signal acquisition: a mixed signal simultaneously containing mVEP (motion-onset visual evoked potential) and MI (motor imagery) characteristics is collected through the electrodes and transmitted to the computer;
S3, the collected mixed signal is processed to obtain the fused trajectory of MI and mVEP, which is the movement trajectory of the mouse cursor.
Further, step S3 comprises the following sub-steps:
S31, since the mixed signal already contains the features of both kinds of signals, no separation is needed: feature extraction and classification are performed on the MI signal and the mVEP signal directly in the mixed signal, separately for the two modalities;
S32, online control output is generated from the MI signal and the mVEP signal, which specifically comprises the following sub-steps:
S321, the horizontal movement and the vertical movement of the mouse cursor are controlled by MI and mVEP respectively. Let p_x(t) and p_y(t) denote the x-axis and y-axis coordinates of the cursor at time t, and let Δx and Δy denote the increments of the cursor in the horizontal and vertical positions at time t+1; the updated position p_x(t+1), p_y(t+1) of the cursor at time t+1 is then expressed as:
p_x(t+1) = p_x(t) + Δx
p_y(t+1) = p_y(t) + Δy      (1)
This formula shows that the position of the cursor at the next moment depends on its position and increments along the x-axis and y-axis at the current moment; the change of the cursor's direction of motion is therefore determined by Δx and Δy;
S322, let D_H and D_V denote the classification outputs of MI and mVEP at time t respectively. Since both are two-class problems, D_H and D_V take the value 1 or -1. Let v_x and v_y denote the movement speeds in the horizontal and vertical directions at time t (set by the subject according to his or her own adaptability); the position increments Δx and Δy of the ball are then:
Δx = D_H × v_x
Δy = D_V × v_y      (2)
S323, formula (2) is substituted into formula (1) to obtain the fused trajectory of the MI signal and the mVEP signal, which is the movement trajectory of the mouse cursor.
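As an illustration of sub-steps S321 to S323, the following sketch (Python, not part of the patent text; the function name, the sign convention of the classifier outputs and the example speeds are chosen here purely for illustration) implements the position update of formulas (1) and (2):

```python
def update_cursor(px, py, d_h, d_v, vx, vy):
    """One fused MI/mVEP control step, following formulas (1) and (2).

    px, py : cursor coordinates p_x(t), p_y(t) at time t
    d_h    : MI classifier output D_H (+1 or -1; sign convention assumed here)
    d_v    : mVEP classifier output D_V (+1 or -1; sign convention assumed here)
    vx, vy : horizontal and vertical speeds v_x, v_y chosen by the subject
    """
    dx = d_h * vx            # formula (2): horizontal increment from motor imagery
    dy = d_v * vy            # formula (2): vertical increment from mVEP
    return px + dx, py + dy  # formula (1): cursor position at time t+1

# Example: from (0, 0), outputs D_H = D_V = +1 with speeds vx = vy = 5
# move the cursor to (5, 5).
print(update_cursor(0, 0, +1, +1, 5, 5))
```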
The beneficial effects of the invention are as follows: the invention uses the mVEP potential signal and motor imagery jointly for control, so the control mode is more flexible; compared with the combination of P300 and motor imagery, the mVEP signal features are more stable than those of P300; moreover, the mVEP stimulus is mild, so the user can conveniently perform left- and right-hand motor imagery control while gazing at the mVEP stimulation without affecting the extraction of motor imagery signal features, and the method can be widely applied in brain-computer interface systems.
Brief description of the drawings
Fig. 1 is the experiment interface of the verification experiment of the mouse cursor control method of the present invention;
Fig. 2 shows the spectra at electrodes C3 and C4 of one subject imagining left- and right-hand movement in the single-mode and multi-modal tasks;
Fig. 3 shows the scalp topographies of the same subject along the most discriminative CSP filter directions for left- and right-hand motor imagery in the single-mode and multi-modal tasks;
Fig. 4 shows the mVEP amplitude waveforms at the visual-area electrodes of one subject in the multi-modal task and the single-mode task.
Detailed description of the embodiments
The technical solution of the present invention is further described below with reference to the drawings and specific embodiments.
In two-dimensional cursor control, cursor movement can be realized by controlling the up, down, left and right directions. Although this control mode is simple, the resulting motion is rough, because the cursor can only move in the four fixed directions. In real life the natural motion of a cursor is a smooth trajectory, which requires the subject to control the vertical and horizontal directions simultaneously. For this reason we attempt to build a hybrid BCI system using MI and mVEP, the former being the subject's internal imagery and the latter an external visual stimulus.
A mouse movement control method based on motor imagery and mVEP signal control according to the present invention comprises the following steps:
S1, system initialization: the user is connected to a computer brain-computer interface device through electrodes on the scalp, the working interface of the device is opened, and a target and a mouse cursor appear on the working interface;
S2, brain signal acquisition: a mixed signal simultaneously containing mVEP (motion-onset visual evoked potential) and MI (motor imagery) characteristics is collected through the electrodes and transmitted to the computer;
S3, the collected mixed signal is processed to obtain the fused trajectory of MI and mVEP, which is the movement trajectory of the mouse cursor.
Further, step S3 comprises the following sub-steps:
S31, since the mixed signal already contains the features of both kinds of signals, no separation is needed: feature extraction and classification are performed on the MI signal and the mVEP signal directly in the mixed signal, separately for the two modalities;
S32, online control output is generated from the MI signal and the mVEP signal, which specifically comprises the following sub-steps:
S321, the horizontal movement and the vertical movement of the mouse cursor are controlled by MI and mVEP respectively. Let p_x(t) and p_y(t) denote the x-axis and y-axis coordinates of the cursor at time t, and let Δx and Δy denote the increments of the cursor in the horizontal and vertical positions at time t+1; the updated position p_x(t+1), p_y(t+1) of the cursor at time t+1 is then expressed as:
p_x(t+1) = p_x(t) + Δx
p_y(t+1) = p_y(t) + Δy      (1)
This formula shows that the position of the cursor at the next moment depends on its position and increments along the x-axis and y-axis at the current moment; the change of the cursor's direction of motion is therefore determined by Δx and Δy;
S322, let D_H and D_V denote the classification outputs of MI and mVEP at time t respectively. Since both are two-class problems, D_H and D_V take the value 1 or -1. Let v_x and v_y denote the movement speeds in the horizontal and vertical directions at time t (set by the subject according to his or her own adaptability); the position increments Δx and Δy of the ball are then:
Δx = D_H × v_x
Δy = D_V × v_y      (2)
S323, formula (2) is substituted into formula (1) to obtain the fused trajectory of the MI signal and the mVEP signal, which is the movement trajectory of the mouse cursor.
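For sub-step S31, a minimal sketch of how the two branches could be realized is given below (Python with NumPy, SciPy and scikit-learn; an assumption-laden illustration rather than the patented implementation): the MI branch uses CSP spatial filters with log-variance features, in line with the CSP filtering mentioned for Fig. 3, while the mVEP branch averages two consecutive stimulus repetitions, as described later for the online system; the choice of linear discriminant classifiers is an assumption.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(trials_a, trials_b, n_pairs=2):
    """CSP spatial filters for the MI branch.
    trials_a, trials_b: (n_trials, n_channels, n_samples) arrays of band-pass-filtered
    EEG for left-hand and right-hand imagery."""
    cov_a = np.mean([np.cov(t) for t in trials_a], axis=0)
    cov_b = np.mean([np.cov(t) for t in trials_b], axis=0)
    # Generalized eigenvalue problem; the filters at the two ends of the spectrum
    # maximize the variance ratio between the two imagery classes.
    evals, evecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(evals)
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return evecs[:, picks].T                     # shape (2 * n_pairs, n_channels)

def mi_features(trial, W):
    """Normalized log-variance features of the CSP-filtered trial."""
    z = W @ trial
    var = np.var(z, axis=1)
    return np.log(var / var.sum())

def mvep_features(epoch_1, epoch_2, step=10):
    """mVEP branch: superposed average of two stimulus repetitions, then downsampling."""
    avg = (epoch_1 + epoch_2) / 2.0
    return avg[:, ::step].ravel()

# One classifier per modality; the signs of their decisions play the role of
# D_H and D_V in sub-step S322 when the class labels are coded as +1 / -1.
mi_classifier = LinearDiscriminantAnalysis()
mvep_classifier = LinearDiscriminantAnalysis()
```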
The mouse cursor control method of the present invention is verified below by experiments.
The experiment interface is shown in Fig. 1. Four flickering stimulus frames are located at the four corners of the interface; when the user gazes at one of them an mVEP potential is produced, so they are used to control the vertical movement of the ball (the ball represents the mouse cursor). When the user gazes at one of the two flickering frames at the upper corners, the ball moves upward; when the user gazes at a flickering frame at a lower corner, the ball moves downward.
Once the system is activated, the four flickering frames start to flash in random order, and the user gazes at one of them while ignoring the other three so as to control the motion of the ball. One control of the ball's direction of motion is completed after all the frames have flashed once in random order. The current system recognizes the mVEP by averaging over two successive motion controls, i.e. the vertical motion of the ball is output once every two controls.
If the user wants the ball to move upward, he gazes at one of the stimulus frames at the two upper corners of the interface; when the system detects the mVEP signal feature corresponding to the gazed stimulus, the ball moves upward. Likewise, if the user wants the ball to move downward, he gazes at one of the stimulus frames at the two lower corners; when the system detects the corresponding mVEP signal feature, the ball moves downward. Two stimulus frames are used for each of the upward and downward controls in order to avoid the subject having to shift gaze too far when the ball is near the left or right border of the interface.
The horizontal direction of the ball is controlled by motor imagery: when the system recognizes imagined left-hand (right-hand) movement, the ball moves to the left (right). If the user wants the ball to move to the left (right), he only needs to imagine moving his left (right) hand.
In the system, one trial is completed when the ball hits the target (a frame on the interface). In each trial the ball and the target appear at random positions on the interface, and the subject must control MI and mVEP simultaneously to make the ball reach the target. A trial starts when the target and the ball appear. After a 2-second preparation period, the four stimulus frames start to flash in random order, with the line inside each frame sliding from left to right. Each frame flashes for 124 ms with no interval between two adjacent flashes, so the four frames take 496 ms in total; the interval between two adjacent motion-direction controls is 1000 ms. Since the system extracts the mVEP signal by averaging over two consecutive motion controls, the output period of the vertical direction is 1992 ms. Meanwhile, MI recognition processes 6000 ms of EEG data: every 2000 ms, 2000 ms of the 6000 ms window are replaced with new data and a horizontal movement command is issued.
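The timing described above can be organized, for example, with two small helpers (an illustrative sketch; the 250 Hz sampling rate is an assumption, since the patent does not state one): a 6000 ms sliding window refreshed with 2000 ms of new data for the horizontal (MI) command, and an averager that waits for two flicker rounds, giving the 1992 ms vertical output period (496 ms + 1000 ms interval + 496 ms):

```python
import numpy as np

FS = 250                 # assumed EEG sampling rate in Hz (not specified in the patent)
MI_WIN = 6 * FS          # 6000 ms analysis window for motor imagery
MI_STEP = 2 * FS         # refreshed with 2000 ms of new data every 2000 ms

class MISlidingWindow:
    """Keeps the most recent 6000 ms of EEG; push() returns the full window each
    time 2000 ms of fresh data arrive, i.e. once per horizontal command."""
    def __init__(self, n_channels):
        self.buf = np.zeros((n_channels, MI_WIN))

    def push(self, chunk):               # chunk shape: (n_channels, MI_STEP)
        self.buf = np.concatenate([self.buf[:, MI_STEP:], chunk], axis=1)
        return self.buf                  # hand the 6000 ms window to the MI classifier

class MVEPPairAverager:
    """Collects two consecutive flicker-round epochs and returns their average,
    so that a vertical command is produced once every 1992 ms."""
    def __init__(self):
        self._first = None

    def push(self, epoch):               # epoch shape: (n_channels, n_samples)
        if self._first is None:
            self._first = epoch
            return None                  # wait for the second round
        avg = (self._first + epoch) / 2.0
        self._first = None
        return avg                       # hand the averaged epoch to the mVEP classifier
```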
The key to the feasibility of the present invention is to verify that the motor imagery component and the mVEP component of the subject's EEG signal in the multi-modal task are independent of each other. A verification scheme was therefore designed in which a multi-modal combined experiment is compared with single-mode experiments. Seven subjects were selected, and each performed a motor imagery single-mode experiment, an mVEP single-mode experiment and a multi-modal combined experiment. In the multi-modal combined experiment the subject receives the mVEP visual stimulation while performing left- and right-hand motor imagery, so that a mixed signal containing both modalities can be collected; comparing the mixed signal with the motor imagery and mVEP single-mode experiments verifies that the two components are independent in the multi-modal task. Fig. 2 shows the spectra at electrodes C3 and C4 of one subject imagining left- and right-hand movement in the single-mode and multi-modal tasks; panels (a) and (b) are from the single-mode task, and (c) and (d) from the multi-modal task. Fig. 3 shows the scalp topographies of the same subject along the most discriminative CSP filter direction for left- and right-hand motor imagery in the single-mode and multi-modal tasks; panels (a) and (b) are from the single-mode task, and (c) and (d) from the multi-modal task. In Fig. 3(a) the values increase gradually from right to left; in Fig. 3(b), (c) and (d) they increase gradually from left to right. It can be seen from the figures that, whether in the single-mode experiment or in the multi-modal combined experiment, the difference between the left-hand and right-hand tasks is obvious, and the results obtained in the two kinds of experiments are very similar, so the complete single-mode motor imagery signal component can be extracted from the subject's mixed signal of the two modalities.
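A spectral comparison of the kind shown in Fig. 2 can be reproduced in outline as follows (a sketch, assuming the recordings are available as NumPy arrays with a known sampling rate; the channel indices for C3 and C4 depend on the montage and are hypothetical here):

```python
import numpy as np
from scipy.signal import welch

def channel_spectrum(eeg, fs, channel, fmax=40.0):
    """Welch power spectral density of one channel, restricted to 0-fmax Hz.
    eeg: (n_channels, n_samples) array for one condition."""
    f, pxx = welch(eeg[channel], fs=fs, nperseg=int(2 * fs))
    keep = f <= fmax
    return f[keep], pxx[keep]

# Hypothetical usage: compare the C3 spectrum during left-hand imagery in the
# single-mode recording with that in the multi-modal recording (C3_INDEX is
# montage-specific and therefore only a placeholder).
# f, p_single = channel_spectrum(eeg_single_mode, fs=250, channel=C3_INDEX)
# f, p_multi  = channel_spectrum(eeg_multi_modal, fs=250, channel=C3_INDEX)
```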
Cross-validation was performed on the results of the repeated experiments of each subject, and the accuracies were calculated as shown in Table 1 and Table 2.
Table 1: Offline average accuracy obtained by each subject in the single-mode motor imagery experiment
Table 2: Offline average accuracy obtained by each subject in the multi-modal combined experiment
It can be seen that the average accuracies obtained by the subjects in the two kinds of experiments are close, so it can be concluded that the multi-modal combined experiment does not affect the recognition of the subjects' single-mode motor imagery signals.
Fig. 4 shows the mVEP amplitude waveforms at the visual-area electrodes of one subject in the multi-modal task and the single-mode task. It can be seen from the figure that, whether in the single-mode experiment or in the multi-modal combined experiment, every component of the subject's mVEP signal is clearly visible, and the waveforms obtained in the two kinds of experiments are very similar, so the complete single-mode mVEP signal component can be extracted from the subject's mixed signal of the two modalities. Cross-validation was performed on the results of the repeated experiments of each subject, and the accuracies were calculated as shown in Table 3 and Table 4.
Table 3: Offline average accuracy obtained by each subject in the single-mode mVEP experiment
Table 4: Offline average accuracy obtained by each subject in the multi-modal combined experiment
It can be seen that the average accuracies obtained by the subjects in the two kinds of experiments are close, so it can be concluded that the multi-modal combined experiment does not affect the recognition of the subjects' single-mode mVEP signals.
Finally, the most direct and effective way to verify the validity of the present invention is the online control experiment. Each of the seven subjects performed four groups of online experiments. In the online experiment, each subject used our multi-modal online system to control a ball that appears at random at one corner of the control interface so that it hits a target square located at the center of the interface; hitting the target completes one trial. After each subject finished the experiment, we counted the number of steps the ball travelled in each trial and the final online control accuracy of each subject, as shown in the following table:
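The per-subject statistics mentioned above can be computed with a few lines (an illustrative sketch; the exact definition of the reported control accuracy is not given here, so a simple hit rate is assumed):

```python
def summarize_online_runs(trials):
    """trials: list of (hit, steps) pairs, one per online trial of a subject.
    Returns the control accuracy (hit rate) and the mean number of steps per trial."""
    hits = sum(1 for hit, _ in trials if hit)
    mean_steps = sum(steps for _, steps in trials) / len(trials)
    return hits / len(trials), mean_steps

# Hypothetical example with four trials:
print(summarize_online_runs([(True, 12), (True, 9), (False, 20), (True, 11)]))
```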
On the whole, both the offline analysis and the online experiment results demonstrate the validity and feasibility of the present invention.
Those of ordinary skill in the art will appreciate that the embodiments described here are intended to help the reader understand the principles of the present invention, and it should be understood that the protection scope of the invention is not limited to these specific statements and embodiments. Those of ordinary skill in the art can make various other specific variations and combinations that do not depart from the essence of the invention according to the technical teachings disclosed herein, and such variations and combinations remain within the protection scope of the invention.

Claims (2)

1. A mouse movement control method based on motor imagery and mVEP signal control, characterized in that it comprises the following steps:
S1, system initialization: the user is connected to a computer brain-computer interface device through electrodes on the scalp, the working interface of the device is opened, and a target and a mouse cursor appear on the working interface;
S2, brain signal acquisition: a mixed signal simultaneously containing mVEP and MI characteristics is collected through the electrodes and transmitted to the computer;
S3, the collected mixed signal is processed to obtain the fused trajectory of MI and mVEP, which is the movement trajectory of the mouse cursor.
2. The mouse movement control method based on motor imagery and mVEP signal control according to claim 1, characterized in that step S3 comprises the following sub-steps:
S31, feature extraction and classification are performed on the MI signal and the mVEP signal separately for the two modalities;
S32, online control output is generated from the MI signal and the mVEP signal, which specifically comprises the following sub-steps:
S321, the horizontal movement and the vertical movement of the mouse cursor are controlled by MI and mVEP respectively. Let p_x(t) and p_y(t) denote the x-axis and y-axis coordinates of the cursor at time t, and let Δx and Δy denote the increments of the cursor in the horizontal and vertical positions at time t+1; the updated position p_x(t+1), p_y(t+1) of the cursor at time t+1 is then expressed as:
p_x(t+1) = p_x(t) + Δx
p_y(t+1) = p_y(t) + Δy      (1)
This formula shows that the position of the cursor at the next moment depends on its position and increments along the x-axis and y-axis at the current moment; the change of the cursor's direction of motion is therefore determined by Δx and Δy;
S322, let D_H and D_V denote the classification outputs of MI and mVEP at time t respectively, D_H and D_V taking the value 1 or -1, and let v_x and v_y denote the movement speeds in the horizontal and vertical directions at time t; the position increments Δx and Δy of the ball are then:
Δx = D_H × v_x
Δy = D_V × v_y      (2)
S323, formula (2) is substituted into formula (1) to obtain the fused trajectory of the MI signal and the mVEP signal, which is the movement trajectory of the mouse cursor.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510589210.3A CN105183167A (en) 2015-09-16 2015-09-16 Mouse movement control method based on movement imagery and mVEP signal control


Publications (1)

Publication Number Publication Date
CN105183167A true CN105183167A (en) 2015-12-23

Family

ID=54905293


Country Status (1)

Country Link
CN (1) CN105183167A (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102866775A (en) * 2012-09-04 2013-01-09 同济大学 System and method for controlling brain computer interface (BCI) based on multimode fusion
CN103699217A (en) * 2013-11-18 2014-04-02 南昌大学 Two-dimensional cursor motion control system and method based on motor imagery and steady-state visual evoked potential

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘铁军 (Liu Tiejun) et al.: "基于运动想象的脑机接口关键技术研究" (Research on key technologies of motor-imagery-based brain-computer interfaces), 《中国生物医学工程学报》 (Chinese Journal of Biomedical Engineering) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112244774A (en) * 2020-10-19 2021-01-22 西安臻泰智能科技有限公司 Brain-computer interface rehabilitation training system and method


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2015-12-23