CN108415554A - P300-based brain-controlled mobile robot system and implementation method thereof - Google Patents
P300-based brain-controlled mobile robot system and implementation method thereof Download PDF Info
- Publication number
- CN108415554A CN108415554A CN201810048019.1A CN201810048019A CN108415554A CN 108415554 A CN108415554 A CN 108415554A CN 201810048019 A CN201810048019 A CN 201810048019A CN 108415554 A CN108415554 A CN 108415554A
- Authority
- CN
- China
- Prior art keywords
- sub
- module
- brain
- robot
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Neurosurgery (AREA)
- General Health & Medical Sciences (AREA)
- Neurology (AREA)
- Health & Medical Sciences (AREA)
- Dermatology (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
Abstract
The invention belongs to the technical fields of brain-computer interfaces and robot control, and provides a P300-based brain-controlled mobile robot system and an implementation method thereof. The system comprises a visual stimulation module and, connected to it in sequence, the subject, an electrode-cap module, a Neuroscan EEG acquisition module, a signal processing module, a control interface module, and a Pioneer3-DX robot motion module; the Pioneer3-DX robot motion module is also connected to the visual stimulation module, and the control interface module is also connected to a Pioneer3-DX robot environment detection module. The invention uses graphical symbols as a new stimulus type and, through combined improvements to the brain-computer interface attributes, induces P300 components of higher amplitude and raises the system's transfer rate. It further combines brain-computer interface technology with automatic control technology, using an asynchronous control mode to realize interactive sharing between brain control and the robot's autonomous control, so that the robot can advance, retreat, turn left, turn right, or remain stationary, making the system more stable and responsive.
Description
Technical field
The present invention relates to a P300-based brain-controlled mobile robot system and an implementation method thereof, and belongs to the technical fields of brain-computer interfaces and robot control.
Background technology
A brain-computer interface (Brain-Computer Interface, BCI) is a non-muscular communication channel established between the brain and external devices, realizing the exchange of intentions between the brain and the external environment. As a new mode of human-computer interaction, BCI offers a new approach to controlling devices by thought alone, has become a research hotspot in intelligent robotics, and builds a bridge between the biological intelligence of the human brain and artificial intelligence. The fusion of BCI technology with automatic robot control has produced a new technology: the brain-controlled robot. In a brain-controlled system the BCI control signal is chosen according to the application scenario; among these, BCI systems based on the event-related potential P300 require little training and are widely applicable, making them one of the important topics in BCI research. However, because the signal-to-noise ratio of the P300 is relatively low, many repetitions must be averaged before a clear and stable waveform is obtained, which limits the system's transmission rate. Moreover, the P300 is an evoked potential with a certain latency, so current P300-based BCI systems concentrate mainly on discrete, synchronously controlled selection tasks such as spelling. The low signal-to-noise problem is mainly addressed along two directions, improving the signal processing algorithms and improving the experimental paradigm, that is, optimizing the feature extraction and classification algorithms on the one hand and strengthening the P300 component on the other. The present invention focuses on improving the experimental paradigm. The classical P300 paradigm was proposed by Farwell and Donchin: all symbols are grouped by a row/column scanning method and presented at random, and the experimental results are prone to adjacency interference, double-flash problems, and repetition blindness. Guan et al. proposed a single-flash paradigm which alleviates these phenomena to some extent, but its long stimulus sequences easily fatigue the subject, so it is better suited to small symbol matrices. Later researchers used region flashing (similar to two single flashes) to improve accuracy. Besides the flashing mode, differences in stimulus attributes also affect the experimental results. Research shows that the brain's arousal response is highest to yellow-green and green, and that blue-green color transitions are more comfortable for subjects to perceive; Takano changed the symbol background of the traditional paradigm to blue with green flashes and improved experimental accuracy. In addition, many scholars have studied the influence of stimulus type on the system: replacing symbol stimuli with face stimuli induces, in addition to the P300, other event-related potentials, offering more possibilities for BCI systems. For the latency problem of the P300, recent years have seen much research on hybrid brain-computer interfaces, such as P300+SSVEP and P300+Mu/Beta interfaces; although these realize continuous asynchronous control, the technology is not yet mature and increases the complexity of the system.
Invention content
To overcome the deficiencies of the prior art, the object of the present invention is to provide a P300-based brain-controlled robot system and an implementation method thereof. Through combined improvements to the stimulus attributes, the invention designs a P300 experimental paradigm based on a 5-oddball scheme; the paradigm interface is simple and easy to learn and use, induces a stronger P300 component, effectively reduces the number of averaging repetitions, and increases the system's transmission rate. In addition, according to the characteristics of brain-controlled robot systems, the system combines brain control with the robot's autonomous control, realizing interactive sharing of the two kinds of control commands under asynchronous control; compared with hybrid brain-computer interfaces, this asynchronous mode reduces system complexity and favors stable, fast control of the robot.
To achieve the above object and solve the problems existing in the prior art, the invention adopts the following technical solution: a P300-based brain-controlled robot system comprising a visual stimulation module and, connected to it in sequence, the subject, an electrode-cap module, a Neuroscan EEG acquisition module, a signal processing module, a control interface module, and a Pioneer3-DX robot motion module; the Pioneer3-DX robot motion module is also connected to the visual stimulation module, and the control interface module is also connected to a Pioneer3-DX robot environment detection module. The visual stimulation module presents visual stimuli to the subject to evoke the P300 EEG potential, and comprises a 5-oddball-paradigm visual stimulation interface construction module together with the parameter setting module, stimulus presentation module, and stimulus design module connected to it. The signal processing module converts the acquired EEG signals into control instructions, and comprises a preprocessing module and the feature extraction module and feature classification module connected to it in sequence. The control interface module realizes the interactive sharing of brain control and the robot's autonomous control, and comprises a shared control module together with the brain control command module and environment information module connected to it.
An implementation method of the P300-based brain-controlled robot system comprises the following steps:
Step A: system initialization. Initialize the Neuroscan EEG acquisition module, the visual stimulation module, and the Pioneer3-DX robot motion module through the following sub-steps:
Sub-step A1: initialize the Neuroscan EEG acquisition module. Set the P300 recording electrodes to Fz, C3, Cz, C4, Pz, and Oz, referenced to the average of the left and right mastoid electrodes A1 and A2; keep the impedance of each electrode below 5 kΩ and set the sampling frequency to 250 Hz.
Sub-step A2: initialize the visual stimulation module using the 5-oddball experimental paradigm. Graphical symbols are chosen as the stimulus type: with 〇 at the center, the symbols ↑, ↓, ←, → are arranged directly above, below, to the left of, and to the right of it. The interface is divided into three parts from top to bottom: the top shows the graphical symbol the experimenter asks the subject to attend to (if the experimenter imposes no requirement, this position is left blank); the middle feeds back the graphical symbol recognized at the current moment; and the bottom is the graphical-symbol stimulation interface. The overall background of the visual stimulus is white, the graphical symbols are black, the flash color is green, and the feedback prompt is yellow. Stimuli are presented in single-flash mode with a flash duration of 100 ms and a stimulus interval of 125 ms. The visual stimulation interface is then placed in the upper-left corner of the screen.
Sub-step A3: initialize the Pioneer3-DX robot motion module. Open the MobileSim software, create a virtual robot, load the constructed map from its path, set the robot's initial coordinate position, forward linear velocity, and turning angular velocity, and then place the map interface in the upper-right corner of the screen.
Sub-step A4: check the subject's mental state and remind the subject to stay attentive; start the system experiment and proceed to step B.
Step B: EEG signal acquisition. Use the Neuroscan EEG acquisition module to collect EEG data at the Fz, C3, Cz, C4, Pz, and Oz electrodes. Acquisition is divided into a training stage and a test stage, through the following sub-steps:
Sub-step B1: acquire EEG training data through the following sub-steps:
Sub-step B11: during acquisition the subject gazes at the visual stimulation interface in the upper-left corner of the screen; the top of the interface successively displays the graphical symbol the experimenter asks the subject to attend to.
Sub-step B12: after the experiment starts, the subject has 2 s to settle, and then stimulation begins; the 5 stimuli each flash once in random order, which is called one trial, and the time interval between trials is 500 ms.
Sub-step B13: following the graphical symbol displayed at the top of the interface, the subject silently counts its flashes; after the feedback signal appears in the middle of the interface, the subject shifts attention to the next symbol. The training data comprise 500 trials in total; then proceed to sub-step C1.
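The trial structure of sub-steps B12 and B13 (five stimuli, each flashing once per trial in random order, with the timing given in sub-step A2) can be sketched as follows; the function and symbol names are illustrative, not from the patent:

```python
import random

STIMULI = ("circle", "up", "down", "left", "right")  # 〇, ↑, ↓, ←, →
FLASH_MS, ISI_MS, INTER_TRIAL_MS = 100, 125, 500     # timing from sub-steps A2 and B12

def make_trial_sequence(n_trials):
    """Each trial flashes all five stimuli once in random order (single-flash 5-oddball)."""
    sequence = []
    for _ in range(n_trials):
        order = list(STIMULI)
        random.shuffle(order)
        sequence.append(order)
    return sequence

def trial_duration_ms():
    """Duration of one trial: five flash+interval slots plus the inter-trial gap."""
    return len(STIMULI) * (FLASH_MS + ISI_MS) + INTER_TRIAL_MS
```

With these parameters one trial lasts 1625 ms, so the 500 training trials amount to roughly 13.5 minutes of stimulation time.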
Sub-step B2: acquire EEG test data through the following sub-steps:
Sub-step B21: during acquisition the subject gazes at the visual stimulation interface in the upper-left corner of the screen; the top of the interface does not display any graphical symbol. The subject first observes the map and robot presented in the upper-right corner of the screen and decides the direction in which the robot should move.
Sub-step B22: same as sub-step B12.
Sub-step B23: having obtained the desired direction of motion from the observed map and robot, the subject silently counts the flashes of the corresponding graphical symbol, where ← means turn left 30° at constant speed, → means turn right 30° at constant speed, ↑ means advance 0.5 m at constant speed, ↓ means retreat 0.5 m at constant speed, and 〇 means remain stationary; the subject can adjust which symbol to attend to according to the feedback signal shown in the middle of the interface. The test data comprise 400 trials; then proceed to sub-step C2.
Step C: signal processing. Preprocess the EEG signals, then perform feature extraction and classification, through the following sub-steps:
Sub-step C1: feature extraction and classifier training; build the classifier model from the training data through the following sub-steps:
Sub-step C11: preprocess the signals. The EEG data acquired from the six electrodes Fz, C3, Cz, C4, Pz, and Oz are passed to the signal processing module in an R × S data format, where R = 6 is the number of electrodes and S is the number of samples; the data are then preprocessed by an IIR filter with a cutoff frequency of 0.1 Hz followed by an FIR filter with a cutoff frequency of 10 Hz.
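A minimal sketch of this preprocessing chain in Python with SciPy, assuming a second-order Butterworth IIR high-pass and a 51-tap FIR low-pass (the patent specifies only the cutoff frequencies, so the filter orders and the zero-phase filtering are assumptions):

```python
import numpy as np
from scipy import signal

FS = 250  # sampling frequency from sub-step A1, Hz

def preprocess(eeg):
    """eeg: array of shape (6 electrodes, S samples), the R x S format of sub-step C11.
    High-pass IIR at 0.1 Hz, then low-pass FIR at 10 Hz."""
    b_hp, a_hp = signal.butter(2, 0.1, btype="highpass", fs=FS)  # assumed order 2
    eeg = signal.filtfilt(b_hp, a_hp, eeg, axis=1)
    b_lp = signal.firwin(numtaps=51, cutoff=10.0, fs=FS)         # assumed 51 taps
    return signal.filtfilt(b_lp, [1.0], eeg, axis=1)
```

The 0.1-10 Hz band keeps the slow P300 deflection while removing drift and high-frequency noise.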
Sub-step C12: extract features from the signals. Taking the onset of each single stimulus as time 0 ms, the EEG is segmented into epochs with a time-window length of 600 ms, of which the first 100 ms is the baseline. The epochs of the same stimulus are averaged over 5 repetitions and the data are then down-sampled to 25 Hz (15 points in total), yielding a new epoch of 6 × 15 dimensions; the epochs are concatenated in electrode order, so the feature vector x extracted for a single stimulus has dimensions 90 × 1.
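Sub-step C12 can be sketched as follows, assuming the 100 ms baseline is subtracted and that down-sampling from 250 Hz to 25 Hz takes every 10th sample (the patent does not state either detail):

```python
import numpy as np

def extract_feature(epochs):
    """epochs: (5 repetitions, 6 electrodes, 150 samples), i.e. the five single-flash
    epochs of one stimulus, each 600 ms long at 250 Hz (first 100 ms = baseline).
    Returns the 90 x 1 feature vector x of sub-step C12."""
    avg = epochs.mean(axis=0)                           # average the 5 repetitions -> (6, 150)
    baseline = avg[:, :25].mean(axis=1, keepdims=True)  # first 100 ms = 25 samples
    avg = avg - baseline                                # baseline correction (assumed)
    down = avg[:, ::10]                                 # 250 Hz -> 25 Hz, 15 points -> (6, 15)
    return down.reshape(-1, 1)                          # concatenate electrodes -> (90, 1)
```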
Sub-step C13: train the classifier. The chosen classifier is a Fisher linear discriminant analysis classifier, whose discriminant function g(x) is described by formula (1):
g(x) = wᵀx    (1)
where w is the weight vector and x is the feature vector. The training samples are X = [x1, x2, ..., xN] with sample size N = 500, of which the target stimuli account for 100 samples with class label 1 and the non-target stimuli for 400 samples with class label 0; the optimal value w* of w is obtained by the Fisher linear discriminant classifier.
The elicitation of the P300 depends on expectation rather than on the stimulus itself, so P300 recognition in the brain-computer interface means the classifier divides its input into two classes, the target-stimulus class labelled 1 and the non-target-stimulus class labelled 0, classifying according to formula (2), where label(x) is the classifier output function.
In a practical P300 brain-computer interface the data input is a set of n vectors xi (i = 1, ..., n), where n is the number of stimulus types; here there are 5 stimuli (〇, ↑, ↓, ←, →), i.e. n = 5, and classification follows formula (3). If the classifier identifies exactly 1 target-stimulus class and n-1 non-target-stimulus classes, it outputs an instruction; if it identifies more than 1 target-stimulus class, it outputs no instruction. Then proceed to sub-step B2.
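The training and decision rule of sub-step C13 and formulas (1)-(3) can be sketched as follows. The Fisher weight w* = Sw⁻¹(m1 − m0) and the midpoint threshold are standard textbook choices; the patent does not state its threshold, so it is an assumption here:

```python
import numpy as np

def train_fisher_lda(X_target, X_nontarget):
    """Fisher linear discriminant on 90-dimensional feature vectors.
    X_target: (100, 90) target-stimulus samples; X_nontarget: (400, 90)."""
    m1, m0 = X_target.mean(axis=0), X_nontarget.mean(axis=0)
    # within-class scatter Sw: sum of the two class scatter matrices
    Sw = (np.cov(X_target, rowvar=False) * (len(X_target) - 1)
          + np.cov(X_nontarget, rowvar=False) * (len(X_nontarget) - 1))
    w = np.linalg.pinv(Sw) @ (m1 - m0)      # w* = Sw^-1 (m1 - m0)
    threshold = float(w @ (m1 + m0)) / 2.0  # midpoint threshold (assumed)
    return w, threshold

def decide(w, threshold, features):
    """features: the 5 vectors x_i of formula (3). Per formula (2), label(x) = 1
    when g(x) = w.T x exceeds the threshold, else 0. A command is output only
    when exactly one stimulus is labelled target; otherwise no instruction."""
    labels = [1 if float(w @ x) > threshold else 0 for x in features]
    return labels.index(1) if labels.count(1) == 1 else None
```

`decide` returns the index of the selected stimulus (0-4, i.e. one of 〇, ↑, ↓, ←, →), or None when no instruction is output.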
Sub-step C2: online P300 detection through the following sub-steps:
Sub-step C21: same as sub-step C11.
Sub-step C22: same as sub-step C12.
Sub-step C23: apply the trained classifier in the online system. After 5 rounds of stimulus flashes, the feature vectors of the 5 stimuli form the classifier input, and the classifier outputs the control command corresponding to the 5 stimuli according to formulas (1) and (3); then proceed to step D.
Step D: realize shared control, using the Pioneer3-DX robot from ActivMedia Robotics (USA); data are transmitted between the brain control commands and the robot via the TCP/IP protocol, through the following sub-steps:
Sub-step D1: the system judges whether brain control command information is present; if so, proceed to sub-step D2, otherwise to sub-step D3.
Sub-step D2: first check the brain control command against the environment information, i.e., when the detected distance between an obstacle and the robot is less than 0.5 m, determine whether the robot has still received a brain control command that approaches the obstacle. If so, proceed to sub-step D3; otherwise enter brain-control command mode and execute the corresponding action for the duration of the command: when the command is ↑ or ↓ the robot advances or retreats 0.5 m at constant speed; when it is ← or → the robot turns left or right 30° at constant speed; when it is 〇 the robot remains stationary. The system then judges whether the control command has finished; if so, return to sub-step D1, otherwise wait for the command to finish.
Sub-step D3: enter the autonomous control mode. Environment information is acquired by the robot's laser sensor, the linear velocity and turning angular velocity of the robot's motion are computed by a fuzzy discrete-event-system method, and the control command is output; the system judges whether the command has finished executing, returning to sub-step D1 if so and waiting for it to finish otherwise.
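The arbitration of sub-steps D1-D3 amounts to a small decision function. The sketch below is hypothetical: it reduces the "command approaching an obstacle" test to the forward command only, and the real system sends commands to the robot over TCP/IP rather than returning strings:

```python
SAFE_DISTANCE_M = 0.5  # obstacle threshold from sub-step D2

# mapping from graphical symbol to motion, per sub-step D2
COMMANDS = {
    "up": "advance 0.5 m",
    "down": "retreat 0.5 m",
    "left": "turn left 30 deg",
    "right": "turn right 30 deg",
    "circle": "stay",
}

def arbitrate(brain_cmd, obstacle_distance_m):
    """One pass of shared control: execute the brain command unless it is absent
    or would drive the robot toward a near obstacle, in which case the
    autonomous (fuzzy discrete-event-system) controller takes over."""
    if brain_cmd is None:                       # D1: no brain command -> D3
        return "autonomous"
    approaching = brain_cmd == "up"             # simplification: only forward approaches
    if obstacle_distance_m < SAFE_DISTANCE_M and approaching:
        return "autonomous"                     # D2 safety check fails -> D3
    return COMMANDS[brain_cmd]                  # D2: execute the brain command
```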
The present invention has the following beneficial effects. The P300-based brain-controlled robot system and its implementation method use graphical symbols, unlike traditional letter symbols, to form a new experimental paradigm with 5 choices whose meanings are clearer and more broadly applicable for a brain-controlled robot system. In addition, the stimulus color combination of a white background with the green to which the public is more sensitive strengthens the induction of larger-amplitude P300 components. The single-flash brightness enhancement avoids the influence of the "double-flash effect" and "adjacency interference" problems. The combined improvements to the BCI system attributes raise its transmission rate, and combining an asynchronous BCI with robot-shared brain control technology achieves faster and more accurate brain-controlled robot operation.
Description of the drawings
Fig. 1 is a functional block diagram of the system of the present invention.
Fig. 2 is a functional block diagram of the visual stimulation module of the present invention.
Fig. 3 is a functional block diagram of the signal processing module of the present invention.
Fig. 4 is a functional block diagram of the control interface module of the present invention.
Fig. 5 is a flow chart of the steps of the method of the present invention.
Fig. 6 is a diagram of the visual stimulation module interface of the present invention.
Specific implementation mode
The invention will be further described below in conjunction with the accompanying drawings:
As shown in Figs. 1, 2, 3, and 4, a P300-based brain-controlled robot system comprises a visual stimulation module and, connected to it in sequence, the subject, an electrode-cap module, a Neuroscan EEG acquisition module, a signal processing module, a control interface module, and a Pioneer3-DX robot motion module; the Pioneer3-DX robot motion module is also connected to the visual stimulation module, and the control interface module is also connected to a Pioneer3-DX robot environment detection module. The visual stimulation module presents visual stimuli to the subject to evoke the P300 EEG potential, and comprises a 5-oddball-paradigm visual stimulation interface construction module together with the parameter setting module, stimulus presentation module, and stimulus design module connected to it. The signal processing module converts the acquired EEG signals into control instructions, and comprises a preprocessing module and the feature extraction module and feature classification module connected to it in sequence. The control interface module realizes the interactive sharing of brain control and the robot's autonomous control, and comprises a shared control module together with the brain control command module and environment information module connected to it.
As shown in Fig. 5, an implementation method of the P300-based brain-controlled robot system comprises the following steps:
Step A: system initialization. Initialize the Neuroscan EEG acquisition module, the visual stimulation module, and the Pioneer3-DX robot motion module through the following sub-steps:
Sub-step A1: initialize the Neuroscan EEG acquisition module. Set the P300 recording electrodes to Fz, C3, Cz, C4, Pz, and Oz, referenced to the average of the left and right mastoid electrodes A1 and A2; keep the impedance of each electrode below 5 kΩ and set the sampling frequency to 250 Hz.
Sub-step A2: initialize the visual stimulation module using the 5-oddball experimental paradigm, as shown in Fig. 6. Graphical symbols are chosen as the stimulus type: with 〇 at the center, the symbols ↑, ↓, ←, → are arranged directly above, below, to the left of, and to the right of it. The interface is divided into three parts from top to bottom: the top shows the graphical symbol the experimenter asks the subject to attend to (if the experimenter imposes no requirement, this position is left blank); the middle feeds back the graphical symbol recognized at the current moment; and the bottom is the graphical-symbol stimulation interface. The overall background of the visual stimulus is white, the graphical symbols are black, the flash color is green, and the feedback prompt is yellow. Stimuli are presented in single-flash mode with a flash duration of 100 ms and a stimulus interval of 125 ms. The visual stimulation interface is then placed in the upper-left corner of the screen.
Sub-step A3: initialize the Pioneer3-DX robot motion module. Open the MobileSim software, create a virtual robot, load the constructed map from its path, set the robot's initial coordinate position, forward linear velocity, and turning angular velocity, and then place the map interface in the upper-right corner of the screen.
Sub-step A4: check the subject's mental state and remind the subject to stay attentive; start the system experiment and proceed to step B.
Step B: EEG signal acquisition. Use the Neuroscan EEG acquisition module to collect EEG data at the Fz, C3, Cz, C4, Pz, and Oz electrodes. Acquisition is divided into a training stage and a test stage, through the following sub-steps:
Sub-step B1: acquire EEG training data through the following sub-steps:
Sub-step B11: during acquisition the subject gazes at the visual stimulation interface in the upper-left corner of the screen; the top of the interface successively displays the graphical symbol the experimenter asks the subject to attend to.
Sub-step B12: after the experiment starts, the subject has 2 s to settle, and then stimulation begins; the 5 stimuli each flash once in random order, which is called one trial, and the time interval between trials is 500 ms.
Sub-step B13: following the graphical symbol displayed at the top of the interface, the subject silently counts its flashes; after the feedback signal appears in the middle of the interface, the subject shifts attention to the next symbol. The training data comprise 500 trials in total; then proceed to sub-step C1.
Sub-step B2: acquire EEG test data through the following sub-steps:
Sub-step B21: during acquisition the subject gazes at the visual stimulation interface in the upper-left corner of the screen; the top of the interface does not display any graphical symbol. The subject first observes the map and robot presented in the upper-right corner of the screen and decides the direction in which the robot should move.
Sub-step B22: same as sub-step B12.
Sub-step B23: having obtained the desired direction of motion from the observed map and robot, the subject silently counts the flashes of the corresponding graphical symbol, where ← means turn left 30° at constant speed, → means turn right 30° at constant speed, ↑ means advance 0.5 m at constant speed, ↓ means retreat 0.5 m at constant speed, and 〇 means remain stationary; the subject can adjust which symbol to attend to according to the feedback signal shown in the middle of the interface. The test data comprise 400 trials; then proceed to sub-step C2.
Step C: signal processing. Preprocess the EEG signals, then perform feature extraction and classification, through the following sub-steps:
Sub-step C1: feature extraction and classifier training; build the classifier model from the training data through the following sub-steps:
Sub-step C11: preprocess the signals. The EEG data acquired from the six electrodes Fz, C3, Cz, C4, Pz, and Oz are passed to the signal processing module in an R × S data format, where R = 6 is the number of electrodes and S is the number of samples; the data are then preprocessed by an IIR filter with a cutoff frequency of 0.1 Hz followed by an FIR filter with a cutoff frequency of 10 Hz.
Sub-step C12: extract features from the signals. Taking the onset of each single stimulus as time 0 ms, the EEG is segmented into epochs with a time-window length of 600 ms, of which the first 100 ms is the baseline. The epochs of the same stimulus are averaged over 5 repetitions and the data are then down-sampled to 25 Hz (15 points in total), yielding a new epoch of 6 × 15 dimensions; the epochs are concatenated in electrode order, so the feature vector x extracted for a single stimulus has dimensions 90 × 1.
Sub-step C13: train the classifier. The chosen classifier is a Fisher linear discriminant analysis classifier, whose discriminant function g(x) is described by formula (1):
g(x) = wᵀx    (1)
where w is the weight vector and x is the feature vector. The training samples are X = [x1, x2, ..., xN] with sample size N = 500, of which the target stimuli account for 100 samples with class label 1 and the non-target stimuli for 400 samples with class label 0; the optimal value w* of w is obtained by the Fisher linear discriminant classifier.
The elicitation of the P300 depends on expectation rather than on the stimulus itself, so P300 recognition in the brain-computer interface means the classifier divides its input into two classes, the target-stimulus class labelled 1 and the non-target-stimulus class labelled 0, classifying according to formula (2), where label(x) is the classifier output function.
In a practical P300 brain-computer interface the data input is a set of n vectors xi (i = 1, ..., n), where n is the number of stimulus types; here there are 5 stimuli, i.e. n = 5, and classification follows formula (3). If the classifier identifies exactly 1 target-stimulus class and n-1 non-target-stimulus classes, it outputs an instruction; if it identifies more than 1 target-stimulus class, it outputs no instruction. Then proceed to sub-step B2.
Sub-step C2: online P300 detection through the following sub-steps:
Sub-step C21: same as sub-step C11.
Sub-step C22: same as sub-step C12.
Sub-step C23: apply the trained classifier in the online system. After 5 rounds of stimulus flashes, the feature vectors of the 5 stimuli form the classifier input, and the classifier outputs the control command corresponding to the 5 stimuli according to formulas (1) and (3); then proceed to step D.
Step D: realize shared control, using the Pioneer3-DX robot from ActivMedia Robotics (USA); data are transmitted between the brain control commands and the robot via the TCP/IP protocol, through the following sub-steps:
Sub-step D1: the system judges whether brain control command information is present; if so, proceed to sub-step D2, otherwise to sub-step D3.
Sub-step D2: first check the brain control command against the environment information, i.e., when the detected distance between an obstacle and the robot is less than 0.5 m, determine whether the robot has still received a brain control command that approaches the obstacle. If so, proceed to sub-step D3; otherwise enter brain-control command mode and execute the corresponding action for the duration of the command: when the command is ↑ or ↓ the robot advances or retreats 0.5 m at constant speed; when it is ← or → the robot turns left or right 30° at constant speed; when it is 〇 the robot remains stationary. The system then judges whether the control command has finished; if so, return to sub-step D1, otherwise wait for the command to finish.
Sub-step D3: enter the autonomous control mode. Environment information is acquired by the robot's laser sensor, the linear velocity and turning angular velocity of the robot's motion are computed by a fuzzy discrete-event-system method, and the control command is output; the system judges whether the command has finished executing, returning to sub-step D1 if so and waiting for it to finish otherwise.
Claims (2)
1. A P300-based brain-controlled robot system, comprising a visual stimulation module and, connected to it in sequence, the subject, an electrode-cap module, a Neuroscan EEG acquisition module, a signal processing module, a control interface module, and a Pioneer3-DX robot motion module, the Pioneer3-DX robot motion module also being connected to the visual stimulation module, and the control interface module also being connected to a Pioneer3-DX robot environment detection module, characterized in that: the visual stimulation module presents visual stimuli to the subject to evoke the P300 EEG potential, and comprises a 5-oddball-paradigm visual stimulation interface construction module together with the parameter setting module, stimulus presentation module, and stimulus design module respectively connected to it; the signal processing module converts the acquired EEG signals into control instructions, and comprises a preprocessing module and the feature extraction module and feature classification module connected to it in sequence; and the control interface module realizes the interactive sharing of brain control and the robot's autonomous control, and comprises a shared control module together with the brain control command module and environment information module respectively connected to it.
2. An implementation method of the P300-based brain-controlled mobile robot system according to claim 1, characterized by comprising the following steps:
Step A: system initialization. The Neuroscan EEG acquisition module, the visual stimulation module and the Pioneer3-DX robot motion module are initialized, specifically comprising the following sub-steps:
Sub-step A1: the Neuroscan EEG acquisition module is initialized; the P300 recording electrodes are set to Fz, C3, Cz, C4, Pz and Oz, with the average of the left and right mastoid electrodes A1 and A2 as reference; the impedance of each electrode is kept below 5 kΩ, and the sampling frequency is set to 250 Hz.
Sub-step A2: the visual stimulation module is initialized using the 5-oddball experimental paradigm; graphical symbols are chosen as the stimulus type, with 〇 at the center and ↑, ↓, ←, → arranged directly above, directly below, directly left and directly right of it. The interface is divided into three parts from top to bottom: the top displays the graphical symbol that the experimenter requires the subject to attend to (shown empty if the experimenter imposes no requirement); the middle feeds back the graphical symbol recognized at the current moment; the bottom is the graphical-symbol stimulation interface. The overall background of the visual stimulus is white, the graphical symbols are black, the flash color is green and the feedback prompt is yellow; the presentation mode is single flash, with a flash duration of 100 ms and a stimulus interval of 125 ms. The visual stimulation interface is then placed in the upper-left corner of the screen.
Sub-step A3: the Pioneer3-DX robot motion module is initialized; the MobileSim software is opened, a virtual robot is created, the drawn map is loaded from its path, the robot's initial coordinate position, forward linear velocity and steering angular velocity are set, and the map interface is placed in the upper-right corner of the screen.
Sub-step A4: the subject's mental state is checked and the subject is reminded to stay attentive; the system experiment is started and step B is entered.
Step B: EEG signal acquisition. EEG data are acquired by the Neuroscan EEG acquisition module at electrodes Fz, C3, Cz, C4, Pz and Oz; acquisition is divided into a training stage and a test stage, specifically comprising the following sub-steps:
Sub-step B1: EEG training data are acquired, specifically comprising the following sub-steps:
Sub-step B11: during acquisition the subject gazes at the visual stimulation interface presented in the upper-left corner of the screen; the top of the interface displays in turn the graphical symbols that the experimenter requires the subject to attend to.
Sub-step B12: after the experiment starts the subject has an adjustment time of 2 s, then stimulation begins; the 5 stimuli each flash once in random order, which constitutes 1 trial, and the interval between successive trials is 500 ms.
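The trial structure of sub-steps A2 and B12 can be sketched as a flash schedule. The symbol names are hypothetical stand-ins for 〇, ↑, ↓, ←, →, and treating the 125 ms stimulus interval as the gap between flash offset and the next onset is an interpretation of the stated timings, not a detail confirmed by the patent.

```python
import random

SYMBOLS = ["circle", "up", "down", "left", "right"]  # stands for 〇 ↑ ↓ ← →
FLASH_MS, INTERVAL_MS, INTER_TRIAL_MS = 100, 125, 500  # from sub-steps A2/B12

def trial_schedule(n_trials, seed=0):
    """Return (onset_ms, symbol, duration_ms) events: in each trial the five
    symbols flash exactly once in random order, with a 500 ms gap between
    trials."""
    rng = random.Random(seed)
    events, t = [], 0
    for _ in range(n_trials):
        order = SYMBOLS[:]
        rng.shuffle(order)          # single-flash mode, random order per trial
        for sym in order:
            events.append((t, sym, FLASH_MS))
            t += FLASH_MS + INTERVAL_MS   # next onset 225 ms later
        t += INTER_TRIAL_MS               # inter-trial interval
    return events
```

Under this reading, consecutive flash onsets within a trial are 225 ms apart, and a full trial spans 5 flashes plus the 500 ms inter-trial gap.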
Sub-step B13: according to the graphical symbol displayed at the top of the interface, the subject silently counts its flashes in turn; after the feedback signal appears in the middle of the interface, the subject shifts attention to the next required symbol. The training data comprise 500 trials in total; then sub-step C1 is entered.
Sub-step B2: EEG test data are acquired, specifically comprising the following sub-steps:
Sub-step B21: during acquisition the subject gazes at the visual stimulation interface presented in the upper-left corner of the screen; here the top of the interface displays no graphical symbol. The subject first observes the map and robot presented in the upper-right corner of the screen and decides the desired direction of robot motion.
Sub-step B22: same as sub-step B12.
Sub-step B23: having obtained the direction of motion by observing the map and robot, the subject silently counts the flashes of the corresponding graphical symbol, where ← denotes a uniform 30° left turn, → a uniform 30° right turn, ↑ a uniform 0.5 m advance, ↓ a uniform 0.5 m retreat, and 〇 remaining stationary; the subject can adjust which symbol to attend to according to the feedback signal shown in the middle of the interface. The test data comprise 400 trials; then sub-step C2 is entered.
Step C: signal processing. The EEG signals are preprocessed, and features are extracted and classified, specifically comprising the following sub-steps:
Sub-step C1: feature extraction and classifier training; the classifier model is constructed from the training data, specifically comprising the following sub-steps:
Sub-step C11: the signals are preprocessed. The EEG data of the six electrodes Fz, C3, Cz, C4, Pz and Oz selected for acquisition are transmitted to the signal processing module in an R × S format, where R equals 6, the number of electrodes, and S is the number of sampling points; the data are then preprocessed successively by an IIR filter with a cutoff frequency of 0.1 Hz and an FIR filter with a cutoff frequency of 10 Hz.
Sub-step C12: features are extracted from the signals. With the onset of each single stimulus taken as time 0 ms, the EEG is segmented, each segment being called an Epoch; the time window length is 600 ms, of which the first 100 ms is baseline. The Epochs of the same stimulus are superimposed 5 times, after which the data are down-sampled to 25 Hz, 15 points in total, giving a new Epoch of format 6 × 15; the resulting Epochs are then concatenated in electrode order, so the feature vector x extracted for a single stimulus has dimension 90 × 1.
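The dimensional bookkeeping of sub-step C12 can be sketched as follows. Averaging the 5 repetitions, subtracting the mean of the first 100 ms as baseline correction, and decimating by keeping every 10th sample are interpretations of "superimposed 5 times", "baseline" and "down-sampled to 25 Hz"; the patent does not spell out these operations.

```python
import numpy as np

FS, BASELINE_MS, DOWN_FS = 250, 100, 25  # values from sub-steps A1 and C12

def feature_vector(epochs):
    """epochs: array of shape (5, 6, 150), i.e. 5 repetitions of the same
    stimulus, 6 channels, 600 ms at 250 Hz. Returns the 90 x 1 feature
    vector described in sub-step C12."""
    n_base = FS * BASELINE_MS // 1000                 # 25 baseline samples
    avg = epochs.mean(axis=0)                         # superimpose repetitions
    avg = avg - avg[:, :n_base].mean(axis=1, keepdims=True)  # baseline-correct
    step = FS // DOWN_FS                              # 250 -> 25 Hz: keep 1/10
    down = avg[:, ::step]                             # shape (6, 15)
    return down.reshape(-1, 1)                        # channels concatenated
```

The 150-sample window decimated by 10 yields the 15 points per channel stated in the patent, and row-major reshaping concatenates the channels in electrode order.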
Sub-step C13: the classifier is trained. The classifier selected is the Fisher linear discriminant analysis classifier, whose discriminant function g(x) is described by formula (1):
g(x) = wᵀx (1)
where w is the weight vector and x the feature vector. The training samples are X = [x1, x2, ..., xN] with sample size N equal to 500, of which 100 are target-stimulus samples with class label 1 and 400 are non-target-stimulus samples with class label 0; the optimal value w* of w is obtained by the Fisher linear discriminant classifier.
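Training the Fisher linear discriminant of sub-step C13 can be sketched with the classical closed form w* = Sw⁻¹(m1 − m0), where Sw is the within-class scatter matrix and m0, m1 the class means. The patent does not specify the solver; the small ridge term added for numerical stability is an assumption.

```python
import numpy as np

def fisher_lda(X, y, ridge=1e-6):
    """X: (N, d) feature matrix; y: (N,) labels in {0, 1}. Returns w of
    shape (d,), the Fisher discriminant direction w* = Sw^-1 (m1 - m0)."""
    m0 = X[y == 0].mean(axis=0)
    m1 = X[y == 1].mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    for c, m in ((0, m0), (1, m1)):
        d = X[y == c] - m
        Sw += d.T @ d                      # within-class scatter
    Sw += ridge * np.eye(X.shape[1])       # stabilizing ridge (assumption)
    return np.linalg.solve(Sw, m1 - m0)

def g(w, x):
    """Discriminant function g(x) = w^T x of formula (1)."""
    return w @ x
```

On the patent's 500-sample training set the class sizes would be 100 targets and 400 non-targets; the toy test below uses the same 1:4 ratio.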
The evocation of P300 is related to expectation and not to the stimulus itself, so P300 recognition in the brain-computer interface reduces to two classes, the target-stimulus class denoted 1 and the non-target-stimulus class denoted 0, classified according to formula (2):
label(x) = 1 if g(x) > 0, otherwise label(x) = 0 (2)
where label(x) is the classifier output function.
In the practical P300 brain-computer interface, the data input is the set of n vectors xi (i = 1, ..., n), where n is the number of stimulus types; with 5 stimuli (〇, ↑, ↓, ←, →), n = 5, and classification follows formula (3), i.e., formula (2) applied to each xi:
label(xi) = 1 if g(xi) > 0, otherwise label(xi) = 0, for i = 1, ..., n (3)
If the classifier identifies exactly one target-stimulus class and n−1 non-target-stimulus classes, it outputs an instruction; if it identifies more than one target-stimulus class, it outputs no instruction; then sub-step B2 is entered.
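The "exactly one target" decision rule around formulas (2) and (3) can be sketched as follows. The zero threshold and the symbol names are assumptions: the patent's formula (2) is reproduced from context rather than from the original image, and the rule that no command is issued when zero or several stimuli are labelled as targets follows the text above.

```python
STIMULI = ("circle", "up", "down", "left", "right")  # stands for 〇 ↑ ↓ ← →

def decide(scores, threshold=0.0):
    """scores: the 5 discriminant values g(x_i) from formula (1).

    Each score is thresholded per formula (2); a command is issued only when
    exactly one stimulus is labelled target, mirroring formula (3) and the
    surrounding text. Returns the winning stimulus name, or None."""
    labels = [1 if s > threshold else 0 for s in scores]  # formula (2)
    if sum(labels) != 1:          # zero or multiple targets: no instruction
        return None
    return STIMULI[labels.index(1)]
```

A single positive score yields a command; ties above threshold, or all-negative scores, yield no output and the system simply waits for the next round of flashes.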
Sub-step C2: P300 online detection, specifically comprising the following sub-steps:
Sub-step C21: same as sub-step C11.
Sub-step C22: same as sub-step C12.
Sub-step C23: the constructed classifier is applied to the online system; after 5 rounds of stimulus flashes, the feature vectors of the 5 stimuli form the classifier input, and the classifier outputs the control command corresponding to the 5 stimuli according to formulas (1) and (3); then step D is entered.
Step D: shared control is realized. The Pioneer3-DX robot of ActivMedia Robotics (USA) is controlled, and data transmission between the brain-control commands and the robot is carried out over the TCP/IP protocol, specifically comprising the following sub-steps:
Sub-step D1: the system judges whether brain-control command information is present; if so, it enters sub-step D2, otherwise it enters sub-step D3.
Sub-step D2: brain-control command detection is performed first, judging whether the command is consistent with the environmental information, i.e., detecting whether, when the distance between an obstacle and the robot is less than 0.5 m, the robot has still received a brain-control command that moves it toward the obstacle; if so, sub-step D3 is entered; otherwise the corresponding brain-control command action is executed, i.e., the system enters brain-control command mode: within the command duration, when the brain-control command is ↑ or ↓ the robot moves forward or backward 0.5 m at constant speed; when it is ← or → the robot turns left or right 30° at constant speed; when it is 〇 the robot remains stationary. The system then judges whether the control command has ended; if so, it enters sub-step D1, otherwise it waits for the control command to end.
Sub-step D3: the system enters the autonomous robot control mode: environmental information is acquired by the robot's laser sensor, the linear velocity of motion and the angular velocity of steering are computed by the fuzzy discrete-event-system method, the control command is output, and the system judges whether it has finished executing; if so, it enters sub-step D1, otherwise it waits for the command to finish executing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810048019.1A CN108415554B (en) | 2018-01-18 | 2018-01-18 | Brain-controlled robot system based on P300 and implementation method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810048019.1A CN108415554B (en) | 2018-01-18 | 2018-01-18 | Brain-controlled robot system based on P300 and implementation method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108415554A true CN108415554A (en) | 2018-08-17 |
CN108415554B CN108415554B (en) | 2020-11-10 |
Family
ID=63125976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810048019.1A Active CN108415554B (en) | 2018-01-18 | 2018-01-18 | Brain-controlled robot system based on P300 and implementation method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108415554B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4440661B2 (en) * | 2004-01-30 | 2010-03-24 | 学校法人 芝浦工業大学 | EEG control device and program thereof |
US20120059273A1 (en) * | 2010-09-03 | 2012-03-08 | Faculdades Catolicas, a nonprofit association, Maintainer of the Pontificia Universidade Cotolica | Process and device for brain computer interface |
CN103116279A (en) * | 2013-01-16 | 2013-05-22 | 大连理工大学 | Vague discrete event shared control method of brain-controlled robotic system |
CN103955270A (en) * | 2014-04-14 | 2014-07-30 | 华南理工大学 | Character high-speed input method of brain-computer interface system based on P300 |
Non-Patent Citations (3)
Title |
---|
XIN’AN FAN等: "A P300 Brain-computer Interface for Controlling a Mobile Robot by Issuing a Motion Command", 《PROCEEDINGS OF 2013 ICME INTERNATIONAL CONFERENCE ON COMPLEX MEDICAL ENGINEERING》 * |
王金甲,杨成杰,胡备: "P300脑机接口控制智能小车系统的设计与实现", 《生物医学工程学杂志》 * |
马征等: "视觉ERP 脑机接口中实验范式的研究进展", 《中国生物医学工程学报》 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108836327A (en) * | 2018-09-06 | 2018-11-20 | 电子科技大学 | Intelligent outlet terminal and EEG signal identification method based on brain-computer interface |
CN109445580A (en) * | 2018-10-17 | 2019-03-08 | 福州大学 | Trust Game Experiments system based on brain-computer interface |
CN110244854A (en) * | 2019-07-16 | 2019-09-17 | 湖南大学 | A kind of artificial intelligence approach of multi-class eeg data identification |
CN111007725A (en) * | 2019-12-23 | 2020-04-14 | 昆明理工大学 | Method for controlling intelligent robot based on electroencephalogram neural feedback |
CN111273578A (en) * | 2020-01-09 | 2020-06-12 | 南京理工大学 | Real-time brain-controlled robot system based on Alpha wave and SSVEP signal control and control method |
CN111752392B (en) * | 2020-07-03 | 2022-07-08 | 福州大学 | Accurate visual stimulation control method in brain-computer interface |
CN111752392A (en) * | 2020-07-03 | 2020-10-09 | 福州大学 | Accurate visual stimulation control method in brain-computer interface |
CN112207816A (en) * | 2020-08-25 | 2021-01-12 | 天津大学 | Brain-controlled mechanical arm system based on view coding and decoding and control method |
CN111956933A (en) * | 2020-08-27 | 2020-11-20 | 北京理工大学 | Alzheimer's disease nerve feedback rehabilitation system |
CN114237385A (en) * | 2021-11-22 | 2022-03-25 | 中国人民解放军军事科学院军事医学研究院 | Human-computer brain control interaction system based on non-invasive electroencephalogram signals |
CN114237385B (en) * | 2021-11-22 | 2024-01-16 | 中国人民解放军军事科学院军事医学研究院 | Man-machine brain control interaction system based on non-invasive brain electrical signals |
CN116492597A (en) * | 2023-06-28 | 2023-07-28 | 南昌大学第一附属医院 | Peripheral-central nerve regulation and control device and storage medium |
CN116492597B (en) * | 2023-06-28 | 2023-11-24 | 南昌大学第一附属医院 | Peripheral-central nerve regulation and control device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108415554B (en) | 2020-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108415554A (en) | A kind of brain man-controlled mobile robot system and its implementation based on P300 | |
CN110222643B (en) | Steady-state visual evoked potential signal classification method based on convolutional neural network | |
CN104083258B (en) | A kind of method for controlling intelligent wheelchair based on brain-computer interface and automatic Pilot technology | |
CN103631941B (en) | Target image searching system based on brain electricity | |
CN107147342A (en) | A kind of induction motor parameter identification system and method | |
CN108836302A (en) | Electrocardiogram intelligent analysis method and system based on deep neural network | |
CN104758130B (en) | A kind of intelligent nursing device and method based on brain-computer interface | |
CN110442232A (en) | The wearable augmented reality robot control system of joint eye movement and brain-computer interface | |
CN103699226A (en) | Tri-modal serial brain-computer interface method based on multi-information fusion | |
CN103955270B (en) | Character high-speed input method of brain-computer interface system based on P300 | |
CN106940593B (en) | Emotiv brain control UAV system and method based on VC++ and Matlab hybrid programming | |
CN103083014B (en) | Method controlling vehicle by electroencephalogram and intelligent vehicle using method | |
CN107463250B (en) | The method for improving P300 spelling device using effect under Mental Workload state | |
CN105030206A (en) | System and method for detecting and positioning brain stimulation target point | |
CN105904459A (en) | Robot control system based on eye electrical signal recognition and design method of system | |
CN108992066A (en) | Portable lower limb behavior pattern real-time identifying system and method based on electromyography signal | |
CN110262658A (en) | A kind of brain-computer interface character input system and implementation method based on reinforcing attention | |
CN113205074A (en) | Gesture recognition method fusing multi-mode signals of myoelectricity and micro-inertia measurement unit | |
CN108509040A (en) | Mixing brain machine interface system based on multidimensional processiug and adaptive learning | |
CN103390193A (en) | Automatic training device for navigation-oriented rat robot, and rat behavior identification method and training method | |
CN105447475A (en) | Independent component analysis based glancing signal sample optimization method | |
CN109521873A (en) | Based on collaborative brain-computer interface control system and signal collaboration method | |
CN110472595B (en) | Electroencephalogram recognition model construction method and device and recognition method and device | |
CN106491251A (en) | One kind is based on non-intrusion type brain-computer interface robotic arm control system and its control method | |
Tang et al. | A shared-control based BCI system: For a robotic arm control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||