CN206249101U - UAV control device based on steady-state visual evoked potentials - Google Patents

UAV control device based on steady-state visual evoked potentials

Info

Publication number
CN206249101U
Authority
CN
China
Prior art keywords
unmanned plane
manipulation
steady state
picture
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201621196453.7U
Other languages
Chinese (zh)
Inventor
葛盛
孙高鹏
刘慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201621196453.7U priority Critical patent/CN206249101U/en
Application granted granted Critical
Publication of CN206249101U publication Critical patent/CN206249101U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The utility model discloses a UAV control device based on steady-state visual evoked potentials (Steady State Visual Evoked Potential, SSVEP), belonging to the interdisciplinary field of cognitive neuroscience, information technology, and automatic control. The device includes a first control unit, which comprises: a visual stimulation unit for simultaneously displaying a group of SSVEP-inducing pictures, the group of SSVEP-inducing pictures abstracting the control commands of the UAV; and an EEG processing unit for processing the EEG signals produced while the operator gazes at the visual stimulation unit, identifying the SSVEP-inducing picture the operator is gazing at, and sending the control command corresponding to that picture to the UAV for execution. Compared with the prior art, the utility model effectively improves the controllability of the UAV and reduces the difficulty of operation.

Description

UAV control device based on steady-state visual evoked potentials
Technical field
The utility model relates to a UAV control device, and in particular to a UAV control device based on steady-state visual evoked potentials (Steady State Visual Evoked Potential, SSVEP), belonging to the interdisciplinary field of cognitive neuroscience, information technology, and automatic control.
Background art
Existing UAV control systems are generally operated with a remote controller. A remote controller is usually equipped with two control sticks and several function buttons. One stick controls the UAV's forward, backward, left, and right flight in the horizontal plane, while the other controls ascent and descent in the vertical plane; dedicated buttons additionally command clockwise or counterclockwise rotation. Such remote control requires the operator to map each control stick to the horizontal or vertical plane of the UAV, decompose a desired flight attitude into stepwise operations on the two sticks and the function buttons, and carry these out with both hands. A complex correspondence exists between the UAV's flight attitude and the control actions, so the operator must master the control skills through extensive training.
Summary of the utility model
The technical problem to be solved by the utility model is to overcome the deficiencies of the prior art and provide a UAV control device based on steady-state visual evoked potentials, so as to effectively improve the controllability of the UAV and reduce the difficulty of operation.
The utility model solves the above technical problem through the following technical scheme:
A UAV control device based on steady-state visual evoked potentials comprises a first control unit, the first control unit including:
a visual stimulation unit for simultaneously displaying a group of SSVEP-inducing pictures, the group of SSVEP-inducing pictures abstracting the control commands of the UAV;
an EEG processing unit for processing the EEG signals produced while the operator gazes at the visual stimulation unit, identifying the SSVEP-inducing picture the operator is gazing at, and sending the control command corresponding to that picture to the UAV for execution.
Further, the first control unit also includes a video acquisition unit for acquiring UAV first-person video in real time and transmitting the acquired video to the visual stimulation unit, where it serves as the display background on which the visual stimulation unit shows the SSVEP-inducing pictures.
Preferably, the EEG processing unit includes:
an EEG acquisition module for acquiring the EEG signals produced while the operator gazes at the visual stimulation unit;
an SSVEP classification module for performing feature extraction and classification on the EEG signals acquired by the EEG acquisition module, so as to identify the SSVEP-inducing picture the operator is gazing at;
a control signal transmission module for transmitting to the UAV the control command corresponding to the SSVEP-inducing picture identified by the SSVEP classification module.
Preferably, the SSVEP-inducing pictures are a group of icons with different flicker frequencies and phases.
Preferably, the group of icons with different flicker frequencies and phases includes: arrow icons located at the top, bottom, left, right, upper-left, lower-left, upper-right, and lower-right positions of the display background of the visual stimulation unit, corresponding in turn to the UAV's forward, backward, left, right, left-forward, left-backward, right-forward, and right-backward flight commands; and arrow icons for counterclockwise rotation, upward flight, downward flight, and clockwise rotation located directly below the display background of the visual stimulation unit, corresponding in turn to the UAV's counterclockwise-rotation, ascend, descend, and clockwise-rotation commands.
To improve the robustness and adaptability of the UAV control system, the device further includes a second control unit that can be switched with the first control unit, and a changeover switch for switching between the first control unit and the second control unit.
Compared with the prior art, the utility model has the following beneficial effects:
First, traditional UAV control devices complete flight control with control sticks and buttons, which requires decomposing a control command into separate operations in the horizontal and vertical planes carried out by both hands; a process of decomposition and transformation exists between the UAV's position/attitude reference frame and the control-stick coordinate frame, and the correspondence is complex. In the control device proposed by the utility model, control commands are abstracted as SSVEP-inducing icons and overlaid on the video captured from the UAV's viewpoint, so that the control commands and the UAV's position and attitude share the same coordinate frame. Control becomes more intuitive and easier, the probability of erroneous operation is reduced, and controllability is improved.
Second, traditional UAV control devices complete flight control with control sticks and buttons, requiring the operator to decompose a given flight attitude into stepwise operations on two control sticks and function buttons; the correspondence between flight attitude and control actions is complex, and the operator must master the skills through extensive training. The control process also demands hand-eye coordination and coordination between both hands, imposing a heavy workload on the operator. With the control device proposed by the utility model, the operator completes control simply by gazing at the video; no hand-eye or two-hand coordination is needed, control is simple and direct, and it can be mastered without training. The operator's physical and mental burden is reduced, making the device suitable for prolonged operation.
Third, traditional UAV control methods require both hands. With the control device proposed by the utility model, control is completed by gaze alone, without the hands, which makes UAV operation possible for people with disabilities. For able-bodied operators, it frees both hands for other, more precise and complex operations, making it possible to control UAVs with richer functions.
Brief description of the drawings
Fig. 1 shows the time-domain signal of an SSVEP and the corresponding frequency-domain signal;
Fig. 2 is a schematic diagram of the system composition of a preferred embodiment of the UAV control device of the utility model;
Fig. 3 is an example of the overlaid display of UAV first-person video and SSVEP-inducing pictures in the visual stimulation unit;
Fig. 4 shows the frequency and phase settings of the SSVEP-inducing pictures in the visual stimulation unit.
Detailed description of the embodiments
The technical scheme of the utility model is described in detail below with reference to the accompanying drawings:
In view of the problems of existing UAV control techniques, the idea of the utility model is to control the UAV on the basis of steady-state visual evoked potentials. Specifically, the UAV's control commands are abstracted as a group of SSVEP-inducing pictures that are displayed simultaneously on the visual stimulation unit; the EEG signals produced while the operator gazes at the visual stimulation unit are processed to identify which SSVEP-inducing picture the operator is gazing at, and the control command corresponding to that picture is sent to the UAV for execution.
For ease of public understanding, the related techniques involved in the utility model are introduced before the technical scheme is described in detail.
A brain-computer interface (Brain-Computer Interface, BCI) is an information exchange and control channel established between the brain and the external environment. Through this channel, a person can express intentions or control external devices without relying on language or limb movements. Existing BCI paradigms mainly include those based on slow cortical potentials (SCP), P300 event-related potentials, motor imagery (MI) related potentials, and visual evoked potentials (VEP).
A visual evoked potential (Visual Evoked Potential, VEP) refers to the change in the EEG signal recorded over the visual cortex of the brain when the eyes receive visual stimulation; this brain potential is called the visual evoked potential. VEPs fall into three classes: transient visual evoked potentials, pseudo-random code visual evoked potentials, and steady-state visual evoked potentials.
The steady-state visual evoked potential (SSVEP) is the most commonly used paradigm in current BCIs. When the eyes observe a visual stimulus flickering at a fixed frequency, the EEG signal recorded over the visual cortex shows a response related to the stimulation frequency (its time-domain signal and corresponding frequency-domain signal are shown in Fig. 1), i.e., elevated energy at the fundamental frequency of the stimulus and its harmonics; such a response is called a steady-state visual evoked potential (hereinafter, SSVEP signal). A typical SSVEP paradigm consists of several visual stimuli (SSVEP-inducing pictures) flickering at different frequencies. When the observer gazes at different stimuli, the EEG signal exhibits the corresponding frequency-distribution features. Using this correspondence, the stimulus the observer is gazing at can be inferred from the frequency features present in the EEG. If each stimulus is associated with a specific intention, the observer can output that intention simply by gazing at the corresponding stimulus. Compared with BCIs based on other paradigms (e.g., P300, motor imagery), SSVEP-based BCIs generally offer higher accuracy and information transfer rates, simpler system and experimental design, and require less training, so they are widely used in BCI systems.
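As an illustration of the fundamental-plus-harmonic frequency structure just described, the minimal sketch below scores an EEG window against a set of candidate flicker frequencies by summing spectral power at each fundamental and its second harmonic. The sampling rate, window length, and candidate frequencies are illustrative assumptions, not values taken from the patent; the patent's own embodiment later uses CCA rather than this simple spectral scoring.

```python
import numpy as np

def ssvep_power_score(eeg, fs, freq, n_harmonics=2, tol=0.2):
    """Sum spectral power within +/- tol Hz of the fundamental and its harmonics."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    score = 0.0
    for h in range(1, n_harmonics + 1):
        band = (freqs >= h * freq - tol) & (freqs <= h * freq + tol)
        score += spectrum[band].sum()
    return score

fs = 250                                  # assumed EEG sampling rate (Hz)
candidates = [8.0, 9.0, 10.0, 11.0]       # assumed flicker frequencies (Hz)
eeg_window = np.random.randn(2 * fs)      # placeholder 2-second single-channel window

scores = {f: ssvep_power_score(eeg_window, fs, f) for f in candidates}
gazed_frequency = max(scores, key=scores.get)  # frequency with the strongest response
```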
The technical scheme of the utility model is described in detail below with a preferred embodiment.
As shown in Fig. 2 the unmanned plane actuation means in the present embodiment, including the first manipulation unit, the first manipulation list Unit includes:Video acquisition unit, visual stimulus unit, EEG Processing unit, wherein, video acquisition unit is used to adopt in real time Collection unmanned plane multi-view video, and using the unmanned plane visual angle transmission of video of Real-time Collection to visual stimulus unit as visual stimulus list Unit display SSVEP induces display background during picture;Visual stimulus unit is used for one group of SSVEP of display simultaneously and induces picture, institute It is abstract by the manipulation instruction of unmanned plane for state one group of SSVEP inducing picture;EEG Processing unit, for manipulator Watch the EEG signals produced by visual stimulus unit attentively to be processed so as to identify that the SSVEP that manipulator is watched attentively induces picture Face, and the manipulation instruction that the SSVEP is induced corresponding to picture is sent to unmanned plane execution.
As shown in Fig. 2 the EEG Processing unit in the present embodiment includes:
Brain wave acquisition module, is acquired for watching the EEG signals produced by visual stimulus unit attentively to manipulator;
SSVEP sort modules, the EEG signals for being gathered to brain wave acquisition module carry out feature extraction and classification, from And identify the SSVEP that manipulator is watched attentively and induce picture;
Manipulation signal transmission module, the SSVEP for SSVEP sort modules to be identified induces the corresponding behaviour of picture Control instruction is transferred to unmanned plane.
The detailed procedure for controlling the UAV with this device is as follows:
1) Acquiring UAV first-person video:
The video acquisition unit is mounted on the UAV and uses an ordinary camera to obtain a video signal from the UAV's viewpoint. Existing UAVs are usually equipped with onboard video equipment, which the video acquisition unit can use directly. The captured video signal is transmitted to the visual stimulation unit in real time over WiFi or a wired link.
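As a rough illustration of this step, the sketch below receives a video stream and displays it as the background picture. The stream address and the use of OpenCV are assumptions made for illustration only; real UAVs typically expose their video feed through vendor-specific SDKs or WiFi links.

```python
import cv2

# Hypothetical stream address; substitute the feed exposed by the onboard video equipment.
cap = cv2.VideoCapture("rtsp://192.168.1.1/live")

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # This frame becomes the background onto which the SSVEP-inducing icons are overlaid.
    cv2.imshow("UAV viewpoint", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```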
2) Displaying the superimposed stimuli on the visual stimulation unit:
The visual stimulation unit in this embodiment uses an ordinary display (an LCD, LED, or CRT display; an LCD is preferred) to present the stimulation picture, and is placed in front of the operator for convenient viewing. The video from the UAV's viewpoint is shown on the display as the background picture, and the SSVEP-inducing pictures are overlaid on this background. In this embodiment, arrows indicating the control directions serve as the SSVEP-inducing pictures. As shown in Fig. 3, arrows indicating forward, backward, left, right, left-forward, left-backward, right-forward, and right-backward flight are overlaid at the top, bottom, left, right, upper-left, lower-left, upper-right, and lower-right positions of the background picture, each flickering at a specific frequency and phase. Directly below the background picture, arrows indicating counterclockwise rotation, upward flight, downward flight, and clockwise rotation are displayed as SSVEP-inducing pictures, again each flickering at a specific frequency and phase. To maximize the difference between the EEG features induced by adjacent SSVEP-inducing pictures and thereby improve classification accuracy, in this embodiment the pictures are assigned not only different flicker frequencies but also different phases; the frequency and phase settings of the SSVEP-inducing pictures are shown in Fig. 4. This combination of frequency and phase further distinguishes the EEG features induced by adjacent stimuli and improves classification accuracy.
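The frequency-plus-phase coding described above can be sketched as a per-frame luminance modulation of each icon. The command-to-(frequency, phase) mapping and the monitor refresh rate below are illustrative assumptions; the actual settings of Fig. 4 are not reproduced here.

```python
import numpy as np

REFRESH_RATE = 60.0   # assumed monitor refresh rate (Hz)

# Hypothetical mapping: control command -> (flicker frequency in Hz, phase in radians).
STIMULI = {
    "forward":  (8.0, 0.0),
    "backward": (9.0, np.pi / 2),
    "left":     (10.0, np.pi),
    "right":    (11.0, 3 * np.pi / 2),
}

def icon_luminance(command, frame_index):
    """Luminance in [0, 1] of a command's arrow icon on a given video frame."""
    freq, phase = STIMULI[command]
    t = frame_index / REFRESH_RATE
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * freq * t + phase))
```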
3) The operator selects a control command:
The operator gazes at the display of the visual stimulation unit and observes the video from the UAV's viewpoint to understand the UAV's current position and attitude. Centered on its current position, the UAV can fly forward, backward, left, right, left-forward, left-backward, right-forward, and right-backward in the horizontal plane, rotate clockwise or counterclockwise, and fly up or down in the vertical plane. According to the control need, the operator gazes continuously at the corresponding SSVEP-inducing picture for more than 2 seconds.
4) SSVEP signal acquisition:
The UAV operator gazes at the picture presented on the display of the visual stimulation unit, and the EEG acquisition module worn on the operator's occipital region collects the SSVEP signals induced by the SSVEP-inducing pictures. The EEG acquisition module in this embodiment consists of EEG electrodes, a headband, and a wireless transmission device. Because SSVEP signals originate in the visual cortex, better signals are obtained by placing the EEG electrodes over the occipital region, above the visual cortex; the best signals are obtained around O1-Oz-O2. Dry EEG electrodes are preferred to improve wearing comfort and convenience. The dry electrodes are fixed on the inside of an elastic headband, which the operator wears to position the electrodes. The collected SSVEP signals are transmitted to the SSVEP classification module through the wireless transmission device.
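A minimal sketch of the buffering side of this step is given below: samples arriving from the wireless receiver are kept in a ring buffer from which 2-second analysis windows are read. The receiver callback, channel count, and sampling rate are illustrative assumptions, not interfaces defined by the patent.

```python
import numpy as np
from collections import deque

FS = 250                   # assumed sampling rate of the headband (Hz)
N_CHANNELS = 3             # e.g. O1, Oz, O2
WINDOW_LEN = 2 * FS        # 2-second gaze window, matching step 3)

buffer = deque(maxlen=WINDOW_LEN)

def on_new_chunk(chunk):
    """chunk: ndarray of shape (n_samples, N_CHANNELS) delivered by the wireless receiver."""
    for sample in chunk:
        buffer.append(sample)

def current_window():
    """Return the latest full window as an (n_samples, N_CHANNELS) array, or None if not yet filled."""
    if len(buffer) < WINDOW_LEN:
        return None
    return np.asarray(buffer)
```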
5) SSVEP signal classification:
The SSVEP classification module is placed in the UAV controller. It receives the SSVEP signals collected by the EEG acquisition module wirelessly (a wired link may also be used), and performs feature extraction and classification on the received SSVEP signals to identify the SSVEP-inducing picture the operator is gazing at. The utility model can use various existing SSVEP classification techniques, such as the least absolute shrinkage and selection operator (LASSO; Y. Zhang, J. Jin, X.Y. Qing, B. Wang, and X.Y. Wang, "LASSO based stimulus frequency recognition model for SSVEP BCIs," Biomed. Signal Proces., vol. 7, no. 2, pp. 104-111, Feb. 2012), partial least squares (PLS; L.J. Trejo, R. Rosipal, and B. Matthews, "Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 14, no. 2, pp. 225-229, Jun. 2006), or canonical correlation analysis (CCA; Z.L. Lin, C.S. Zhang, W. Wu, and X.R. Gao, "Frequency recognition based on canonical correlation analysis for SSVEP-based BCIs," IEEE Trans. Biomed. Eng., vol. 54, no. 6, pp. 1172-1176, Jun. 2007). In this embodiment, canonical correlation analysis (CCA) is used to extract features from and classify the SSVEP signals, determine which SSVEP-inducing picture the operator is gazing at, and convert the result into the corresponding control command. If, within a given time window, the operator is not gazing at any SSVEP-inducing picture, or the SSVEP signal quality is poor so that the maximum correlation coefficient computed by CCA is below 0.3, that time window is assigned no control command.
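The CCA step can be sketched as follows: for each candidate flicker frequency, sine/cosine reference signals at the fundamental and its second harmonic are built, the largest canonical correlation with the EEG window is computed, and the best-scoring frequency is selected; a maximum correlation below 0.3 yields "no command", as in the description. The frequency list and the use of scikit-learn's CCA are illustrative assumptions, not the patent's specific implementation.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def reference_signals(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine references at the fundamental and its harmonics, shape (n_samples, 2*n_harmonics)."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)

def classify_window(eeg, fs, freqs, threshold=0.3):
    """eeg: (n_samples, n_channels). Return the best-matching frequency, or None for 'no command'."""
    best_freq, best_corr = None, 0.0
    for f in freqs:
        refs = reference_signals(f, fs, eeg.shape[0])
        cca = CCA(n_components=1)
        x_scores, y_scores = cca.fit_transform(eeg, refs)
        corr = abs(np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1])
        if corr > best_corr:
            best_freq, best_corr = f, corr
    return best_freq if best_corr >= threshold else None

# Example with assumed frequencies and a placeholder window:
# detected = classify_window(current_window(), fs=250, freqs=[8.0, 9.0, 10.0, 11.0])
```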
6) Control signal transmission:
The control signal transmission module is placed in the UAV controller. The module stores the operating parameters for forward, backward, left, right, left-forward, left-backward, right-forward, right-backward, counterclockwise rotation, upward, downward, clockwise rotation, and hover. The control signal transmission module converts the control command identified by the SSVEP classification module into the corresponding operating parameters stored in the module, and controls the UAV over a wireless or wired signal link. When the SSVEP classification module outputs no control command, the corresponding action is hover.
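A minimal sketch of this dispatch step is shown below, assuming a hypothetical frequency-to-command table and a placeholder transmission function: a detected stimulus maps to its stored command, and an empty window (no command) falls back to hover, as described above.

```python
# Hypothetical mapping from detected flicker frequency to a stored flight command;
# the real module stores operating parameters rather than strings, and the
# frequency settings of Fig. 4 are not reproduced here.
COMMANDS = {
    8.0: "forward", 9.0: "backward", 10.0: "left", 11.0: "right",
    12.0: "rotate_ccw", 13.0: "up", 14.0: "down", 15.0: "rotate_cw",
}

def dispatch(detected_freq, send_to_uav):
    """Map a classification result to a command; None (no command) falls back to hover."""
    command = COMMANDS.get(detected_freq, "hover") if detected_freq is not None else "hover"
    send_to_uav(command)   # send_to_uav is a placeholder for the wireless/wired link to the UAV
```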
To improve the robustness and adaptability of the UAV control system, in addition to controlling the UAV through SSVEP signals with the first control unit described above, the UAV control device of the utility model can also retain a traditional control unit using control sticks and buttons, or add other control modes, and switch among the control modes with a changeover switch.

Claims (6)

1. A UAV control device based on steady-state visual evoked potentials, comprising a first control unit, characterized in that the first control unit includes:
a visual stimulation unit for simultaneously displaying a group of steady-state visual evoked potential inducing pictures, the group of steady-state visual evoked potential inducing pictures abstracting the control commands of the UAV;
an EEG processing unit for processing the EEG signals produced while the operator gazes at the visual stimulation unit, identifying the steady-state visual evoked potential inducing picture the operator is gazing at, and sending the control command corresponding to that picture to the UAV for execution.
2. The UAV control device according to claim 1, characterized in that the first control unit further includes a video acquisition unit for acquiring UAV first-person video in real time and transmitting the acquired video to the visual stimulation unit, where it serves as the display background on which the visual stimulation unit displays the steady-state visual evoked potential inducing pictures.
3. The UAV control device according to claim 1, characterized in that the EEG processing unit includes:
an EEG acquisition module for acquiring the EEG signals produced while the operator gazes at the visual stimulation unit;
an SSVEP classification module for performing feature extraction and classification on the EEG signals acquired by the EEG acquisition module, so as to identify the steady-state visual evoked potential inducing picture the operator is gazing at;
a control signal transmission module for transmitting to the UAV the control command corresponding to the steady-state visual evoked potential inducing picture identified by the SSVEP classification module.
4. The UAV control device according to claim 1, characterized in that the steady-state visual evoked potential inducing pictures are a group of icons with different flicker frequencies and phases.
5. The UAV control device according to claim 4, characterized in that the group of icons with different flicker frequencies and phases includes: arrow icons located at the top, bottom, left, right, upper-left, lower-left, upper-right, and lower-right positions of the display background of the visual stimulation unit, corresponding in turn to the UAV's forward, backward, left, right, left-forward, left-backward, right-forward, and right-backward flight commands; and arrow icons for counterclockwise rotation, upward flight, downward flight, and clockwise rotation located directly below the display background of the visual stimulation unit, corresponding in turn to the UAV's counterclockwise-rotation, ascend, descend, and clockwise-rotation commands.
6. The UAV control device according to any one of claims 1 to 5, characterized in that the device further includes a second control unit that can be switched with the first control unit, and a changeover switch for switching between the first control unit and the second control unit.
CN201621196453.7U 2016-11-07 2016-11-07 UAV control device based on steady-state visual evoked potentials Active CN206249101U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201621196453.7U CN206249101U (en) 2016-11-07 2016-11-07 UAV control device based on steady-state visual evoked potentials

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201621196453.7U CN206249101U (en) 2016-11-07 2016-11-07 UAV control device based on steady-state visual evoked potentials

Publications (1)

Publication Number Publication Date
CN206249101U true CN206249101U (en) 2017-06-13

Family

ID=58999980

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201621196453.7U Active CN206249101U (en) 2016-11-07 2016-11-07 UAV control device based on steady-state visual evoked potentials

Country Status (1)

Country Link
CN (1) CN206249101U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111487988A (en) * 2020-03-03 2020-08-04 天津大学 Brain-controlled unmanned aerial vehicle method based on steady-state visual evoked potential brain-computer interface
CN111487988B (en) * 2020-03-03 2022-04-15 天津大学 Brain-controlled unmanned aerial vehicle method based on steady-state visual evoked potential brain-computer interface
CN113434040A (en) * 2021-06-07 2021-09-24 西北工业大学 Brain-computer interface technical method based on augmented reality induction
CN113434040B (en) * 2021-06-07 2024-01-05 西北工业大学 Brain-computer interface technical method based on augmented reality induction

Similar Documents

Publication Publication Date Title
CN106371451A (en) Unmanned aerial vehicle manipulation method and device based on steady state visual evoked potential
US20190387995A1 (en) Brain-Computer Interface Based Robotic Arm Self-Assisting System and Method
CN107168346A (en) A kind of asynchronous system brain control UAS based on wearable display
CN101711709B (en) Method for controlling electrically powered artificial hands by utilizing electro-coulogram and electroencephalogram information
CN104799984B (en) Assistance system for disabled people based on brain control mobile eye and control method for assistance system
CN109992113A (en) A kind of MI-BCI system and its control method induced based on more scenes
CN103995582B (en) Brain-computer interface character input method and system based on steady-state visual evoked potential (SSVEP)
Holewa et al. Emotiv EPOC neuroheadset in brain-computer interface
CN110534180B (en) Deep learning human-computer interaction motor imagery brain-computer interface system and training method
CN108762303A (en) A kind of portable brain control UAV system and control method based on Mental imagery
CN108415565A (en) The machine integrated intelligent control method of unmanned plane brain and technology
CN107224273B (en) Central-peripheral nerve closed-loop rehabilitation training method and system based on optical brain imaging nerve feedback
CN111110982A (en) Hand rehabilitation training method based on motor imagery
CN112114670B (en) Man-machine co-driving system based on hybrid brain-computer interface and control method thereof
CN110727353A (en) Control component control method and device based on two-dimensional intention definition
CN206162388U (en) Mutual wearing system of brain machine
CN106491251B (en) Non-invasive brain-computer interface-based robot arm control system and control method thereof
CN206249101U (en) Unmanned plane actuation means based on Steady State Visual Evoked Potential
Fang et al. Brain–computer interface integrated with augmented reality for human–robot interaction
CN110716578A (en) Aircraft control system based on hybrid brain-computer interface and control method thereof
CN103294192A (en) LED lamp switch control device and control method thereof based on motor imagery
Li et al. An adaptive P300 model for controlling a humanoid robot with mind
Edlinger et al. A hybrid brain-computer interface for improving the usability of a smart home control
CN110658810A (en) Individual combat unmanned weapon control system based on SSVEP brain-computer interface
CN108319367B (en) Brain-computer interface method based on motion initiation evoked potential

Legal Events

Date Code Title Description
GR01 Patent grant