CN110134243A - Brain-controlled robotic arm shared control system and method based on augmented reality - Google Patents
Brain-controlled robotic arm shared control system and method based on augmented reality
- Publication number
- CN110134243A (application CN201910416699.2A)
- Authority
- CN
- China
- Prior art keywords
- mechanical arm
- brain
- control
- augmented reality
- steady state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Dermatology (AREA)
- General Health & Medical Sciences (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to an augmented-reality-based shared control system and method for a brain-controlled robotic arm. The system comprises a computer vision device, a steady-state visual evoked potential (SSVEP) brain-computer interface device, a shared control device, an augmented reality device and a robotic arm. The computer vision device identifies the color and position of objects in the workspace; the augmented reality device generates visual stimulus signals and presents them to the user; the SSVEP brain-computer interface device acquires the user's EEG signals and converts them into brain control commands; the shared control device drives the robotic arm according to a shared control strategy; and the robotic arm grasps the target object under the control of the shared control device. By combining brain control commands with autonomous, computer-vision-based control of the robotic arm, the invention achieves more natural human-computer interaction, reduces the user's mental workload, and enhances the applicability of the robotic arm in complex environments, thereby improving the practicality and reliability of the whole system.
Description
Technical field
The invention belongs to the field of robotics, and in particular relates to an augmented-reality-based shared control system and method for a brain-controlled robotic arm.
Background technique
Both disabled and elderly people face inconveniences in daily life. For people with severe motor impairments in particular, living independently and reintegrating into society is an urgent need.
In recent years, robotic assistive devices have been increasingly used to improve the independence and quality of life of disabled or elderly people, providing functions such as rehabilitation training, transfer and walking assistance, feeding, hygiene care and standing support. However, existing robotic assistive devices generally rely on conventional human-machine interfaces such as mouse, keyboard or voice, which require the user to retain residual motor ability. People with severe motor impairments lack the motor control needed to operate such interfaces, so operating these robots remains a major challenge for them. A brain-computer interface (BCI) establishes a completely new output channel between the human brain and external devices that does not depend on peripheral nerves and muscles, enabling direct interaction between the brain and the external environment. Introducing BCIs into robot control, i.e. the so-called brain-controlled robot, is expected to solve this problem.
Because the upper limb is involved in many daily activities, such as eating and dressing, it plays an important role in daily life. Developing a brain-controlled robotic arm that can give users the ability to grasp objects is therefore particularly important. At present, research on brain-controlled robotic arms is still at an exploratory stage. Most existing studies adopt a control strategy in which the BCI directly controls the robotic arm, and have only demonstrated the feasibility of brain-controlled arms. Such direct control strategies require the user to send commands frequently, which easily leads to fatigue.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the prior art and to propose a well-designed, easy-to-use, accurate and reliable augmented-reality-based shared control system and method for a brain-controlled robotic arm.
The present invention solves its technical problem through the following technical solutions:
An augmented-reality-based shared control system for a brain-controlled robotic arm comprises a computer vision device, a steady-state visual evoked potential (SSVEP) brain-computer interface device, a shared control device, an augmented reality device and a robotic arm. The computer vision device identifies the color and position of objects in the workspace; the augmented reality device generates visual stimulus signals and presents them to the user; the SSVEP brain-computer interface device acquires the user's EEG signals and converts them into brain control commands; the shared control device controls the robotic arm according to a shared control strategy; and the robotic arm grasps the target object under the control of the shared control device.
The augmented reality device uses HoloLens holographic glasses.
The computer vision device photographs the workspace through a camera fixed on the end-effector of the robotic arm.
The augmented reality device and the SSVEP brain-computer interface device are synchronized via the UDP protocol; the shared control device communicates with the robotic arm via the TCP/IP protocol.
The SSVEP brain-computer interface device acquires EEG signals through measuring electrodes placed at the user's occipital positions O1, Oz and O2, with a reference electrode mounted on the left mastoid and a ground electrode on the right mastoid.
A control method of the above augmented-reality-based shared control system for a brain-controlled robotic arm comprises the following steps:
Step 1: the computer vision device identifies the color and position of objects in the workspace.
Step 2: steady-state visual evoked potentials are elicited by the augmented reality device; the SSVEP brain-computer interface acquires the EEG signals and, after analysis, outputs brain control commands.
Step 3: according to the brain control command expressing the intention to grasp an object, sent by the SSVEP brain-computer interface, and the object color and position recognized by computer vision, the shared control device operates the robotic arm to grasp the target object using a shared control strategy of autonomous control first, direct control afterwards.
The concrete implementation of step 1 comprises the following steps:
(1) The camera of the computer vision device is fixed on the end-effector of the robotic arm. When the end-effector rises to a certain height, the workspace is photographed and one frame is captured as the original image.
(2) Image restoration is applied to the original image to reduce the influence of illumination, sharpen the boundaries in the image and bring the colors closer to reality; the result is saved as the restored image.
(3) The restored image is converted to HSV space for processing, the threshold of each color is adjusted, and the colors of objects in the workspace are identified by color thresholding. From the camera's shooting height and the actual coordinates of marker points in the workspace, the positions of the objects in the workspace are calculated.
In step 2, the augmented reality device encodes different gaze commands with different stimulation frequencies. Its stimulation interface consists of two hierarchical menus: the first-level menu is used to send commands expressing the intention to operate an object of a certain color, and the second-level menu is used to send gaze commands that directly operate the robotic arm.
In step 2, the SSVEP brain-computer interface acquires the EEG data of the user's three occipital leads O1, Oz and O2, and detects the frequency components of the steady-state visual evoked potential using canonical correlation analysis (CCA) to identify the user's gaze command.
The processing of the shared control device in step 3 is as follows:
(1) According to the gaze command output from the first-level menu of the SSVEP brain-computer interface and the object color and position provided by computer vision, a robot command is formed and sent to the robotic arm via TCP/IP; the robotic arm autonomously moves to the area above the target object.
(2) According to the gaze command output from the second-level menu of the SSVEP brain-computer interface, a robot command is formed and sent to the robotic arm via TCP/IP, directly controlling the arm's movement to complete the grasp.
The advantages and positive effects of the present invention are:
By combining brain control commands with autonomous, computer-vision-based control of the robotic arm, the invention realizes the control function while eliciting the steady-state visual evoked potentials through the augmented reality device, so the user does not need to repeatedly shift attention between the visual stimuli and the robotic arm. This achieves more natural human-computer interaction. At the same time, the shared control strategy of autonomous control first, direct control afterwards reduces the user's mental workload and enhances the applicability of the robotic arm in complex environments, thereby improving the practicality and reliability of the whole system.
Detailed description of the invention
Fig. 1 is a schematic diagram of the control principle of the present invention;
Fig. 2 shows the first-level menu of the augmented reality device in the present invention;
Fig. 3 shows the second-level menu of the augmented reality device in the present invention.
Specific embodiment
Embodiments of the present invention are further described below with reference to the accompanying drawings.
As shown in Fig. 1, an augmented-reality-based shared control system for a brain-controlled robotic arm comprises a computer vision device, an SSVEP brain-computer interface device, a shared control device, an augmented reality device and a robotic arm. The computer vision device identifies the color and position of objects in the workspace; the augmented reality device generates visual stimulus signals and presents them to the user; the SSVEP brain-computer interface device acquires the user's EEG signals and converts them into brain control commands; the shared control device controls the robotic arm according to a shared control strategy, that is, the robotic arm first moves autonomously to the area above the target object, and the user then operates the arm step by step through the brain-computer interface to complete the grasp.
A control method of this augmented-reality-based shared control system comprises the following steps:
Step 1: the computer vision device identifies the color and position of objects in the workspace.
The concrete implementation of this step comprises:
(1) The camera of the computer vision device is fixed on the end-effector of the robotic arm. After the end-effector rises to a certain height, the workspace is photographed and one frame is captured as the original image P1.
(2) Image restoration is applied to P1 to reduce the influence of illumination, sharpen the boundaries in the image and bring the colors closer to reality; the result is saved as the restored image P2.
(3) P2 is converted to HSV space for processing and the threshold of each color is adjusted. The colors of objects in the workspace are identified by color thresholding. Since the camera's shooting height and the actual coordinates of marker points in the workspace are known, the positions of the objects can be calculated from P2. The color and position of objects in the workspace are thereby identified.
Step 2: steady-state visual evoked potentials are elicited by the augmented reality device; the SSVEP brain-computer interface acquires the EEG signals and, after analysis, outputs brain control commands.
In this step, the augmented reality device realizes the visual stimulation function using Microsoft HoloLens holographic glasses. The visual stimuli are rendered with Unity3D, and different gaze commands are encoded with different stimulation frequencies. The stimulation interface consists of two hierarchical menus: the first-level menu is used to send commands expressing the intention to operate an object of a certain color, and the second-level menu is used to send gaze commands that directly operate the robotic arm. Switching between the two menus is accomplished by gaze commands sent by the user. After each gaze command, the user must wait a fixed period (e.g. 4.2 seconds) before sending the next one; this waiting period is mainly reserved as operating time for the robotic arm.
As shown in Fig. 2, the first-level menu contains 8 commands. The first 6 express the intention to grasp a target object of a given color: red, green, yellow, black, blue and purple in turn. The remaining 2 are "undo last operation" and "switch to the second-level menu". The stimulation frequencies corresponding to these 8 commands are 8 Hz, 9 Hz, 10 Hz, 11 Hz, 8.5 Hz, 9.5 Hz, 10.5 Hz and 11.5 Hz, respectively.
As shown in Fig. 3, the second-level menu contains 9 commands. The first 6 move the end-effector from its current position to a relative position along the X, Y or Z axis, 10 millimeters per move. The remaining 3 close the pneumatic gripper, open it, and switch back to the first-level menu. The stimulation frequencies corresponding to these 9 commands are 8 Hz, 9.5 Hz, 11 Hz, 8.5 Hz, 10 Hz, 11.5 Hz, 9 Hz, 10.5 Hz and 12 Hz, respectively. The system presents the first-level menu first.
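The two menus and their frequency coding can be captured in a small lookup table. A sketch in Python; the level-2 direction names (`+x`, `-x`, …) are an assumed ordering, since the text only states that the first six commands are relative moves along X, Y and Z:

```python
# First-level menu: 6 color commands, undo, and switch to level 2.
LEVEL1_HZ = {"red": 8.0, "green": 9.0, "yellow": 10.0, "black": 11.0,
             "blue": 8.5, "purple": 9.5, "undo": 10.5, "to_level2": 11.5}

# Second-level menu: 6 jog commands (direction labels assumed), gripper
# close/open, and switch back to level 1.
LEVEL2_HZ = {"+x": 8.0, "-x": 9.5, "+y": 11.0, "-y": 8.5, "+z": 10.0,
             "-z": 11.5, "close_gripper": 9.0, "open_gripper": 10.5,
             "to_level1": 12.0}

def decode(freq_hz, menu, tol=0.2):
    """Map a detected SSVEP frequency back to the nearest menu command,
    or None if no command frequency lies within the tolerance."""
    cmd, f = min(menu.items(), key=lambda kv: abs(kv[1] - freq_hz))
    return cmd if abs(f - freq_hz) <= tol else None
```

Note that frequencies are unique within each menu but reused across menus (e.g. 8 Hz), which is why the decoder must know which menu is currently displayed.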
To record the responses evoked by the visual stimulus signals generated by the augmented reality device, measuring electrodes are placed at the user's occipital positions O1, Oz and O2, a reference electrode is placed on the left mastoid, and a ground electrode on the right mastoid.
Through the EEG acquisition module, the SSVEP brain-computer interface collects the EEG data of the three occipital leads O1, Oz and O2, and detects the frequency components of the steady-state visual evoked potential using canonical correlation analysis (CCA) to identify the user's gaze command.
Synchronization between the visual stimuli generated by the augmented reality device and the EEG acquisition module is realized via the UDP protocol.
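Such UDP synchronization can be as simple as a datagram carrying a stimulus-onset marker; a minimal sketch, where the message format and field names are assumptions rather than the system's actual protocol:

```python
import json
import socket
import time

def send_stim_marker(sock, addr, stim_id):
    """Send a stimulus-onset marker so the EEG recorder can segment the data."""
    payload = json.dumps({"stim": stim_id, "t": time.time()}).encode("utf-8")
    sock.sendto(payload, addr)

def recv_stim_marker(sock):
    """Receive one marker datagram and decode it."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode("utf-8"))
```

UDP is a natural fit here: markers are small, frequent, and a lost marker only invalidates one trial rather than stalling the stimulus presentation.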
Step 3: according to the command expressing the intention to grasp an object, sent by the SSVEP brain-computer interface, and the object color and position recognized by computer vision, the shared control device applies a shared control strategy of autonomous control first, direct control afterwards: the robotic arm first moves autonomously to the area above the target object, and the user then operates the arm step by step through the brain-computer interface to complete the grasp.
In this step, the processing of the shared control device is as follows:
According to the gaze command output from the first-level menu of the SSVEP brain-computer interface and the object color and position provided by computer vision, a robot command is formed and sent to the controller of the robotic arm via TCP/IP; the arm, so programmed, moves autonomously to the area above the target object.
According to the gaze command output from the second-level menu of the SSVEP brain-computer interface, a robot command is formed and sent to the controller of the robotic arm via TCP/IP, directly controlling the arm's movement to complete the grasp.
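The robot commands sent over TCP/IP might look like the following. The wire format (`MOVE x y z`, `GRIP open|close`) is purely illustrative, since the text does not specify the controller's protocol:

```python
import socket

def make_move_command(x_mm, y_mm, z_mm):
    """Format a move target as a single text line (assumed wire format)."""
    return f"MOVE {x_mm:.1f} {y_mm:.1f} {z_mm:.1f}\n".encode("ascii")

def make_grip_command(close):
    """Format a pneumatic-gripper command (assumed wire format)."""
    return b"GRIP close\n" if close else b"GRIP open\n"

def send_command(host, port, payload):
    """Open a TCP connection to the arm controller and send one command."""
    with socket.create_connection((host, port), timeout=5.0) as s:
        s.sendall(payload)
```

A level-1 gaze command would translate into one `make_move_command` toward the vision-derived hover position, while each level-2 jog command becomes a 10 mm relative move.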
The system is used as follows. After the augmented reality device generates the visual stimuli:
1. The user gazes at a target in the current menu; the visual stimuli generated by the augmented reality device are synchronized with the EEG acquisition module via the UDP protocol.
2. According to the synchronization signal, the three acquired EEG channels are segmented. For the menu currently presented to the user, the canonical correlation coefficient between the segmented EEG signals and the reference signals of each stimulation frequency is computed; the frequency with the largest correlation coefficient is identified as the frequency the user is gazing at.
3. According to the identified gaze command, the robotic arm performs the corresponding operation.
4. After the robotic arm completes the operation, the procedure returns to step 1, and steps 1 to 3 are repeated for the next command recognition.
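The overall first-autonomous-then-direct behaviour of the loop above can be summarised as a small state machine. A sketch with assumed command names; the jog direction labels and hover height are illustrative, while the 10 mm step and the two-menu structure come from the description:

```python
STEP_MM = 10.0  # per-command jog distance stated in the description
JOG = {"+x": (STEP_MM, 0, 0), "-x": (-STEP_MM, 0, 0),
       "+y": (0, STEP_MM, 0), "-y": (0, -STEP_MM, 0),
       "+z": (0, 0, STEP_MM), "-z": (0, 0, -STEP_MM)}

def shared_control_step(state, command, targets):
    """One decision step of the shared controller. Level-1 color commands
    trigger an autonomous coarse move above the target; level-2 commands jog
    the end-effector 10 mm at a time or actuate the gripper."""
    if state["menu"] == 1:
        if command == "to_level2":
            state["menu"] = 2
        elif command in targets:                  # color command from vision
            x, y = targets[command]
            state["pose"] = [x, y, state["hover_z"]]  # autonomous approach
    else:
        if command == "to_level1":
            state["menu"] = 1
        elif command in JOG:                      # direct fine control
            state["pose"] = [p + d for p, d in zip(state["pose"], JOG[command])]
        elif command == "close_gripper":
            state["gripper_closed"] = True
        elif command == "open_gripper":
            state["gripper_closed"] = False
    return state
```

For example, the sequence red → to_level2 → -z → close_gripper reproduces the intended workflow: an autonomous approach above the red object, two menu interactions, a descent, and a grasp.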
It should be emphasized that the embodiments described herein are illustrative rather than restrictive. The present invention therefore includes, but is not limited to, the embodiments described in this detailed description; other embodiments derived by those skilled in the art from the technical solution of the present invention also fall within the scope of protection of the present invention.
Claims (10)
1. An augmented-reality-based shared control system for a brain-controlled robotic arm, characterized by comprising a computer vision device, a steady-state visual evoked potential (SSVEP) brain-computer interface device, a shared control device, an augmented reality device and a robotic arm, wherein the computer vision device identifies the color and position of objects in the workspace; the augmented reality device generates visual stimulus signals and presents them to the user; the SSVEP brain-computer interface device acquires the user's EEG signals and converts them into brain control commands; the shared control device controls the robotic arm according to a shared control strategy; and the robotic arm grasps the target object under the control of the shared control device.
2. The augmented-reality-based shared control system for a brain-controlled robotic arm according to claim 1, characterized in that the augmented reality device uses HoloLens holographic glasses.
3. The augmented-reality-based shared control system for a brain-controlled robotic arm according to claim 1, characterized in that the computer vision device photographs the workspace through a camera fixed on the end-effector of the robotic arm.
4. The augmented-reality-based shared control system for a brain-controlled robotic arm according to claim 1, characterized in that the augmented reality device and the SSVEP brain-computer interface device are synchronized via the UDP protocol, and the shared control device communicates with the robotic arm via the TCP/IP protocol.
5. The augmented-reality-based shared control system for a brain-controlled robotic arm according to claim 1, characterized in that the SSVEP brain-computer interface device acquires EEG signals through measuring electrodes placed at the user's occipital positions O1, Oz and O2, with a reference electrode mounted on the left mastoid and a ground electrode on the right mastoid.
6. A control method of the augmented-reality-based shared control system for a brain-controlled robotic arm according to any one of claims 1 to 5, characterized by comprising the following steps:
Step 1: the computer vision device identifies the color and position of objects in the workspace;
Step 2: steady-state visual evoked potentials are elicited by the augmented reality device; the SSVEP brain-computer interface acquires the EEG signals and, after analysis, outputs brain control commands;
Step 3: according to the brain control command expressing the intention to grasp an object, sent by the SSVEP brain-computer interface, and the object color and position recognized by computer vision, the shared control device operates the robotic arm to grasp the target object using a shared control strategy of autonomous control first, direct control afterwards.
7. The control method according to claim 6, characterized in that the concrete implementation of step 1 comprises the following steps:
(1) the camera of the computer vision device is fixed on the end-effector of the robotic arm; when the end-effector rises to a certain height, the workspace is photographed and one frame is captured as the original image;
(2) image restoration is applied to the original image to reduce the influence of illumination, sharpen the boundaries in the image and bring the colors closer to reality; the result is saved as the restored image;
(3) the restored image is converted to HSV space for processing, the threshold of each color is adjusted, and the colors of objects in the workspace are identified by color thresholding; from the camera's shooting height and the actual coordinates of marker points in the workspace, the positions of the objects in the workspace are calculated.
8. The control method according to claim 6, characterized in that in step 2 the augmented reality device encodes different gaze commands with different stimulation frequencies, and the stimulation interface of the augmented reality device consists of two hierarchical menus: the first-level menu is used to send commands expressing the intention to operate an object of a certain color, and the second-level menu is used to send gaze commands that directly operate the robotic arm.
9. The control method according to claim 6, characterized in that in step 2 the SSVEP brain-computer interface acquires the EEG data of the user's three occipital leads O1, Oz and O2, and detects the frequency components of the steady-state visual evoked potential using canonical correlation analysis to identify the user's gaze command.
10. The control method according to claim 6, characterized in that the processing of the shared control device in step 3 is as follows:
(1) according to the gaze command output from the first-level menu of the SSVEP brain-computer interface and the object color and position provided by computer vision, a robot command is formed and sent to the robotic arm via TCP/IP, and the robotic arm moves autonomously to the area above the target object;
(2) according to the gaze command output from the second-level menu of the SSVEP brain-computer interface, a robot command is formed and sent to the robotic arm via TCP/IP, directly controlling the arm's movement to complete the grasp.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910416699.2A CN110134243A (en) | 2019-05-20 | 2019-05-20 | A kind of brain control mechanical arm shared control system and its method based on augmented reality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910416699.2A CN110134243A (en) | 2019-05-20 | 2019-05-20 | A kind of brain control mechanical arm shared control system and its method based on augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110134243A true CN110134243A (en) | 2019-08-16 |
Family
ID=67571259
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910416699.2A Pending CN110134243A (en) | 2019-05-20 | 2019-05-20 | A kind of brain control mechanical arm shared control system and its method based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110134243A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105549743A (en) * | 2016-01-18 | 2016-05-04 | 中国医学科学院生物医学工程研究所 | Robot system based on brain-computer interface and implementation method |
CN106671084A (en) * | 2016-12-20 | 2017-05-17 | 华南理工大学 | Mechanical arm self-directed auxiliary system and method based on brain-computer interface |
CN107885124A (en) * | 2017-11-21 | 2018-04-06 | 中国运载火箭技术研究院 | Brain eye cooperative control method and system in a kind of augmented reality environment |
CN109366508A (en) * | 2018-09-25 | 2019-02-22 | 中国医学科学院生物医学工程研究所 | A kind of advanced machine arm control system and its implementation based on BCI |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110597201A (en) * | 2019-09-05 | 2019-12-20 | 佛山市云米电器科技有限公司 | Intelligent household system controlled by AR equipment and control method thereof |
CN110956701A (en) * | 2019-10-24 | 2020-04-03 | 中国人民解放军军事科学院国防科技创新研究院 | Life support system and life support method |
CN111728608A (en) * | 2020-06-29 | 2020-10-02 | 中国科学院上海高等研究院 | Augmented reality-based electroencephalogram signal analysis method, device, medium and equipment |
CN113625769A (en) * | 2021-09-07 | 2021-11-09 | 中国人民解放军军事科学院军事医学研究院 | Unmanned aerial vehicle formation multi-mode control system based on electroencephalogram signals |
CN113625769B (en) * | 2021-09-07 | 2024-05-10 | 中国人民解放军军事科学院军事医学研究院 | Unmanned aerial vehicle formation multi-mode control system based on electroencephalogram signals |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110134243A (en) | A kind of brain control mechanical arm shared control system and its method based on augmented reality | |
US11602300B2 (en) | Brain-computer interface based robotic arm self-assisting system and method | |
CN111631726B (en) | Upper limb function evaluation device and method and upper limb rehabilitation training system and method | |
CN103838378B (en) | A kind of wear-type eyes control system based on pupil identification positioning | |
CN108392795B (en) | A kind of healing robot Multimode Controlling Method based on multi-information fusion | |
CN106020470B (en) | Adaptive domestic environment control device and its control method based on brain-computer interface | |
CN101889928B (en) | Head gesture recognition technology-based wheelchair control method | |
CN109366508A (en) | A kind of advanced machine arm control system and its implementation based on BCI | |
CN109623835B (en) | Wheelchair manipulator system based on multi-mode information fusion | |
CN106774856B (en) | Exchange method and interactive device based on lip reading | |
CN105549743A (en) | Robot system based on brain-computer interface and implementation method | |
CN110442232A (en) | The wearable augmented reality robot control system of joint eye movement and brain-computer interface | |
CN106504751A (en) | Self adaptation lip reading exchange method and interactive device | |
Huang et al. | An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries | |
CN105710885A (en) | Service-oriented movable manipulator system | |
CN108646915A (en) | The method and system of object is captured in conjunction with three-dimensional eye tracking and brain-computer interface control machinery arm | |
CN111096796B (en) | Full-automatic vein puncture robot multilayer control system | |
CN112348851B (en) | Moving target tracking system and mixed reality operation auxiliary system | |
CN111571587B (en) | Brain-controlled mechanical arm dining assisting system and method | |
CN107589628A (en) | A kind of holographic projector and its method of work based on gesture identification | |
CN109011097A (en) | A kind of recovery training method based on Mental imagery and augmented reality | |
Zhang et al. | Study on robot grasping system of SSVEP-BCI based on augmented reality stimulus | |
CN111399652A (en) | Multi-robot hybrid system based on layered SSVEP and visual assistance | |
CN109223441A (en) | A kind of human upper limb rehabilitation training and movement auxiliary system based on Kinect sensor | |
CN106214163A (en) | The artificial psychology of a kind of lower limb malformation postoperative straightening rehabilitation teaches device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20190816 |