CN110209073A - Augmented-reality-based brain-computer interaction manned mobile platform system - Google Patents
- Publication number
- CN110209073A (application CN201910466601.4A)
- Authority
- CN
- China
- Prior art keywords
- mobile platform
- augmented reality
- control
- interface
- brain
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Abstract
The invention provides an augmented-reality-based brain-computer interaction manned mobile platform system that analyzes the live environment facing the subject through augmented reality in order to determine which options are not needed. A brain-computer interface system should have the ability to determine the interaction content between subject and system autonomously, rather than relying solely on a fixed pattern to determine the human-machine interaction content. By dynamically adjusting the user interface according to the actual control environment, instead of blindly using a preset flashing pattern, irrelevant information can largely be excluded, further improving interaction efficiency.
Description
Technical field
The invention relates to an augmented-reality-based brain-computer interaction manned mobile platform system that analyzes the live environment facing the subject through augmented reality in order to determine which options are not needed. By dynamically adjusting the user interface according to the actual control environment, instead of blindly using a preset flashing pattern, irrelevant information can largely be excluded, further improving interaction efficiency.
Background technique
In recent years, manned mobile platforms based on brain-computer interface technology have developed rapidly, and many prototype equipment-control systems have been built. Typically, an application system based on the visual P300 paradigm needs a user interface to help evoke the P300 potential from the subject, so that output commands can control external devices. The user interface induces the signal by superimposing flashing stimuli on the control commands presented to the subject; it therefore serves as the interactive window between the subject and the controlled object.

In traditional application systems, most solutions present control commands through a static user interface. A static user interface is one in which the positions at which control commands are presented are fixed, and the number and type of commands available to the subject do not change.
A typical use of a static user interface is a subject sitting in an intelligent wheelchair who issues motion-control commands by attending to a computer screen placed in front of the chair. To move the wheelchair, the subject must focus on the control commands presented on the screen and issue them one at a time, such as move one step, move forward one step, or turn left 45 degrees. Clearly, this control mode needs a long time to complete a single long-range motion task. Because the EEG signal itself does not have a high signal-to-noise ratio, its classification still carries a non-trivial error rate, and the more commands a task requires, the greater the probability that the system produces an erroneous control command. This step-by-step control mode is therefore not an efficient control method, and it also tends to impose a heavy mental burden on the subject.
As applications based on brain-computer interfaces continue to expand, the number of control commands keeps growing. For a P300 system, presenting all command options to the subject directly easily leads to low control efficiency. To address this problem, some research has proposed a region-based control paradigm, in which control commands are divided into groups by functional similarity and command output is split into two stages: in the first stage the subject selects the group the command belongs to; in the second stage the user interface enters the selected group and the subject makes the specific command selection. In an embedded web-server application of the region-based control paradigm, the system integrated many control commands while still maintaining good control efficiency.
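The two-stage, region-based selection described above can be sketched as a grouped command table. This is an illustrative example only; the group names and commands are assumptions, not taken from the patent:

```python
# Hypothetical sketch of region-based command selection: commands are
# grouped by functional similarity, the subject first selects a group
# (stage 1), then a specific command inside it (stage 2).
COMMAND_GROUPS = {
    "motion": ["forward", "left", "right", "turn_left_45", "turn_right_45"],
    "appliance": ["tv_on", "tv_off", "ac_on", "ac_off"],
}

def select_command(group_choice: str, command_index: int) -> str:
    """Stage 1: pick a group; stage 2: pick a command within that group."""
    group = COMMAND_GROUPS[group_choice]
    return group[command_index]
```

Splitting the output into two short selections keeps each flashing matrix small, which is the efficiency gain the region-based paradigm aims for.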
Summary of the invention
The object of the invention is that a brain-computer interface system should have the ability to determine the interaction content between the subject and the system autonomously, rather than relying solely on a fixed pattern to determine the human-machine interaction content. By dynamically adjusting the user interface according to the actual control environment, instead of blindly using a preset flashing pattern, irrelevant information can largely be excluded, further improving interaction efficiency.
To achieve the above technical purpose, the technical scheme of the invention is as follows:
An augmented-reality-based brain-computer interaction manned mobile platform system comprises a brain-computer interface module and controlled external equipment. The brain-computer interface module comprises an EEG acquisition system, a core control system, an augmented reality device and an environment perception system; the controlled external equipment comprises a manned mobile platform and controlled electrical appliances.

The environment perception system obtains environment-understanding information about the system's surroundings by perceiving the surrounding environmental conditions; this information includes the movable-area information of the manned mobile platform. At the same time, the environment perception system obtains the executable-operation information of the currently controllable external equipment by perceiving that equipment, which includes identifying the currently controllable appliances and the state and position information of each controlled external device.
The augmented reality device generates the command icons of a dynamic display interface from the environment-understanding information collected by the environment perception system and the executable-operation information of the currently controllable external equipment, and presents them to the subject. While the augmented reality device flashes the candidate command icons in turn, the EEG acquisition system continuously collects brain-activity signals; when the flashed command icon is the one the subject intends to select, the subject's brain produces a P300 signal. The EEG acquisition system picks this P300 signal out of the overall brain-activity signal and passes the control command corresponding to that command icon to the core control system.
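The flash-locked P300 detection can be illustrated with a deliberately simplified sketch: average the EEG epochs time-locked to each flash, then test the mean amplitude in the P300 latency window. This is an assumption-laden toy detector; practical systems train a classifier (e.g. LDA or SVM) on labeled epochs instead of using a fixed threshold:

```python
import numpy as np

def detect_p300(epochs: np.ndarray, fs: int = 250, threshold: float = 2.0) -> bool:
    """Toy P300 detector for illustration only.

    epochs: array of shape (n_repetitions, n_samples), each row one EEG
    epoch time-locked to a flash of the same command icon.
    Averaging over repetitions suppresses background EEG; the evoked
    P300 component survives in the ~250-450 ms window after the flash.
    """
    avg = epochs.mean(axis=0)                 # average over repetitions
    lo, hi = int(0.25 * fs), int(0.45 * fs)   # assumed P300 latency window
    return float(avg[lo:hi].mean()) > threshold
```

When the flashed icon is the intended target, the averaged window shows the evoked deflection and the detector fires; for non-target flashes the average stays near baseline.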
The core control system generates the corresponding control instruction from the decoded brain intention and outputs it to the corresponding controlled external device, realizing control of the controlled external equipment.
The movable-area information of the manned mobile platform is acquired as follows:

Several navigation target points are set in the system's surroundings, and the movable-area information of the manned mobile platform is determined by recognizing these navigation target points.

A digital label is provided on each navigation target point; its content includes the unique ID of the navigation target point and its two-dimensional coordinates. The environment perception system includes a recognition device for reading these digital labels; by extracting the label content of each navigation target point, the movable-area information of the manned mobile platform is obtained.
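The content of a navigation-point label can be modeled as a small record. Field names here are assumptions for illustration; the patent only specifies that each label carries a unique ID and a 2-D coordinate:

```python
from dataclasses import dataclass

@dataclass
class NavigationTag:
    """Illustrative digital label on a navigation target point:
    a unique ID plus a two-dimensional coordinate."""
    tag_id: str
    x: float
    y: float

def movable_area(tags: list) -> list:
    """The movable-area information is the set of reachable coordinates
    recovered from the recognized labels (simplified sketch)."""
    return [(t.x, t.y) for t in tags]
```

In practice the recognition device (e.g. a camera reading fiducial markers) would populate such records; here they are constructed directly.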
The executable-operation information of the currently controllable external equipment is acquired as follows:

Digital labels are provided on the controlled electrical appliances and on the manned mobile platform. The label content includes the name of the controlled external device, its unique ID, its two-dimensional coordinates and its executable-operation information; the label of the manned mobile platform additionally includes three-dimensional pose information.

The environment perception system includes a recognition device for reading these digital labels; by extracting the label content of each controlled external device, the executable-operation information of the currently controllable external equipment is obtained.
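A device label carries more fields than a navigation label. The structure below is a sketch with assumed field names; only the listed contents (name, unique ID, 2-D coordinate, executable operations, and a 3-D pose for the platform's own label) come from the text:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DeviceTag:
    """Illustrative digital label for a controlled external device."""
    name: str                 # e.g. "TV", "air conditioner"
    device_id: str            # unique ID of the device
    coord: tuple              # two-dimensional coordinate (x, y)
    operations: list = field(default_factory=list)  # executable operations
    pose_3d: Optional[tuple] = None  # only the mobile platform's label carries a 3-D pose
```

The dynamic interface can then be built directly from the `operations` lists of whichever labels the perception system currently sees, which is how scene-irrelevant options get excluded.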
The dynamic display interface of the augmented reality device is as follows:

It comprises a scene interface, a manned mobile platform control interface and an appliance control interface; while the system runs, the three interfaces are called alternately, jumping between interfaces according to the subject's selections.

The manned mobile platform control interface is used to control the manned mobile platform and can jump to the scene interface and the appliance control interface.

The scene interface is built from the subject's visual-field scene and is generated by selecting potential targets of interest from that scene; it changes dynamically with the subject's visual field.

The appliance control interface is the interface through which the subject controls external equipment. The subject selects one external device in the dynamic scene interface as the current control target; the corresponding appliance control interface is then opened, and the current control target is operated through the commands on that interface.
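The alternation between the three interfaces can be sketched as a small transition table. The transition names are assumptions drawn from the text (the patent does not name its events):

```python
# Minimal state-machine sketch of the three alternating interfaces.
# States: "platform" (platform control), "scene", "appliance".
TRANSITIONS = {
    ("platform", "open_scene"): "scene",        # platform interface can open the scene
    ("platform", "open_appliance"): "appliance",  # ...or the appliance interface
    ("scene", "select_device"): "appliance",    # selecting a device opens its controls
    ("scene", "back"): "platform",
    ("appliance", "back"): "scene",
}

def next_interface(current: str, selection: str) -> str:
    """Jump between interfaces according to the subject's selection;
    unknown selections keep the current interface."""
    return TRANSITIONS.get((current, selection), current)
```

Modeling the jumps explicitly makes it easy to guarantee that only the commands of the active interface are ever flashed to the subject.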
The manned mobile platform control interface includes commands for controlling the posture of the manned mobile platform, a command for opening the scene interface, a suspend-system command and a restart-system command.

The posture-control commands comprise step-by-step commands and an autonomous-motion command.

There are five step-by-step commands: left, right, forward, turn left 45 degrees and turn right 45 degrees; they adjust the posture of the manned mobile platform.

There is one autonomous-motion command: the system gives a navigation-target-point coordinate and the manned mobile platform moves autonomously to the designated position. Once the platform starts autonomous motion, the system accepts no new control command other than the suspend-system command.
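The command set and the lock-out rule during autonomous motion can be sketched as follows (identifiers are illustrative, not from the patent):

```python
from enum import Enum

class StepCommand(Enum):
    """The five step-by-step posture commands listed above."""
    LEFT = "left"
    RIGHT = "right"
    FORWARD = "forward"
    TURN_LEFT_45 = "turn_left_45"
    TURN_RIGHT_45 = "turn_right_45"

def accept_command(cmd: str, autonomous_active: bool) -> bool:
    """While the platform is moving autonomously, only the suspend-system
    command is accepted; every other command is rejected (simplified)."""
    if autonomous_active:
        return cmd == "suspend"
    return True
```

Gating the dispatcher this way matches the rule that, after autonomous motion starts, the system no longer takes new control commands except suspend.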
The appliance control interface includes the commands with which the subject controls external equipment; when the subject selects an external device in the dynamic scene interface, the appliance control interface of that device is called for the relevant manipulation.
The communication between the augmented reality device, the environment perception system and the EEG acquisition system works as follows:

The augmented reality device, environment perception system and EEG acquisition system are interconnected in one local area network via wireless communication. Each part executes its own main task independently and carries out communication tasks with the others.

The augmented reality device, environment perception system and EEG acquisition system rely on multithreading: the main task and the communication task each occupy their own thread.

When communication tasks are executed, the EEG acquisition system acts as the server in the local area network, while the augmented reality device and the environment perception system connect to it as two independent clients.

The EEG acquisition system obtains perception information from the environment perception system and encodes it for the core control system; the core control system presents the P300 stimuli to the augmented reality device, providing the subject with human-computer interaction. After the subject makes a choice, the core control system determines the P300 stimulus and, by fusing the environment-understanding information presented by the environment perception system with the executable operations, determines the control object the subject selected in the scene.
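The one-server, two-client LAN topology with a dedicated communication thread per connection can be sketched as below. The port number and the message format are assumptions; the patent only specifies the roles (EEG system as server, AR device and perception system as clients) and the one-thread-per-task rule:

```python
import socket
import threading

def handle_client(conn, received):
    """Communication task for one client connection, run in its own thread."""
    data = conn.recv(1024)
    received.append(data.decode())
    conn.close()

def run_server(host="127.0.0.1", port=53533, n_clients=2, received=None):
    """EEG-acquisition side: acts as the LAN server; the AR device and the
    perception system connect as two independent clients."""
    if received is None:
        received = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(n_clients)
    workers = []
    for _ in range(n_clients):
        conn, _addr = srv.accept()
        # each connection gets its own communication thread, so the main
        # task and the communication task never block each other
        w = threading.Thread(target=handle_client, args=(conn, received))
        w.start()
        workers.append(w)
    for w in workers:
        w.join()
    srv.close()

def run_client(name, host="127.0.0.1", port=53533):
    """AR device or perception system announcing itself to the server."""
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect((host, port))
    cli.sendall(name.encode())
    cli.close()
```

Keeping the EEG system as the single server gives it a natural place to fuse the perception stream with the subject's P300 selections before forwarding commands to the core control system.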
The invention achieves the following technical effects: the control process is simplified, freeing the subject from tedious command-issuing tasks and improving control efficiency; thanks to augmented reality technology, the subject's sense of immersion is strong and the user experience is good; and the dynamic user-control interface based on environment perception is adjusted by the actual control environment, excluding the preset options that are irrelevant in the current scene and further improving the efficiency of human-computer interaction.
Detailed description of the invention
Fig. 1 is a structural diagram of an augmented-reality-based brain-computer interaction manned mobile platform system of the invention.
Specific embodiment
As shown in Fig. 1, an augmented-reality-based brain-computer interaction manned mobile platform system comprises a brain-computer interface module 1 and controlled external equipment 7. The brain-computer interface module 1 comprises an EEG acquisition system 2, a core control system 3, an augmented reality device 4 and an environment perception system 5; the controlled external equipment comprises a manned mobile platform 6 and controlled electrical appliances.

For the augmented reality device 4, three mainstream display platforms are currently on the market: immersive display, monocular display and binocular display. Immersive displays provide the wearer with a three-dimensional visual experience through graphics technology and give a stronger sense of immersion. Monocular displays provide an augmented-reality display for one eye only; binocular displays, in contrast, provide a display for both eyes. The main representative of immersive display glasses is Microsoft's HoloLens, which is also a kind of binocular display; the main representative of monocular display platforms is Google's Google Glass; the main representative of binocular display platforms is Epson's Moverio BT series glasses. The manned mobile platform 6 can be a wheelchair. The controlled external equipment 7 includes an air conditioner 8, a television 9 and other electrical appliances 10.
The environment perception system 5 obtains environment-understanding information 11 about the system's surroundings by perceiving the surrounding environmental conditions; this information includes the movable-area information of the manned mobile platform 6. At the same time, the environment perception system 5 obtains the executable-operation information of the currently controllable external equipment 7 by perceiving that equipment, which includes identifying the currently controllable external equipment 7 and the state and position information of each controlled external device 7.

The augmented reality device 4 generates the command icons 16 of the dynamic display interface from the environment-understanding information collected by the environment perception system 5 and the executable-operation information 12 of the currently controllable external equipment, and presents them to the subject. While the augmented reality device flashes the candidate command icons in turn, the EEG acquisition system 2 continuously collects brain-activity signals; when the flashed command icon is the one the subject intends to select, the subject's brain produces a P300 signal 14. The EEG acquisition system 2 picks this P300 signal 14 out of the overall brain-activity signal and passes the control instruction 15 corresponding to that command icon to the core control system 3.
The core control system 3 generates the corresponding control instruction from the decoded brain intention and outputs it to the corresponding controlled external device 7, realizing control of the controlled external equipment 7.
The movable-area information of the manned mobile platform 6 is acquired as follows:

Several navigation target points are set in the system's surroundings, and the movable-area information of the manned mobile platform 6 is determined by recognizing these navigation target points.

A digital label is provided on each navigation target point; its content includes the unique ID of the navigation target point and its two-dimensional coordinates. The environment perception system 5 includes a recognition device for reading these digital labels; by extracting the label content of each navigation target point, the movable-area information of the manned mobile platform 6 is obtained.
The executable-operation information of the currently controllable external equipment is acquired as follows:

Digital labels are provided on the controlled external equipment 7 and on the manned mobile platform 6. The label content includes the name of the controlled external device 7, its unique ID, its two-dimensional coordinates and its executable-operation information; the label of the manned mobile platform 6 additionally includes three-dimensional pose information.

The environment perception system 5 includes a recognition device for reading these digital labels; by extracting the label content of each controlled external device 7, the executable-operation information of the currently controllable external equipment 7 is obtained.
The dynamic display interface of the augmented reality device 4 is as follows:

It comprises a scene interface, a control interface of the manned mobile platform 6 and an appliance control interface; while the system runs, the three interfaces are called alternately, jumping between interfaces according to the subject's selections.

The control interface of the manned mobile platform 6 is used to control the manned mobile platform 6 and can jump to the scene interface and the appliance control interface.

The scene interface is built from the subject's visual-field scene and is generated by selecting potential targets of interest from that scene; it changes dynamically with the subject's visual field.

The appliance control interface is the interface through which the subject controls external equipment. The subject selects one external device in the dynamic scene interface as the current control target; the corresponding appliance control interface is then opened, and the current control target is operated through the commands on that interface.
The control interface of the manned mobile platform 6 includes commands for controlling the posture of the manned mobile platform 6, a command for opening the scene interface, a suspend-system command and a restart-system command.

The posture-control commands comprise step-by-step commands and an autonomous-motion command.

There are five step-by-step commands: left, right, forward, turn left 45 degrees and turn right 45 degrees; they adjust the posture of the manned mobile platform 6.

There is one autonomous-motion command: the system gives a navigation-target-point coordinate and the manned mobile platform 6 moves autonomously to the designated position. Once the manned mobile platform 6 starts autonomous motion, the system accepts no new control command other than the suspend-system command.
The appliance control interface includes the commands with which the subject controls external equipment; when the subject selects an external device in the dynamic scene interface, the appliance control interface of that device is called for the relevant manipulation.
The communication between the augmented reality device 4, the environment perception system 5 and the EEG acquisition system 2 works as follows:

The augmented reality device 4, environment perception system 5 and EEG acquisition system 2 are interconnected in one local area network via wireless communication. Each part executes its own main task independently and carries out communication tasks with the others.

The augmented reality device 4, environment perception system 5 and EEG acquisition system 2 rely on multithreading: the main task and the communication task each occupy their own thread.

When communication tasks are executed, the EEG acquisition system 2 acts as the server in the local area network, while the augmented reality device 4 and the environment perception system 5 connect to it as two independent clients.

The EEG acquisition system 2 obtains perception information from the environment perception system 5 and encodes it for the core control system 3; the core control system 3 presents the P300 stimuli to the augmented reality device 4, providing the subject with human-computer interaction. After the subject makes a choice, the core control system 3 determines the P300 stimulus and, by fusing the environment-understanding information presented by the environment perception system 5 with the executable operations, determines the control object the subject selected in the scene.
Claims (10)
1. An augmented-reality-based brain-computer interaction manned mobile platform system, characterized in that:
it comprises a brain-computer interface module and controlled external equipment; the brain-computer interface module comprises an EEG acquisition system, a core control system, an augmented reality device and an environment perception system; the controlled external equipment comprises a manned mobile platform and controlled electrical appliances;
the environment perception system obtains environment-understanding information about the system's surroundings by perceiving the surrounding environmental conditions, the information including the movable-area information of the manned mobile platform; at the same time, the environment perception system obtains the executable-operation information of the currently controllable external equipment by perceiving that equipment, which includes identifying the currently controllable appliances and the state and position information of each controlled external device;
the augmented reality device generates the command icons of a dynamic display interface from the environment-understanding information collected by the environment perception system and the executable-operation information of the currently controllable external equipment and presents them to the subject; while the augmented reality device flashes the candidate command icons in turn, the EEG acquisition system continuously collects brain-activity signals, and when the flashed command icon is the one the subject intends to select, the subject's brain produces a P300 signal; the EEG acquisition system picks this P300 signal out of the overall brain-activity signal and passes the control command corresponding to that command icon to the core control system;
the core control system generates the corresponding control instruction from the decoded brain intention and outputs it to the corresponding controlled external device, realizing control of the controlled external equipment.
2. The augmented-reality-based brain-computer interaction manned mobile platform system of claim 1, characterized in that the movable-area information of the manned mobile platform is acquired as follows:
several navigation target points are set in the system's surroundings, and the movable-area information of the manned mobile platform is determined by recognizing these navigation target points;
a digital label is provided on each navigation target point, its content including the unique ID of the navigation target point and its two-dimensional coordinates; the environment perception system includes a recognition device for reading these digital labels, and by extracting the label content of each navigation target point the movable-area information of the manned mobile platform is obtained.
3. The augmented-reality-based brain-computer interaction manned mobile platform system of claim 1, characterized in that the executable-operation information of the currently controllable external equipment is acquired as follows: digital labels are provided on the controlled electrical appliances and on the manned mobile platform, their content including the name of the controlled external device, its unique ID, its two-dimensional coordinates and its executable-operation information, the label of the manned mobile platform additionally including three-dimensional pose information;
the environment perception system includes a recognition device for reading these digital labels, and by extracting the label content of each controlled external device the executable-operation information of the currently controllable external equipment is obtained.
4. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 1, characterized in that:
the dynamic display interface of the augmented reality device is as follows:
it comprises a scene interface, a manned mobile platform control interface and an appliance control interface; while the system runs, the three interfaces are invoked alternately, with jumps between them driven by the user's selections;
the manned mobile platform control interface is used to control the manned mobile platform and can jump to the scene interface or the appliance control interface;
the scene interface is built from the subject's visual field: potentially interesting targets are selected from the subject's visual field to generate it, and it changes dynamically with the subject's visual field;
the appliance control interface is the interface through which the subject controls external devices: when the subject selects a controllable device in the scene dynamic interface as the current control target, the corresponding appliance control interface is opened, and the commands on that interface operate the current control target.
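The alternation among the three interfaces can be sketched as a small state machine. The transition table below is an assumption drawn from the claim text (the claims name some jumps explicitly but do not enumerate all of them), so treat it as illustrative:

```python
from enum import Enum, auto

class Interface(Enum):
    SCENE = auto()
    PLATFORM_CONTROL = auto()
    APPLIANCE_CONTROL = auto()

# Assumed jump table: the platform control interface can jump to the other
# two (stated in the claim); the remaining transitions are plausible but
# hypothetical completions.
TRANSITIONS = {
    Interface.PLATFORM_CONTROL: {Interface.SCENE, Interface.APPLIANCE_CONTROL},
    Interface.SCENE: {Interface.PLATFORM_CONTROL, Interface.APPLIANCE_CONTROL},
    Interface.APPLIANCE_CONTROL: {Interface.SCENE, Interface.PLATFORM_CONTROL},
}

def jump(current: Interface, target: Interface) -> Interface:
    """Perform an interface jump, rejecting transitions not in the table."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot jump from {current.name} to {target.name}")
    return target

state = jump(Interface.PLATFORM_CONTROL, Interface.SCENE)
```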
5. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 4, characterized in that:
the manned mobile platform control interface includes commands for controlling the pose of the manned mobile platform, a command for opening the scene interface, a suspend-system command and a restart-system command.
6. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 4, characterized in that:
the commands for controlling the pose of the manned mobile platform include step-by-step commands and an autonomous-movement command;
there are five step-by-step commands: left, right, forward, turn left 45° and turn right 45°; the step-by-step commands adjust the pose of the manned mobile platform;
there is one autonomous-movement command: the system supplies the coordinates of a navigation target point, and the manned mobile platform moves to that position autonomously; once autonomous movement starts, the platform no longer accepts new control commands from the system, with the exception of the suspend-system command.
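The command set and the acceptance rule of claim 6 can be sketched as follows. This is a minimal illustration, assuming string command names; it is not the claimed implementation:

```python
from enum import Enum

class StepCommand(Enum):
    """The five step-by-step pose commands of claim 6."""
    LEFT = "left"
    RIGHT = "right"
    FORWARD = "forward"
    TURN_LEFT_45 = "turn left 45 deg"
    TURN_RIGHT_45 = "turn right 45 deg"

class Platform:
    """Sketch of the acceptance rule: once autonomous movement starts,
    only the suspend-system command is accepted."""
    def __init__(self):
        self.autonomous = False
        self.target = None

    def start_autonomous(self, target_xy):
        # The system supplies a navigation target point; the platform
        # then moves there autonomously.
        self.target = target_xy
        self.autonomous = True

    def accept(self, command: str) -> bool:
        if self.autonomous:
            return command == "suspend"  # suspend-system command only
        return True

p = Platform()
accepted_before = p.accept("forward")
p.start_autonomous((3.0, 4.0))
```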
7. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 4, characterized in that:
the appliance control interface includes the commands with which the subject controls an external device; when the subject selects an external device in the scene dynamic interface, the appliance control interface of that device is invoked to carry out the corresponding operations.
8. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 1, characterized in that:
the augmented reality device, the environment perception system and the EEG acquisition system communicate as follows: the three parts are interconnected over a wireless local area network; each part independently executes its own main task and additionally executes a communication task for exchanging data with the others.
9. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 8, characterized in that:
the augmented reality device, the environment perception system and the EEG acquisition system rely on multi-threaded control: in each part, the main task and the communication task each occupy a dedicated thread.
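The per-subsystem threading model of claim 9 can be sketched with two threads coupled by a queue. The task bodies are hypothetical stand-ins; only the one-thread-per-task structure comes from the claim:

```python
import queue
import threading

outbox = queue.Queue()
received = []

def main_task():
    # Stand-in for the subsystem's real work (e.g. tag recognition):
    # produce three perception results, then a sentinel.
    for i in range(3):
        outbox.put(f"perception-{i}")
    outbox.put(None)  # sentinel: main task finished

def communication_task():
    # Forward results produced by the main task until the sentinel arrives.
    while True:
        item = outbox.get()
        if item is None:
            break
        received.append(item)

# Main task and communication task each own a dedicated thread (claim 9).
t_main = threading.Thread(target=main_task)
t_comm = threading.Thread(target=communication_task)
t_main.start()
t_comm.start()
t_main.join()
t_comm.join()
```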
10. The brain-computer interaction manned mobile platform system based on augmented reality according to claim 9, characterized in that:
when the communication task executes, the EEG acquisition system acts as the server in the local area network, and the augmented reality device and the environment perception system connect to the EEG acquisition system as two independent clients;
the EEG acquisition system obtains perception information from the environment perception system, and the perception information is encoded in the core control system; the core control system presents P300 stimuli on the augmented reality device, through which the user carries out human-computer interaction;
after the subject makes a choice, the core control system identifies the selected P300 stimulus and, by fusing it with the environment-understanding information from the environment perception system and the executable operations, determines the control object the subject has selected in the scene.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910466601.4A CN110209073A (en) | 2019-05-31 | 2019-05-31 | The manned mobile platform system of brain-machine interaction based on augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110209073A true CN110209073A (en) | 2019-09-06 |
Family
ID=67789812
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910466601.4A Pending CN110209073A (en) | 2019-05-31 | 2019-05-31 | The manned mobile platform system of brain-machine interaction based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110209073A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110824979A (en) * | 2019-10-15 | 2020-02-21 | 中国航天员科研训练中心 | Unmanned equipment control system and method |
CN112859628A (en) * | 2021-01-19 | 2021-05-28 | 华南理工大学 | Intelligent home control method based on multi-mode brain-computer interface and augmented reality |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090063866A1 (en) * | 2007-08-29 | 2009-03-05 | Jiri Navratil | User authentication via evoked potential in electroencephalographic signals |
CN104083258A (en) * | 2014-06-17 | 2014-10-08 | 华南理工大学 | Intelligent wheel chair control method based on brain-computer interface and automatic driving technology |
CN106339091A (en) * | 2016-08-31 | 2017-01-18 | 博睿康科技(常州)股份有限公司 | Augmented reality interaction method based on brain-computer interface wearing system |
CN106681494A (en) * | 2016-12-07 | 2017-05-17 | 华南理工大学 | Environment control method based on brain computer interface |
CN107346179A (en) * | 2017-09-11 | 2017-11-14 | 中国人民解放军国防科技大学 | Multi-moving-target selection method based on evoked brain-computer interface |
Worldwide applications: 2019-05-31 — application CN201910466601.4A filed in China (CN); status: Pending.
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190906 |