CN107885124A - Brain-eye cooperative control method and system in an augmented reality environment - Google Patents

Brain-eye cooperative control method and system in an augmented reality environment Download PDF

Info

Publication number
CN107885124A
CN107885124A (application CN201711166187.2A; granted as CN107885124B)
Authority
CN
China
Prior art keywords
eye
augmented reality
helmet
brain
mechanical arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711166187.2A
Other languages
Chinese (zh)
Other versions
CN107885124B (en)
Inventor
代京
王振亚
刘冬
王琳娜
李旗挺
程奇峰
张宏江
袁本立
宋盛菊
雍颖琼
阳佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Academy of Launch Vehicle Technology CALT
Original Assignee
China Academy of Launch Vehicle Technology CALT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Academy of Launch Vehicle Technology CALT filed Critical China Academy of Launch Vehicle Technology CALT
Priority to CN201711166187.2A priority Critical patent/CN107885124B/en
Publication of CN107885124A publication Critical patent/CN107885124A/en
Application granted granted Critical
Publication of CN107885124B publication Critical patent/CN107885124B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/25 Pc structure of the system
    • G05B2219/25257 Microcontroller

Abstract

A brain-eye cooperative control method and system for an augmented reality environment, aimed at the portable-operation needs of astronauts. A helmet-mounted display and a viewpoint-tracking eye tracker are first combined in an integrated structural design, and an embedded augmented reality operating environment matched to the digital helmet is developed. Eye tracking is used to select a degree of freedom of a mechanical arm and to pre-select an action; recognition of the steady-state visually evoked control paradigm through a brain-computer interface then provides stable intention interpretation and fine control of the mechanical arm's six-degree-of-freedom actions. Finally, under the integrated digital helmet, EEG signals and the visual tracking signal cooperatively control the controlled object. Compared with the prior art, the present invention overcomes the weak control capability, robustness, and stability of EEG signals used alone, and has good practical value.

Description

Brain-eye cooperative control method and system in an augmented reality environment
Technical field
The present invention relates to the field of human-machine fusion control, and in particular to a brain-eye cooperative control method and system in an augmented reality environment.
Background technology
In China's future manned spaceflight projects, on-orbit tasks for astronauts (such as on-orbit servicing, assembly, and maintenance of spacecraft for the space station and large space science platforms) are steadily increasing. Mission requirements for astronauts to perform spacewalks in space suits keep rising, and the complexity and difficulty of these operations will far exceed current levels.
During extravehicular activity, astronauts face limited individual capability and restricted movement under weightlessness, constrained conventional control channels inside the space suit, insufficient control channels for auxiliary operation systems, a limited operational field of view, and a lack of real-time operation guidance. For example, the International Space Station provides complex operation tools such as a large robotic arm for on-orbit spacecraft servicing; the Chinese space station has also deployed a large robotic arm, with a small robotic arm to follow for on-orbit servicing operations. However, no mechanical device has yet been reported that assists astronauts with extravehicular operations.
Summary of the invention
The technical problem solved by the present invention is: overcoming the deficiencies of the prior art by providing a brain-eye cooperative control method and system in an augmented reality environment, solving the problem of insufficient operation information in astronauts' extravehicular human-machine collaborative work, and enhancing the "human-machine-environment" performance of astronauts' in-orbit operations.
The technical solution of the present invention is: a brain-eye cooperative control method in an augmented reality environment, comprising the following steps:
(1) Build a structurally integrated helmet-mounted display and viewpoint-tracking eye tracker to obtain a digital helmet. The digital helmet integrates eye tracking and augmented reality modules and provides the spacecraft working scene under augmented reality; the helmet-mounted display provides display of the augmented reality scene; the viewpoint-tracking eye tracker tracks eyeball rotation and gaze, enabling eye-movement control;
(2) Build an integrated embedded device that matches the digital helmet and can present the augmented reality operating environment;
(3) Read the visual tracking signal from the viewpoint-tracking eye tracker to select a degree of freedom of the mechanical arm and pre-select an action based on eye tracking; then read EEG signals from the brain-computer interface in the digital helmet to correct and control the six-degree-of-freedom actions of the mechanical arm.
The viewpoint-tracking eye tracker includes modular holographic lenses, the helmet-mounted display includes an EEG signal extraction device, and the motion of the mechanical arm is displayed in the virtual simulation environment built by the integrated embedded device.
The integrated embedded device can establish a virtual, visualized space background environment.
The viewpoint-tracking eye tracker is provided with a corneal-reflection-based eye tracking algorithm and establishes a mechanical arm degree-of-freedom triggering and control mode based on gaze and eyeball sliding trajectory.
The helmet-mounted display is provided with EEG-triggered six-degree-of-freedom control modes for the mechanical arm at different frequencies.
The viewpoint-tracking eye tracker takes the Purkinje spot as a reference, computes the positions of the Purkinje spot and the pupil within the iris from grey values, and thereby obtains the gaze direction.
The helmet-mounted display determines the pointed-at region using an LDA model.
A brain-eye cooperative control system in an augmented reality environment includes a digital helmet and an integrated embedded device; the digital helmet includes a helmet-mounted display and a viewpoint-tracking eye tracker, wherein:
the helmet-mounted display provides display of the augmented reality scene, and reads EEG signals through the brain-computer interface to correct and control the six degrees of freedom of the mechanical arm;
the viewpoint-tracking eye tracker tracks eyeball rotation and gaze, enabling eye-movement control, and its visual tracking signal is read to select a degree of freedom of the mechanical arm and pre-select an action based on eye tracking;
the integrated embedded device matches the digital helmet, displays the augmented reality operating environment, and establishes a virtual, visualized space background environment.
The viewpoint-tracking eye tracker includes modular holographic lenses, the helmet-mounted display includes an EEG signal extraction device, and the motion of the mechanical arm is displayed in the virtual simulation environment built by the integrated embedded device.
The helmet-mounted display is provided with EEG-triggered six-degree-of-freedom control modes for the mechanical arm at different frequencies and determines the pointed-at region using an LDA model.
Compared with the prior art, the advantages of the present invention are:
(1) The present invention proposes a new human-machine fusion intelligent control framework that combines multiple interaction means such as brain control, eye control, and augmented reality. It provides astronauts with a new control channel for auxiliary operation systems, builds a fully visualized working scene and a convenient way to acquire auxiliary information, and significantly improves astronaut operation efficiency. Using brain-computer interaction and visual tracking technology, a new brain-eye cooperative control mode is designed and realized, focusing on solving the shortage of control channels in astronauts' extravehicular human-machine collaborative work;
(2) The present invention extends the "human-machine-environment" capability of astronauts' in-orbit operations through the development of a digital helmet based on augmented reality, which provides astronauts with an augmented reality display of the operating environment, task guidance, and operation instructions, solving the problem of insufficient operation information in extravehicular human-machine collaborative work and enhancing "human-machine-environment" performance.
Brief description of the drawings
Fig. 1 is a flowchart of the brain-eye cooperative control method in an augmented reality environment.
Embodiment
The present invention targets the application scenario of astronauts' extravehicular activity in space and innovatively proposes a new human-machine fusion intelligent control framework that combines multiple interaction means such as brain control, eye control, and augmented reality. It provides astronauts with a new control channel for auxiliary operation systems, builds a fully visualized working scene and a convenient way to acquire auxiliary information, and significantly improves astronaut operation efficiency. Using brain-computer interaction and visual tracking technology, the invention designs and realizes a new brain-eye cooperative control mode, focusing on solving the shortage of control channels in astronauts' extravehicular human-machine collaborative work. It extends the "human-machine-environment" capability of in-orbit operations through a digital helmet based on augmented reality, which provides an augmented reality display of the operating environment, task guidance, and operation instructions, thereby solving the problem of insufficient operation information and enhancing "human-machine-environment" performance.
1. Integrated structural design of the space helmet display and viewpoint-tracking eye tracker
The display function of the digital helmet presents instructions for brain-computer interaction; through visual tracking detection and brain-computer signal detection, control intentions are parsed. The digital helmet is the hardware platform for brain-eye cooperative control.
1) Integrated display and eye-tracking hardware design
The digital helmet display in the present invention is an independent computer device with a built-in CPU, GPU, and dedicated holographic processor. A transparent display screen is embedded in the dark lenses, a stereo sound system lets the operator hear voice information while observing, and a full set of built-in sensors supports the helmet's various functions. The two transparent lenses display separate content for the left and right eyes, which the user's brain fuses into a realistic 3D holographic image while the user can still see objects in real space. The computer on the helmet performs holographic computation independently, and the user interacts with the built-in multifunction computer using only gaze and voice.
The helmet has modular holographic lenses, a complex sensor array, and a processor that handles all gaze, sound, and surrounding-environment data. As a key component of the helmet, the depth-sensing camera clearly identifies the operator's eye movements.
The digital helmet further comprises a CPU, a GPU, and a holographic processing unit (HPU). The CPU and GPU are responsible only for launching programs and displaying holographic images, while the HPU processes data from the sensors and cameras in real time, ensuring a rapid reaction to all operations.
2) Visual tracking detection module
The digital helmet extracts features by actively projecting infrared light: eight low-power infrared LEDs are arranged around each eye, and the eye-tracking sensor is placed under the ophthalmic lens, so that it does not obstruct the line of sight yet can still track the wearer's pupil activity.
The detection device uses a modular design and integrates well into the helmet; the lens can be placed at a specific position according to the user's characteristics for a comfortable, clear virtual reality experience. The tracking rate reaches 120-380 Hz, which keeps up with the speed of eye movement and achieves accurate, low-latency eye tracking, maintaining image clarity while reducing motion sickness.
2. Embedded augmented reality environment design based on the digital helmet
To meet the intelligent guidance requirements of spacecraft space operations, the present invention establishes an augmented reality scene environment in the digital helmet, supporting the novel human-machine interaction control mode based on EEG signals and eye movement. The interaction scene takes the spacecraft space operating environment as its background and includes mechanical arm modeling, background modeling, control trajectory modeling, and so on.
Mechanical arm modeling: three-dimensional modeling is performed according to a real six-degree-of-freedom mechanical arm. Each degree of freedom is modeled separately, with structural details and dimensions identical to the real structure, and the parts are then assembled in order. The power and signal cables of the mechanical arm need not be modeled; this reduces the technical difficulty of animation and software development and has no influence on the action control of the arm. Each degree of freedom does not affect the state of the others during motion; however, the whole mechanical arm acts according to a bottom-up master/slave relationship: when the base rotates, it drives the parts above it; when the manipulator rotates, it drives the end-effector; but when a sub-component moves, its parent component does not follow. Each degree of freedom has two colors, a normal texture color and a highlight color. The normal texture color is rendered from the appearance of the real six-degree-of-freedom arm; the highlight color is red and indicates that the degree of freedom has been selected and will perform a stepped or continuous action.
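The bottom-up master/slave rule above can be sketched as a parent-child chain in which a joint's pose accumulates its ancestors' rotations but never its descendants'. The `Joint` class, names, and angle values below are illustrative assumptions, not the patent's modeling code.

```python
# Minimal sketch of a bottom-up master/slave joint hierarchy:
# rotating a joint moves everything above it in the chain, but a
# sub-component's motion never drags its parent along.

class Joint:
    def __init__(self, name, parent=None):
        self.name = name
        self.angle = 0.0          # local rotation of this degree of freedom
        self.parent = parent
        self.highlighted = False  # red = selected, about to move

    def world_angle(self):
        # a joint's pose accumulates every ancestor's rotation
        return self.angle + (self.parent.world_angle() if self.parent else 0.0)

# three links of a six-DOF chain, base at the bottom
base = Joint("base")
shoulder = Joint("shoulder", base)
elbow = Joint("elbow", shoulder)

base.angle = 30.0   # rotating the base drives everything above it...
elbow.angle = 10.0  # ...but moving the elbow leaves the base untouched

print(base.world_angle())   # 30.0
print(elbow.world_angle())  # 40.0
```

The design choice mirrors the text: state flows only downward-to-upward through the chain, so the animation of one degree of freedom never perturbs its master units.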
Background modeling: several celestial body models are established nearby, simulating the nine planets of the solar system running on their respective elliptical orbits. The surface textures of the celestial models are consistent with real celestial bodies, giving the operator the impression of real space. The space background is black, scattered with bright points representing distant stars, so that the overall scene is consistent with the real appearance of space. The mechanical arm stands on a celestial surface, represented by a large plane with a celestial surface texture, showing that it is in a space environment and giving the user a truly immersive impression.
Control trajectory modeling: two kinds of grasped objects are established, a sphere and a regular hexahedron. In the initial state the object rests on the ground. The grasp and put-down actions of the demonstration mode are produced by the developer as two groups of continuous mechanical arm animations; the motion trajectories obey the operating principles of the mechanical arm, with only one degree of freedom acting at a time before switching to the next. The action timing is consistent with the response of a real six-degree-of-freedom arm's actuators to the operator's commands.
3. Fusion method for EEG signals and eye movement signals
The present invention proposes an intelligent control and operation method based on brain-eye fusion. Aimed mainly at the slowness of EEG recognition during actual spacecraft operations, the method fuses eye tracking with EEG signals: viewpoint estimation first determines the approximate relative position, and EEG signal classification then achieves more accurate positioning.
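The coarse-then-fine fusion just described can be sketched as two stages: gaze picks a region, and an EEG classifier picks the target inside it. The region layout, names, and score values below are illustrative assumptions, not the patent's data.

```python
# Schematic two-stage brain-eye fusion: viewpoint estimation selects a
# coarse region, then EEG classification selects the exact target.

def select_region(gaze_xy, regions):
    # coarse stage: region whose centre is nearest the estimated viewpoint
    return min(regions, key=lambda r: (r["cx"] - gaze_xy[0]) ** 2
                                      + (r["cy"] - gaze_xy[1]) ** 2)

def classify_eeg(scores):
    # fine stage: target with the largest classifier decision value
    return max(scores, key=scores.get)

regions = [{"name": "dof1-3", "cx": 0.2, "cy": 0.5},
           {"name": "dof4-6", "cx": 0.8, "cy": 0.5}]
region = select_region((0.75, 0.4), regions)   # gaze lands near dof4-6
scores = {"d4": 0.1, "d5": 0.9, "d6": 0.3}     # EEG decision values in that region
print(region["name"], classify_eeg(scores))    # dof4-6 d5
```

The split lets the fast but coarse gaze signal prune the target set before the slower EEG classifier runs, which is the speed problem the text identifies.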
1) Eye tracking algorithm
For eye tracking, the technical principle used by the present invention is as follows. When an infrared auxiliary light source illuminates a person's face, a reflection image forms on the corneal surface; this reflection is called the Purkinje spot. When the eye fixates content at different positions, the eyeball rotates accordingly. Assuming the subject's head is still, since the position of the infrared light-emitting diode is fixed and the eyeball is approximately a sphere, the absolute position of the Purkinje spot can be considered constant during eyeball rotation, while the position of the pupil within the iris changes. The relative position of the Purkinje spot within the iris therefore changes, and this relative position can be determined by image processing.
The image processing is based on the following feature information:
the Purkinje spot is the brightest point, and the pupil is the darkest region within the iris;
the Purkinje spot is essentially a point, and the pupil is an approximate circle within the iris. The positions of the Purkinje spot and the pupil in the iris can therefore be found from the magnitude of the grey values, and the gaze direction is then derived from the relative position between them.
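A minimal sketch of this grey-value rule, assuming a single infrared glint and a near-black pupil: the brightest pixel is taken as the Purkinje spot, the centroid of the darkest pixels as the pupil centre, and their offset as the gaze direction. The threshold and the toy image are illustrative assumptions.

```python
import numpy as np

# Grey-value gaze estimation: glint = brightest pixel, pupil = darkest
# region, gaze vector = pupil centre relative to the (fixed) glint.

def estimate_gaze(eye_img):
    glint = np.unravel_index(np.argmax(eye_img), eye_img.shape)
    dark = eye_img <= eye_img.min() + 10          # pupil = near-darkest pixels
    ys, xs = np.nonzero(dark)
    pupil = (ys.mean(), xs.mean())                # centroid of the dark disc
    return float(pupil[0] - glint[0]), float(pupil[1] - glint[1])

img = np.full((9, 9), 128, dtype=np.uint8)
img[2, 6] = 255            # bright corneal glint
img[5:8, 2:5] = 5          # dark pupil disc
print(estimate_gaze(img))  # (4.0, -3.0)
```

A real tracker would add pupil-shape fitting and calibration to map this offset to screen coordinates, but the bright-point/dark-circle contrast is the same cue the text relies on.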
Eye tracking control logic: first the forward or reverse (clockwise/counterclockwise) direction of a degree of freedom is selected, and then the gaze signal of the eye determines which degree of freedom is to move. While the eye fixates the corresponding degree of freedom, that degree of freedom of the mechanical arm moves continuously.
Using eye movement signal recognition, when the gaze fixates the region of the selected degree of freedom for 3 s, the region is shown in red; if fixation then continues for another 2 s, motion of that degree of freedom is triggered. While the operator fixates the corresponding degree of freedom it keeps moving; when the operator looks away, the motion stops. In the eye-tracking control mode, the rotation of a degree of freedom is proportional to the fixation duration.
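The 3 s highlight / 2 s confirm dwell logic can be sketched as a small state function over accumulated fixation time; the per-frame tracker loop and fixation bookkeeping of a real system are simplified away, and the function name is an illustrative assumption.

```python
# Dwell-time state logic: 3 s of fixation highlights the degree-of-freedom
# region in red, 2 s more triggers its motion; looking away (fixation time
# reset to 0) returns to idle and stops the motion.

def dwell_state(fixation_time):
    if fixation_time >= 5.0:
        return "moving"       # 3 s highlight + 2 s confirmation elapsed
    if fixation_time >= 3.0:
        return "highlighted"  # region shown in red
    return "idle"

for t in (1.0, 3.5, 5.2):
    print(t, dwell_state(t))
# 1.0 idle
# 3.5 highlighted
# 5.2 moving
```

The two thresholds act as a confirmation step, which is how dwell-based interfaces avoid the "Midas touch" of triggering on every glance.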
2) EEG signal information recognition algorithm
On the basis of eye tracking, the algorithm selected for EEG signal classification in the present invention is linear discriminant analysis (LDA), also known as Fisher's linear discriminant. LDA is widely applied in P300-based brain-computer interfaces and has achieved satisfactory results. It is an effective linear feature extraction method: it extracts classification information while compressing dimensionality, projecting high-dimensional samples into an optimal discriminant vector space so that the projected samples have the best separability in the new subspace.
The LDA algorithm proceeds as follows:
Given N samples of dimension d, X = [x1, x2, ..., xN]^T, where N1 samples belong to class ω1 and the other N2 samples to class ω2. Let w denote the projection direction; after projection, a sample x is represented as y = w^T x. When x is two-dimensional, it suffices to find a line (direction w) such that the feature points are separable after projection.
After feature projection, the sign of a test sample's decision value determines its class, i.e., the corresponding control output. In the experiments of the present invention, each round of stimulation triggers a P300 signal. For online judgment, since the P300 wave is positive, the post-stimulus epochs only need to be projected with the LDA model; taking the maximum projection value then yields the control output target.
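To make the projection-and-argmax rule concrete, the sketch below builds a Fisher discriminant w from synthetic two-class data and selects the stimulus whose epoch projects highest. The data, dimensions, and class means are illustrative assumptions, not the patent's recordings.

```python
import numpy as np

# Fisher LDA in the notation above: w ~ Sw^-1 (m1 - m2) separates the two
# classes, and at test time the stimulus with the largest projection
# y = w^T x is taken as the P300 target.

def fisher_w(X1, X2):
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # within-class scatter = sum of per-class scatter matrices
    Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
    return np.linalg.solve(Sw, m1 - m2)

rng = np.random.default_rng(0)
targets = rng.normal([2.0, 2.0], 0.5, (40, 2))     # class w1: P300 present
nontargets = rng.normal([0.0, 0.0], 0.5, (40, 2))  # class w2: no P300
w = fisher_w(targets, nontargets)

# one feature vector per candidate stimulus; stimulus 1 carries the P300
epochs = np.array([[0.1, -0.2], [2.1, 1.9], [0.3, 0.1]])
print(int(np.argmax(epochs @ w)))  # 1 -> stimulus 1 is the chosen target
```

Real P300 features would be downsampled post-stimulus EEG segments rather than 2-D points, but the decision rule (largest projection wins) is the same.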
The present invention can generate the stimulation interface required for brain-computer control in the AR environment. Taking SSVEP stimulation as an example, stimulus blocks flickering at different frequencies are established in the augmented reality environment to evoke the corresponding stimulation in the operator. The stimulation should be consistent with its form on an electronic screen to guarantee its validity, and it can be presented in a menu-style layout.
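The patent does not spell out its SSVEP decoder; a common approach for such frequency-tagged stimulation is to look for the stimulus frequency with the strongest peak in the EEG spectrum. The sampling rate, flicker frequencies, and synthetic signal below are assumptions for illustration.

```python
import numpy as np

# SSVEP decoding sketch: each stimulus block flickers at its own frequency,
# and the spectral peak of the EEG identifies which block is attended.

fs, dur = 250, 2.0                     # assumed sampling rate and window
t = np.arange(0, dur, 1 / fs)
stim_freqs = [8.0, 10.0, 12.0]         # one flicker frequency per block

# synthetic EEG: a 10 Hz steady-state response buried in noise
eeg = np.sin(2 * np.pi * 10.0 * t) \
    + 0.3 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(eeg))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
print(stim_freqs[int(np.argmax(power))])  # 10.0 -> the 10 Hz block is attended
```

With a 2 s window at 250 Hz the FFT bins fall every 0.5 Hz, so each candidate flicker frequency lands exactly on a bin; practical systems often use canonical correlation analysis instead of a single-channel FFT.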
4. Verification case
For EEG signal testing, the present invention builds offline and online modules and tests the EEG signals using the leave-one-out method.
The motion states of the six degrees of freedom of the mechanical arm are as follows.
During cooperative control, the present invention sets the stopping criterion of offline training as: the ten-round average accuracy of two consecutive trials both exceeding 0.85. Meanwhile, to ensure that the offline module can model accurately without taking too long, the number of trials is kept between 5 and 20. After this offline dynamic stopping optimization, the brain-eye cooperative control accuracy of a well-trained subject reaches about 95%, and the EEG recognition accuracy exceeds 90%.
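The dynamic stopping rule can be sketched as follows, with the 0.85 threshold, ten rounds per trial, and the 5-20 trial bounds taken from the text; the accuracy sequence and function name are invented for illustration.

```python
# Offline dynamic stopping: training ends once two consecutive trials both
# have a ten-round mean accuracy above 0.85, with the trial count bounded
# between 5 and 20.

def stop_trial(round_accuracies, min_trials=5, max_trials=20):
    # round_accuracies: one list of ten per-round accuracies per trial
    for i in range(1, len(round_accuracies)):
        trial = i + 1
        if trial < min_trials:
            continue
        a, b = round_accuracies[i - 1], round_accuracies[i]
        if sum(a) / len(a) > 0.85 and sum(b) / len(b) > 0.85:
            return trial                  # criterion met: stop here
        if trial >= max_trials:
            return trial                  # hard cap reached
    return None                           # keep collecting trials

accs = [[0.7] * 10, [0.8] * 10, [0.9] * 10,
        [0.8] * 10, [0.9] * 10, [0.9] * 10]
print(stop_trial(accs))  # 6 -> trials 5 and 6 both exceed 0.85
```

Requiring two consecutive passing trials guards against stopping on a single lucky trial, which is why a one-trial threshold would be a weaker criterion.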
Content not described in detail in this specification belongs to techniques well known to those skilled in the art.

Claims (10)

1. A brain-eye cooperative control method in an augmented reality environment, characterized by comprising the following steps:
(1) building a structurally integrated helmet-mounted display and viewpoint-tracking eye tracker to obtain a digital helmet, wherein the digital helmet integrates eye tracking and augmented reality modules and provides the spacecraft working scene under augmented reality, the helmet-mounted display provides display of the augmented reality scene, and the viewpoint-tracking eye tracker tracks eyeball rotation and gaze to enable eye-movement control;
(2) building an integrated embedded device that matches the digital helmet and can present the augmented reality operating environment;
(3) reading the visual tracking signal from the viewpoint-tracking eye tracker to select a degree of freedom of the mechanical arm and pre-select an action based on eye tracking, and then reading EEG signals from the brain-computer interface in the digital helmet to correct and control the six-degree-of-freedom actions of the mechanical arm.
2. The brain-eye cooperative control method in an augmented reality environment according to claim 1, characterized in that: the viewpoint-tracking eye tracker includes modular holographic lenses, the helmet-mounted display includes an EEG signal extraction device, and the motion of the mechanical arm is displayed in the virtual simulation environment built by the integrated embedded device.
3. The brain-eye cooperative control method in an augmented reality environment according to claim 1 or 2, characterized in that: the integrated embedded device can establish a virtual, visualized space background environment.
4. The brain-eye cooperative control method in an augmented reality environment according to claim 1 or 2, characterized in that: the viewpoint-tracking eye tracker is provided with a corneal-reflection-based eye tracking algorithm and establishes a mechanical arm degree-of-freedom triggering and control mode based on gaze and eyeball sliding trajectory.
5. The brain-eye cooperative control method in an augmented reality environment according to claim 1 or 2, characterized in that: the helmet-mounted display is provided with EEG-triggered six-degree-of-freedom control modes for the mechanical arm at different frequencies.
6. The brain-eye cooperative control method in an augmented reality environment according to claim 4, characterized in that: the viewpoint-tracking eye tracker takes the Purkinje spot as a reference, computes the positions of the Purkinje spot and the pupil within the iris from grey values, and thereby obtains the gaze direction.
7. The brain-eye cooperative control method in an augmented reality environment according to claim 5, characterized in that: the helmet-mounted display determines the pointed-at region using an LDA model.
8. A brain-eye cooperative control system in an augmented reality environment, characterized by comprising a digital helmet and an integrated embedded device, the digital helmet comprising a helmet-mounted display and a viewpoint-tracking eye tracker, wherein:
the helmet-mounted display provides display of the augmented reality scene and reads EEG signals through the brain-computer interface to correct and control the six degrees of freedom of the mechanical arm;
the viewpoint-tracking eye tracker tracks eyeball rotation and gaze, enabling eye-movement control, and its visual tracking signal is read to select a degree of freedom of the mechanical arm and pre-select an action based on eye tracking;
the integrated embedded device matches the digital helmet, displays the augmented reality operating environment, and establishes a virtual, visualized space background environment.
9. The brain-eye cooperative control system in an augmented reality environment according to claim 8, characterized in that: the viewpoint-tracking eye tracker includes modular holographic lenses, the helmet-mounted display includes an EEG signal extraction device, and the motion of the mechanical arm is displayed in the virtual simulation environment built by the integrated embedded device.
10. The brain-eye cooperative control system in an augmented reality environment according to claim 8, characterized in that: the helmet-mounted display is provided with EEG-triggered six-degree-of-freedom control modes for the mechanical arm at different frequencies and determines the pointed-at region using an LDA model.
CN201711166187.2A 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment Active CN107885124B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711166187.2A CN107885124B (en) 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711166187.2A CN107885124B (en) 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment

Publications (2)

Publication Number Publication Date
CN107885124A (en) 2018-04-06
CN107885124B (en) 2020-03-24

Family

ID=61778321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711166187.2A Active CN107885124B (en) 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment

Country Status (1)

Country Link
CN (1) CN107885124B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11243400B1 (en) * 2020-07-17 2022-02-08 Rockwell Collins, Inc. Space suit helmet having waveguide display

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103293673A (en) * 2013-06-03 2013-09-11 卫荣杰 Cap integrated with display, eye tracker and iris recognition instrument
CN103955269A (en) * 2014-04-09 2014-07-30 天津大学 Intelligent-glasses brain-computer interface method based on a virtual reality environment
CN105710885A (en) * 2016-04-06 2016-06-29 济南大学 Service-oriented mobile manipulator system
US20160187654A1 (en) * 2011-02-28 2016-06-30 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
CN106671084A (en) * 2016-12-20 2017-05-17 华南理工大学 Autonomous mechanical-arm assistance system and method based on a brain-computer interface
CN106774885A (en) * 2016-12-15 2017-05-31 上海眼控科技股份有限公司 A vehicle-mounted eye-movement control system
CN107066085A (en) * 2017-01-12 2017-08-18 惠州Tcl移动通信有限公司 A method and device for controlling a terminal based on eyeball tracking
CN107145086A (en) * 2017-05-17 2017-09-08 上海青研科技有限公司 A calibration-free gaze tracking device and method


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109582131B (en) * 2018-10-29 2021-09-07 中国航天员科研训练中心 Asynchronous hybrid brain-computer interface method
CN109582131A (en) * 2018-10-29 2019-04-05 中国航天员科研训练中心 An asynchronous hybrid brain-computer interface method and system
CN109634407A (en) * 2018-11-08 2019-04-16 中国运载火箭技术研究院 A control method based on synchronous acquisition and fusion of multimodal human-machine sensing information
CN109634407B (en) * 2018-11-08 2022-03-04 中国运载火箭技术研究院 Control method based on multi-mode man-machine sensing information synchronous acquisition and fusion
CN109875777A (en) * 2019-02-19 2019-06-14 西安科技大学 A wheelchair with an object-fetching function and its fetching control method
CN110109550A (en) * 2019-05-14 2019-08-09 太原理工大学 A VR immersive extraterrestrial-planet exploration demonstration system
CN110134243A (en) * 2019-05-20 2019-08-16 中国医学科学院生物医学工程研究所 An augmented-reality-based shared control system and method for a brain-controlled mechanical arm
CN110286755A (en) * 2019-06-12 2019-09-27 Oppo广东移动通信有限公司 Terminal control method, device, electronic equipment and computer-readable storage medium
CN110412996A (en) * 2019-06-18 2019-11-05 中国人民解放军军事科学院国防科技创新研究院 A UAV control method, device and system based on gestures and eye movement
CN111728608A (en) * 2020-06-29 2020-10-02 中国科学院上海高等研究院 Augmented reality-based electroencephalogram signal analysis method, device, medium and equipment
CN113199469A (en) * 2021-03-23 2021-08-03 中国人民解放军63919部队 Space arm system, control method for space arm system, and storage medium
EP4177714A1 (en) 2021-11-03 2023-05-10 Sony Group Corporation Audio-based assistance during extravehicular activity
CN114237388A (en) * 2021-12-01 2022-03-25 辽宁科技大学 Brain-computer interface method based on multi-mode signal recognition
CN114237388B (en) * 2021-12-01 2023-08-08 辽宁科技大学 Brain-computer interface method based on multi-mode signal identification

Also Published As

Publication number Publication date
CN107885124B (en) 2020-03-24

Similar Documents

Publication Publication Date Title
CN107885124A (en) Brain-eye cooperative control method and system in an augmented reality environment
CN110070944B (en) Social function assessment training system based on virtual environment and virtual roles
CN107656613B (en) Human-computer interaction system based on eye movement tracking and working method thereof
CN107067856B (en) Medical simulation training system and method
US20220207809A1 (en) Using three-dimensional scans of a physical subject to determine positions and/or orientations of skeletal joints in the rigging for a virtual character
US20210358214A1 (en) Matching meshes for virtual avatars
Al-Rahayfeh et al. Eye tracking and head movement detection: A state-of-art survey
Biocca et al. Immersive virtual reality technology
Biocca Virtual reality technology: A tutorial
US10235807B2 (en) Building holographic content using holographic tools
CN106873778A (en) An application running-control method, device and virtual reality device
CN103019377A (en) Head-mounted visual display equipment-based input method and device
CN106484115A (en) System and method for augmented and virtual reality
EP3759542B1 (en) Head scan alignment using ocular registration
CN107656505A (en) Use the methods, devices and systems of augmented reality equipment control man-machine collaboration
CN109773807B (en) Motion control method and robot
CN103207667A (en) Man-machine interaction control method and application thereof
CN110688910A (en) Method for realizing wearable human body basic posture recognition
CN106406501A (en) Method and device for controlling rendering
Hu et al. StereoPilot: A wearable target location system for blind and visually impaired using spatial audio rendering
Wu et al. Omnidirectional mobile robot control based on mixed reality and semg signals
Mania et al. Gaze-aware displays and interaction
CN111459276A (en) Motion capture glove of virtual human hand reality system and virtual reality system
CN112201097A (en) VR-based bionic human body model and use method thereof
Albrecht et al. Mopedt: A modular head-mounted display toolkit to conduct peripheral vision research

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant