CN107885124B - Brain and eye cooperative control method and system in augmented reality environment - Google Patents


Info

Publication number
CN107885124B
Authority
CN
China
Prior art keywords
eye
helmet
freedom
degree
mechanical arm
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN201711166187.2A
Other languages
Chinese (zh)
Other versions
CN107885124A (en)
Inventor
代京
王振亚
刘冬
王琳娜
李旗挺
程奇峰
张宏江
袁本立
宋盛菊
雍颖琼
阳佳
Current Assignee
China Academy of Launch Vehicle Technology CALT
Original Assignee
China Academy of Launch Vehicle Technology CALT
Priority date
Filing date
Publication date
Application filed by China Academy of Launch Vehicle Technology (CALT)
Priority to CN201711166187.2A
Publication of CN107885124A
Application granted
Publication of CN107885124B
Legal status: Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/25 Pc structure of the system
    • G05B2219/25257 Microcontroller

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Automation & Control Theory (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A brain-eye cooperative control method and system in an augmented reality environment. To meet astronauts' requirement for portable operation, a helmet display and a viewpoint-tracking eye tracker are designed as one integrated structure. Compared with the prior art, the method overcomes the weak control capability, poor robustness, and poor stability of control driven by the electroencephalogram signal alone, and has good practical value.

Description

Brain and eye cooperative control method and system in augmented reality environment
Technical Field
The invention relates to the field of human-machine fusion control, and in particular to a brain-eye cooperative control method and system in an augmented reality environment.
Background
In China's future manned space program, astronauts' on-orbit tasks (such as on-orbit maintenance and assembly of spacecraft, including space stations and large space science experiment platforms) will gradually increase, the demand for extravehicular activity in space suits will keep growing, and the complexity and difficulty of the operations will far exceed current levels.
During extravehicular operations an astronaut faces limited individual capability, restricted conventional control channels inside the space suit, insufficient control channels for auxiliary operation systems, an inadequate field of view, and a lack of real-time operation guidance, all under weightlessness and constrained body movement. For example, the International Space Station provides complex operation tools such as large mechanical arms for on-orbit servicing of spacecraft; China's space station likewise deploys a large mechanical arm and will subsequently deploy a small one to complete on-orbit servicing. At present, however, no mechanical device assists the astronaut's own extravehicular operations.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the defects of the prior art, the invention provides a brain-eye cooperative control method and system in an augmented reality environment, solves the problem of insufficient operation information in the astronaut's extravehicular human-machine cooperative operation, and realizes 'human-machine-environment' performance enhancement in astronaut task operations.
The technical solution of the invention is as follows: a brain-eye cooperative control method in an augmented reality environment comprises the following steps:
(1) constructing a digital helmet in which a helmet display and a viewpoint-tracking eye tracker form one integrated structure; the digital helmet integrates eye tracking and augmented reality modules and presents the spacecraft operation scene in augmented reality; the helmet display provides the display function for the augmented reality scene; the viewpoint-tracking eye tracker tracks eyeball rotation and gaze movement and realizes eye-movement control;
(2) constructing an integrated embedded device that is matched with the digital helmet and can render the augmented real operation environment;
(3) reading the visual tracking signal from the viewpoint-tracking eye tracker to realize eye-tracking-based selection of a mechanical-arm degree of freedom and preselection of its action, then reading the electroencephalogram signal from the brain-computer interface in the digital helmet to realize six-degree-of-freedom correction and action control of the mechanical arm.
The viewpoint tracking type eye tracker comprises a combined holographic lens, the helmet display comprises an electroencephalogram signal extraction device, and the motion process of the mechanical arm is displayed in a virtual simulation environment built by the integrated embedded device.
The integrated embedded device can establish a virtual visual space background environment.
The viewpoint tracking type eye tracker is provided with an eye tracking algorithm based on corneal reflection, and a mechanical arm freedom degree triggering and control mode based on gaze and eyeball sliding tracks is established.
The helmet display is provided with electroencephalogram signal mechanical arm six-degree-of-freedom triggering and control modes with different frequencies.
The viewpoint tracking type eye tracker takes the Purkinje spot as the measurement reference and calculates the positions of the Purkinje spot and the pupil on the iris from gray values, thereby obtaining the gaze direction.
The helmet display adopts an LDA model to determine the position of the pointed area.
A brain and eye cooperative control system in an augmented reality environment comprises a digital helmet and an integrated embedded device, wherein the digital helmet comprises a helmet display and a viewpoint tracking type eye tracker; wherein:
the helmet display can provide a display function for an augmented reality scene; reading an electroencephalogram signal according to a brain-computer interface to realize six-degree-of-freedom correction and action control of a mechanical arm;
the viewpoint tracking type eye tracker can track the rotation and staring motion of the eyeball and realize the control of the eye movement; reading a visual tracking signal according to a viewpoint tracking type eye tracker to realize the freedom degree selection and action preselection of the mechanical arm based on eye tracking;
the integrated embedded device is matched with the digital helmet and can enhance the display of the real operation environment and establish a virtual visual space background environment.
The viewpoint tracking type eye tracker comprises a combined holographic lens, the helmet display comprises an electroencephalogram signal extraction device, and the motion process of the mechanical arm is displayed in a virtual simulation environment built by the integrated embedded device.
The helmet display is provided with electroencephalogram signal mechanical arm six-degree-of-freedom triggering and control modes with different frequencies, and the LDA model is adopted to determine the pointed area position.
Compared with the prior art, the invention has the advantages that:
(1) the invention provides a new human-machine fusion intelligent control framework that combines multiple human-computer interaction means such as brain control, eye control, and augmented reality; it offers the astronaut a new control channel for auxiliary operation systems, builds a visual environment covering the full operation scene together with a convenient way to obtain auxiliary information, and markedly improves the astronaut's operating efficiency. Using brain-computer interaction technology and visual tracking technology, a new brain-eye cooperative control mode is designed and realized, which mainly solves the shortage of control channels in the astronaut's extravehicular human-machine cooperative work;
(2) the invention expands the 'human-machine-environment' capability in astronaut task operations by means of a digital helmet based on augmented reality technology, provides the astronaut with augmented reality display of the operation environment, operation guidance, and the operation itself, solves the shortage of operation information in the astronaut's extravehicular human-machine cooperative operation, and realizes 'human-machine-environment' performance enhancement in astronaut task operations.
Drawings
Fig. 1 is a flowchart of a brain-eye cooperative control method in an augmented reality environment.
Detailed Description
Aiming at the application scenario of astronaut extravehicular operation, the invention innovatively provides a human-machine fusion intelligent control framework that combines multiple human-computer interaction means such as brain control, eye control, and augmented reality. It offers the astronaut a new control channel for auxiliary operation systems, builds a visual environment covering the full operation scene together with a convenient way to obtain auxiliary information, and markedly improves the astronaut's operating efficiency. Using brain-computer interaction technology and visual tracking technology, the invention designs and realizes a new brain-eye cooperative control mode that mainly solves the shortage of control channels in extravehicular human-machine cooperative work and expands the 'human-machine-environment' capability of astronaut task operations. The digital helmet, developed on augmented reality technology, gives the astronaut augmented reality display of the operation environment, operation guidance, and the operation itself, solving the shortage of operation information in extravehicular human-machine cooperation and enhancing 'human-machine-environment' performance in astronaut task operations.
1. Structure integrated design of aerospace helmet display and viewpoint tracking type eye tracker
The display function of the digital helmet adopted by the invention presents instructions for brain-computer interaction; the control intention is parsed through visual tracking detection and electroencephalogram signal detection; and the digital helmet serves as the hardware platform for brain-eye cooperative control.
1) Display and eye tracking hardware integration design
The digital helmet display is an independent computer device with a built-in CPU, GPU, and dedicated holographic processor. The dark lens contains a transparent display screen, a stereo sound system lets the operator hear voice information while observing, and a complete set of sensors inside the helmet realizes the various preset functions. The two transparent lenses display the left-eye and right-eye content respectively, and the user's brain fuses them into a true 3D holographic image while objects in real space remain visible at the same time. The computer on the helmet performs holographic computation autonomously, so the user interacts with the fully functional built-in computer using only the eyes and voice.
The helmet has a combined holographic lens, a complex sensor array, and a processor that handles all of the eye, sound, and surrounding-environment data. As an important component of the helmet, the depth-sensing camera can clearly identify the operator's eye movements.
The digital helmet further contains a CPU, a GPU, and a holographic processing unit (HPU). The CPU and GPU are responsible only for launching programs and displaying holographic images, while the HPU processes data from the sensors and cameras in real time, ensuring that all operations respond quickly.
2) Visual tracking detection module
The digital helmet extracts features based on active projection of infrared light: eight low-power infrared LEDs are arranged around each eye, and the eyeball-tracking sensors are placed below the lenses, so the field of view is unaffected while the user's pupil movement is tracked.
The detection device is developed as a module so that it integrates well into the helmet; meanwhile the lenses can be placed at specific positions according to the user's characteristics for a comfortable and clear virtual-reality experience. The tracking rate reaches 120-380 Hz, fast enough to follow the motion of the eyes, realizing accurate, low-latency eyeball tracking that maintains clarity while reducing dizziness.
2. Embedded augmented reality environment design based on digital helmet
Aiming at the requirement of an intelligent control task for space operation of a spacecraft, the invention establishes an augmented reality scene environment in a digital helmet and provides support for a novel man-machine interaction control mode based on an electroencephalogram signal and an eye movement track. The interaction scene takes a spacecraft space operation environment as a background, and comprises mechanical arm modeling, background modeling, control track modeling and the like.
Mechanical arm modeling: three-dimensional modeling is carried out according to the real six-degree-of-freedom mechanical arm. Each degree of freedom is modeled separately, with the same structural details and dimensions as the real structure, and the parts are then assembled in sequence. The power and signal cables of the arm are not modeled, which reduces the technical difficulty of animation design and software development without affecting the arm's motion control. The motion of one degree of freedom does not change the state of any other, but the whole arm moves according to a bottom-up master-slave relationship: rotating the base drives everything above it, and rotating an upper joint drives the parts beyond it, while the motion of a sub-component never moves its master component. Each degree of freedom is rendered in one of two colors: the normal texture color, taken from the appearance of the real six-degree-of-freedom arm, and a highlight color. The highlight is red and indicates that the degree of freedom has been selected and a stepwise or continuous motion is about to be executed.
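The bottom-up master-slave relationship described above is, in effect, a kinematic chain of parent-child transforms. The following Python sketch illustrates the idea under stated assumptions: the joint axes, link lengths, and class names are illustrative, not the patent's implementation. Rotating one degree of freedom moves every sub-component above it while leaving its masters untouched.

```python
import numpy as np

def rot_z(theta):
    """Rotation about the joint's local z-axis (the joint axis is an assumption)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

class Joint:
    """One degree of freedom; children follow the parent (master-slave)."""
    def __init__(self, offset):
        self.offset = np.asarray(offset, dtype=float)  # position in the parent frame
        self.angle = 0.0                               # current joint rotation
        self.child = None

    def world_positions(self, parent_pos, parent_rot):
        """Propagate transforms base-to-tip: rotating a joint moves every
        sub-component above it, but a child's motion never moves its master."""
        pos = parent_pos + parent_rot @ self.offset
        rot = parent_rot @ rot_z(self.angle)
        out = [pos]
        if self.child is not None:
            out += self.child.world_positions(pos, rot)
        return out

# Six joints chained base-to-tip, 0.2 units apart (illustrative dimensions).
base = Joint([0.0, 0.0, 0.0])
node = base
for _ in range(5):
    node.child = Joint([0.2, 0.0, 0.0])
    node = node.child

base.angle = np.pi / 2   # rotating the base swings the whole arm above it
for p in base.world_positions(np.zeros(3), np.eye(3)):
    print(np.round(p, 3))
```

Because each joint only passes its accumulated transform downward, the "sub-components do not move the master" rule falls out of the recursion for free.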
Background modeling: several planet models are built nearby to simulate the nine planets of the solar system running on their respective elliptical orbits. The surface maps of the planet models match the real planets, so the operator feels present in real space. The space background is black, dotted with bright points representing distant stars, consistent with the real appearance of outer space. The mechanical arm is placed on a planet surface, represented by a large plane with a planet-surface texture map; this situates the arm in a space environment and gives the user a true sense of immersion.
Control trajectory modeling: two kinds of grasped objects, spheres and regular hexahedrons, are created and placed on the ground in the initial state. For the grasp-and-place actions in demonstration mode, the developers made the two groups of arm motions into continuous animations; the trajectories follow the arm's operating principle, with only one degree of freedom moving at a time before switching to the next. The action timing matches the response of the real six-degree-of-freedom arm to the controller.
3. Fusion method of electroencephalogram signal and eye movement signal
Mainly addressing problems such as slow electroencephalogram signal recognition during real spacecraft operation, the invention provides an intelligent control method based on brain-eye fusion that integrates eye-movement tracking with electroencephalogram signals: a rough relative position is first determined by viewpoint estimation, and electroencephalogram signal classification is then used to achieve more accurate position localization.
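A minimal Python sketch of this coarse-to-fine fusion follows. The `Region` structure and the `eeg_score` callback are hypothetical interfaces introduced for illustration; in practice the score would come from an LDA projection of that target's EEG epoch, as described later.

```python
from collections import namedtuple

# Hypothetical structure: a gaze-selectable screen region and the
# stimulus targets it contains.
Region = namedtuple("Region", "name cx cy targets")

def brain_eye_fusion_step(gaze_xy, regions, eeg_score):
    """Coarse-to-fine fusion as described above:
    (1) viewpoint estimation picks the region nearest the gaze point;
    (2) EEG classification, restricted to that region's targets,
        picks the precise target, compensating for slow EEG alone."""
    coarse = min(regions, key=lambda r: (r.cx - gaze_xy[0]) ** 2
                                        + (r.cy - gaze_xy[1]) ** 2)
    fine = max(coarse.targets, key=eeg_score)
    return coarse.name, fine

# Toy usage: two degree-of-freedom regions, each with forward/reverse targets.
regions = [Region("dof1", 100, 200, ["dof1+", "dof1-"]),
           Region("dof2", 400, 200, ["dof2+", "dof2-"])]
scores = {"dof1+": 0.2, "dof1-": 0.1, "dof2+": 1.4, "dof2-": -0.3}
print(brain_eye_fusion_step((390, 210), regions, scores.get))
# -> ('dof2', 'dof2+')
```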
1) Eye tracking algorithm
For eye tracking, the technical principle adopted by the invention is as follows. When an infrared auxiliary light source illuminates the face, a reflection forms on the corneal surface of the eye; this reflection is called the Purkinje spot. When the eyes gaze at different positions, the eyeballs rotate correspondingly. Assuming the subject's head is still, and since the position of the infrared light-emitting diode is fixed and the eyeball is approximately a sphere, the absolute position of the Purkinje spot can be regarded as unchanged as the eyeball rotates, while the position of the pupil changes correspondingly. The relative position between the Purkinje spot formed on the iris and the pupil therefore changes, and this relative position can be determined through image processing.
The image processing is based on the following characteristic information:
the Purkinje spot is the brightest region, while the pupil is the darkest region of the iris;
the Purkinje spot is essentially a point, and the pupil is approximately a circle on the iris. Therefore the positions of the Purkinje spot and the pupil on the iris can be found from gray values, and the gaze direction is then obtained from the relative position between the Purkinje spot and the pupil.
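A minimal sketch of this gray-value localization in Python follows. The intensity thresholds, image sizes, and function names are illustrative assumptions; a real system would also calibrate the offset-to-gaze mapping per user.

```python
import numpy as np

def gaze_offset(eye_gray, bright_thresh=240, dark_thresh=30):
    """Locate the Purkinje spot (brightest pixels) and the pupil (darkest
    pixels) by gray value, as described above, and return the pupil's
    offset from the Purkinje spot. Thresholds are illustrative; eye_gray
    is assumed to be a pre-cropped 2-D uint8 image of the iris region."""
    def centroid(mask):
        ys, xs = np.nonzero(mask)
        return np.array([xs.mean(), ys.mean()])

    purkinje = centroid(eye_gray >= bright_thresh)  # ~fixed while head is still
    pupil = centroid(eye_gray <= dark_thresh)       # moves as the eyeball rotates
    return pupil - purkinje   # mapped to a gaze direction after calibration

# Synthetic test image: mid-gray iris, one bright glint, one dark pupil disk.
img = np.full((120, 160), 128, dtype=np.uint8)
img[60, 80] = 255                                   # Purkinje spot at (80, 60)
yy, xx = np.ogrid[:120, :160]
img[(yy - 55) ** 2 + (xx - 95) ** 2 < 100] = 10     # pupil centred at (95, 55)
print(gaze_offset(img))                             # approximately [15, -5]
```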
Eye-tracking control logic: first the direction of motion of the degree of freedom (clockwise/anticlockwise) is selected, then the gaze signal of the eye determines which degree of freedom should move at that moment. While the eye gazes at the corresponding degree of freedom, the mechanical arm keeps moving in that degree of freedom.
With eye-movement signal recognition, when the line of sight fixates on the selected degree-of-freedom region for 3 s, the region turns red; if fixation then continues for another 2 s, the corresponding degree-of-freedom motion is triggered. While a person gazes at the corresponding degree of freedom of the arm, that degree of freedom stays in continuous motion, and the motion stops when the operator looks away. In this eye-tracking control mode, the rotation of the degree of freedom is proportional to the gaze duration.
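The dwell logic above is a small state machine. The sketch below follows the timings stated in the text (3 s to highlight, a further 2 s to trigger, release to stop); the class and region names are hypothetical.

```python
import time

class DwellController:
    """Gaze dwell logic from the text: 3 s of fixation highlights a
    degree-of-freedom region (drawn red), 2 more seconds triggers its
    motion, and motion stops as soon as the gaze leaves the region."""
    HIGHLIGHT_S, TRIGGER_S = 3.0, 2.0

    def __init__(self):
        self.region = None
        self.t0 = None

    def update(self, gazed_region, now=None):
        now = time.monotonic() if now is None else now
        if gazed_region != self.region:       # gaze moved: reset, stop motion
            self.region, self.t0 = gazed_region, now
            return "idle"
        dwell = now - self.t0
        if dwell >= self.HIGHLIGHT_S + self.TRIGGER_S:
            return "moving"                   # DoF rotates while gaze holds
        if dwell >= self.HIGHLIGHT_S:
            return "highlighted"              # region drawn in red
        return "idle"

# Rotation is proportional to how long the gaze is held past the trigger.
ctl = DwellController()
for t in (0.0, 2.0, 3.5, 5.5, 7.0):
    print(t, ctl.update("dof3", now=t))
# 0.0 idle / 2.0 idle / 3.5 highlighted / 5.5 moving / 7.0 moving
```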
2) Electroencephalogram signal information identification algorithm
On the basis of eye-movement tracking, the algorithm selected for electroencephalogram signal classification and recognition is linear discriminant analysis (LDA, also called the Fisher linear discriminant). LDA is very widely applied in the field of P300-based brain-computer interfaces and achieves satisfactory results. Linear discrimination is a well-established feature extraction method: it extracts classification information while compressing dimensionality, projecting high-dimensional samples into the optimal discriminant vector space so that the projected samples have the best separability in the new subspace.
The specific process of the LDA algorithm is as follows:
Given N samples with d-dimensional features, X = [x_1, x_2, ..., x_N]^T, of which N_1 samples belong to class ω_1 and the other N_2 samples belong to class ω_2. The projection direction is denoted ω, and the projection of a sample x is y = ω^T x. When x is two-dimensional, one only needs to find a straight line (with direction ω) that separates the feature points after projection.
After feature projection of a test sample, the sign of the summed decision values determines the class, i.e., the corresponding control output. In this experiment, one P300 signal is elicited per stimulation cycle. For online judgment, since the P300 wave is positive, each stimulation epoch is projected through the LDA model and the stimulus with the maximum projection value yields the control output target.
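A minimal sketch of this train-offline, pick-max-projection-online scheme, using scikit-learn's LDA in place of a hand-rolled Fisher discriminant. The synthetic feature vectors stand in for real post-stimulus EEG epochs; shapes and the injected deflection are assumptions for illustration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Offline training: one feature vector per stimulation epoch (e.g. a
# downsampled, flattened post-stimulus EEG window). Synthetic data here.
n_train, n_feat = 200, 48
X = rng.normal(size=(n_train, n_feat))
y = rng.integers(0, 2, n_train)      # 1 = target epoch (P300 present)
X[y == 1, :8] += 1.0                 # targets carry a positive deflection

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# Online judgment: one epoch per stimulus in the cycle; since the P300 is
# a positive wave, the stimulus with the maximum LDA projection wins.
epochs = rng.normal(size=(6, n_feat))   # one stimulation cycle, 6 stimuli
epochs[3, :8] += 1.0                    # the attended stimulus
scores = lda.decision_function(epochs)
print("control output target:", int(np.argmax(scores)))   # expected: 3
```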
The invention can generate the stimulation interface required for brain-computer control in the AR environment. For example, SSVEP stimulation in augmented reality requires stimulation blocks flashing at different frequencies to form the relevant stimuli for the operator. The stimuli should be consistent with their form on an electronic screen to ensure that the person receives them effectively, and they can be presented in a menu style.
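One common way to realize such flashing blocks is frame-based square-wave flicker locked to the display refresh. The sketch below assumes a 60 Hz refresh and illustrative block names and frequencies; none of these values come from the patent.

```python
REFRESH_HZ = 60.0   # assumed display refresh rate
FREQS = {"dof_up": 8.57, "dof_down": 10.0, "confirm": 12.0, "menu": 15.0}

def block_states(frame_idx):
    """Return the on/off state of every stimulation block for one video
    frame. Square-wave flicker: a block is lit during the first half of
    each of its periods. Frequencies and names are assumptions."""
    t = frame_idx / REFRESH_HZ
    return {name: (t * f) % 1.0 < 0.5 for name, f in FREQS.items()}

# One second of frames: each block toggles at its own frequency, producing
# the distinct steady-state responses that the EEG classifier separates.
on_counts = {name: 0 for name in FREQS}
for frame in range(int(REFRESH_HZ)):
    for name, lit in block_states(frame).items():
        on_counts[name] += lit
print(on_counts)   # roughly half the frames lit for each block
```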
4. Case verification
For electroencephalogram signal testing, the invention builds offline and online modules and tests the signals with the leave-one-out method.
The motion states of the mechanical arm's six degrees of freedom are listed in a table in the original document (present there only as images; not recoverable here).
In the cooperative control process, the stopping criterion for offline training is set as follows: the average accuracy over the ten rounds of two consecutive trials exceeds 0.85. Meanwhile, to ensure that the offline module models accurately without taking too long, the number of trials is kept between 5 and 20. After offline dynamic-stopping optimization, the brain-eye cooperative control accuracy of a fully trained subject reaches about 95%, and the electroencephalogram signal recognition accuracy exceeds 90%.
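A minimal sketch of that dynamic stopping rule, under one reading of the criterion (each entry below is the mean accuracy over the ten rounds of one trial); the function name and defaults are illustrative.

```python
def should_stop(trial_accuracies, threshold=0.85, min_trials=5, max_trials=20):
    """Offline-training stop rule from the text: stop once two consecutive
    trials average above the threshold, within the 5-20 trial budget.
    trial_accuracies[i] = mean accuracy over the ten rounds of trial i."""
    n = len(trial_accuracies)
    if n >= max_trials:
        return True                    # trial budget exhausted
    if n < max(2, min_trials):
        return False                   # not enough trials to judge yet
    return (trial_accuracies[-1] + trial_accuracies[-2]) / 2 > threshold

accs = [0.71, 0.78, 0.82, 0.86, 0.88]
print(should_stop(accs))   # True: the last two trials average 0.87
```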
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (5)

1. A brain-eye cooperative control method in an augmented reality environment is characterized by comprising the following steps:
(1) constructing a helmet display and a viewpoint tracking type eye tracker which are of an integrated structure to obtain a digital helmet; the digital helmet can integrate an eye tracking and augmented reality module and provide a spacecraft operation scene in augmented reality; the helmet display can provide a display function for an augmented reality scene; the viewpoint tracking type eye tracker can track eyeball rotation and gaze movement and realize eye tracking control; the eye tracking control logic is as follows: firstly, selecting the forward and reverse directions of the movement of the degree of freedom, then determining the degree of freedom which is required to move at the moment by using a gaze signal of an eyeball sight line, and enabling the mechanical arm to continuously move on the degree of freedom when the eye gazes at the corresponding degree of freedom;
(2) constructing an integrated embedded device which is matched with the digital helmet and can enhance the real operation environment;
(3) reading a visual tracking signal according to a viewpoint tracking type eye tracker to realize the freedom degree selection and action preselection of a mechanical arm based on eye tracking, and then reading an electroencephalogram signal according to a brain-computer interface in a digital helmet to realize the six-freedom-degree correction and action control of the mechanical arm;
the viewpoint tracking type eye tracker comprises a combined holographic lens, the helmet display comprises an electroencephalogram signal extraction device, the motion process of the mechanical arm is displayed in a virtual simulation environment built by an integrated embedded device, and different-frequency flickering interfaces for steady-state visual evoked stimulation are provided;
the viewpoint tracking type eye tracker takes the Purkinje point as a measuring reference, and calculates the positions of the Purkinje point and the pupil on the iris according to the gray value so as to obtain the input direction of the sight;
and calculating the maximum projection value by applying an LDA (linear discriminant analysis) model to the SSVEP (steady-state visual evoked potential) flicker information gazed at in the helmet display, to obtain the position of the pointed area.
2. The method according to claim 1, wherein the method comprises: the integrated embedded device can establish a virtual visual space background environment.
3. The method according to claim 1, wherein the method comprises: the viewpoint tracking type eye tracker is provided with an eye tracking algorithm based on corneal reflection, and a mechanical arm freedom degree triggering and control mode based on gaze and eyeball sliding tracks is established.
4. The method according to claim 1, wherein the method comprises: the helmet display is provided with electroencephalogram signal mechanical arm six-degree-of-freedom triggering and control modes with different frequencies.
5. A brain-eye cooperative control system in an augmented reality environment is characterized by comprising a digital helmet and an integrated embedded device, wherein the digital helmet comprises a helmet display and a viewpoint tracking type eye tracker, and the system comprises:
the helmet display can provide a display function for an augmented reality scene; reading an electroencephalogram signal by establishing a virtual stroboscopic stimulation interface, and realizing six-degree-of-freedom correction and action control of a mechanical arm;
the viewpoint tracking type eye tracker can track the rotation and staring motion of the eyeball and realize the tracking control of the eye movement; reading a visual tracking signal according to a viewpoint tracking type eye tracker to realize the freedom degree selection and action preselection of the mechanical arm based on eye tracking;
the integrated embedded device is matched with the digital helmet and can enhance the display of a real operation environment and establish a virtual visual space background environment;
the viewpoint tracking type eye tracker comprises a combined holographic lens, the helmet display comprises an electroencephalogram signal extraction device, and the motion process of the mechanical arm is displayed in a virtual simulation environment built by an integrated embedded device;
the helmet display is provided with electroencephalogram signal mechanical arm six-degree-of-freedom triggering and control modes with different frequencies, and the LDA model is adopted to determine the pointed area position.
CN201711166187.2A 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment Active CN107885124B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711166187.2A CN107885124B (en) 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711166187.2A CN107885124B (en) 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment

Publications (2)

Publication Number Publication Date
CN107885124A CN107885124A (en) 2018-04-06
CN107885124B 2020-03-24

Family

ID=61778321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711166187.2A Active CN107885124B (en) 2017-11-21 2017-11-21 Brain and eye cooperative control method and system in augmented reality environment

Country Status (1)

Country Link
CN (1) CN107885124B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109582131B (en) * 2018-10-29 2021-09-07 中国航天员科研训练中心 Asynchronous hybrid brain-computer interface method
CN109634407B (en) * 2018-11-08 2022-03-04 中国运载火箭技术研究院 Control method based on multi-mode man-machine sensing information synchronous acquisition and fusion
CN109875777B (en) * 2019-02-19 2021-08-31 西安科技大学 Fetching control method of wheelchair with fetching function
CN110109550A (en) * 2019-05-14 2019-08-09 太原理工大学 A kind of VR immersion is outer planet detection demo system
CN110134243A (en) * 2019-05-20 2019-08-16 中国医学科学院生物医学工程研究所 A kind of brain control mechanical arm shared control system and its method based on augmented reality
CN110286755B (en) * 2019-06-12 2022-07-12 Oppo广东移动通信有限公司 Terminal control method and device, electronic equipment and computer readable storage medium
CN110412996A (en) * 2019-06-18 2019-11-05 中国人民解放军军事科学院国防科技创新研究院 It is a kind of based on gesture and the unmanned plane control method of eye movement, device and system
CN111728608A (en) * 2020-06-29 2020-10-02 中国科学院上海高等研究院 Augmented reality-based electroencephalogram signal analysis method, device, medium and equipment
US11243400B1 (en) * 2020-07-17 2022-02-08 Rockwell Collins, Inc. Space suit helmet having waveguide display
CN113199469B (en) * 2021-03-23 2022-07-08 中国人民解放军63919部队 Space arm system, control method for space arm system, and storage medium
EP4177714A1 (en) 2021-11-03 2023-05-10 Sony Group Corporation Audio-based assistance during extravehicular activity
CN114237388B (en) * 2021-12-01 2023-08-08 辽宁科技大学 Brain-computer interface method based on multi-mode signal identification

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160187654A1 (en) * 2011-02-28 2016-06-30 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
CN103293673B (en) * 2013-06-03 2015-04-01 卫荣杰 Cap integrated with display, eye tracker and iris recognition instrument
CN103955269A (en) * 2014-04-09 2014-07-30 天津大学 Intelligent glass brain-computer interface method based on virtual real environment
CN105710885B (en) * 2016-04-06 2017-08-11 济南大学 Service type mobile manipulator
CN106774885A (en) * 2016-12-15 2017-05-31 上海眼控科技股份有限公司 A kind of vehicle-mounted eye movement control system
CN106671084B (en) * 2016-12-20 2019-11-15 华南理工大学 A kind of autonomous householder method of mechanical arm based on brain-computer interface
CN107066085B (en) * 2017-01-12 2020-07-10 惠州Tcl移动通信有限公司 Method and device for controlling terminal based on eyeball tracking
CN107145086B (en) * 2017-05-17 2023-06-16 上海青研科技有限公司 Calibration-free sight tracking device and method

Also Published As

Publication number Publication date
CN107885124A (en) 2018-04-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant