CN104367320B - Dominant eye determination method and apparatus - Google Patents

Dominant eye determination method and apparatus

Info

Publication number
CN104367320B
Authority
CN
China
Prior art keywords
eyes
information
leading
eye
myoelectric
Prior art date
Legal status
Active
Application number
CN201410643724.8A
Other languages
Chinese (zh)
Other versions
CN104367320A (en)
Inventor
刘浩
Current Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201410643724.8A
Publication of CN104367320A
Priority to US15/525,040
Priority to PCT/CN2015/085024 (WO2016070653A1)
Application granted
Publication of CN104367320B

Classifications

    • A61B 5/398: Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A61B 5/1101: Detecting tremor
    • A61B 5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/369: Electroencephalography [EEG]
    • A61B 5/389: Electromyography [EMG]

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

This application provides a dominant eye determination method and apparatus, and relates to the field of wearable devices. The method includes: acquiring first electrooculography (EOG) information of a first eye of a user; and determining, according to the first EOG information and a reference information, whether the first eye is the dominant eye. The method and apparatus enable other devices worn by the user to be configured automatically according to the determination result, improving the user experience.

Description

Dominant eye determination method and apparatus
Technical Field
The present application relates to the field of wearable devices, and in particular to a dominant eye determination method and apparatus.
Background
In recent years, with the development of wearable devices, smart watches, smart bracelets, smart glasses and the like have gradually entered people's lives, greatly enriching and facilitating them. Because wearable devices are small, their interaction capabilities are generally limited. Users therefore expect such devices to recognize the wearer's characteristics on their own, so as to reduce the amount of manual configuration required.
When a person looks at an object, the two eyes seldom play equal roles: one eye usually dominates to some degree and acts as the primary eye for localization and for driving binocular fusion. This eye is called the dominant eye. Eye dominance is one of the more typical human lateral-dominance characteristics. The result of a dominant eye determination can be used to improve a user's gaming experience, viewing experience and so on; for example, aiming with the dominant eye in a shooting game improves the user's immersive interactive experience.
If a wearable device can determine the user's dominant eye, the result can serve as an input to the device itself or to other devices, reducing the user's setup operations and improving the user experience.
Summary of the Invention
An object of the application is to provide a dominant eye determination method and apparatus.
According to one aspect of at least one embodiment of the application, a dominant eye determination method is provided, the method comprising:
acquiring first myoelectric information of a first eye of a user; and
determining, according to the first myoelectric information and a reference information, whether the first eye is the dominant eye.
According to another aspect of at least one embodiment of the application, a dominant eye determination apparatus is provided, the apparatus comprising:
a first acquisition module, configured to acquire first myoelectric information of a first eye of a user; and
a determination module, configured to determine, according to the first myoelectric information and a reference information, whether the first eye is the dominant eye.
According to another aspect of at least one embodiment of the application, a dominant eye determination method is provided, the method comprising:
acquiring first body-sensing information of a first eye of a user; and
determining, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye.
According to another aspect of at least one embodiment of the application, a dominant eye determination apparatus is provided, the apparatus comprising:
a first acquisition module, configured to acquire first body-sensing information of a first eye of a user; and
a determination module, configured to determine, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye.
The dominant eye determination method and apparatus of the embodiments of the application acquire first myoelectric information of a first eye of a user and then determine, according to the first myoelectric information and a reference information, whether the first eye is the dominant eye. A dominant eye determination method is thus provided, which enables devices worn by the user to be configured automatically according to the determination result, improving the user experience.
Brief Description of the Drawings
Fig. 1 is a flowchart of a dominant eye determination method according to an embodiment of the application;
Fig. 2 is a schematic comparison of the EOG signals of the dominant eye and the non-dominant eye;
Fig. 3 is a detailed flowchart of step S140a according to an embodiment of the application;
Fig. 4 is a detailed flowchart of step S140a according to another embodiment of the application;
Fig. 5 is a schematic comparison of the temperature information of the dominant eye and the non-dominant eye;
Fig. 6 is a detailed flowchart of step S140b according to an embodiment of the application;
Fig. 7 is a detailed flowchart of step S140b according to another embodiment of the application;
Fig. 8 is a schematic comparison of the EMG signals of the dominant eye and the non-dominant eye;
Fig. 9 is a detailed flowchart of step S140c according to an embodiment of the application;
Fig. 10 is a detailed flowchart of step S140c according to another embodiment of the application;
Fig. 11 is a schematic comparison of the EEG signals corresponding to the dominant eye and the non-dominant eye;
Fig. 12 is a detailed flowchart of step S140d according to an embodiment of the application;
Fig. 13 is a detailed flowchart of step S140d according to another embodiment of the application;
Fig. 14 is a flowchart of a dominant eye determination method according to an embodiment of the application;
Figs. 15 to 17 are flowcharts of dominant eye determination methods according to further embodiments of the application;
Fig. 18 is a schematic diagram of the module structure of a dominant eye determination apparatus according to an embodiment of the application;
Figs. 19 to 30 are schematic diagrams of the module structures of dominant eye determination apparatuses according to further embodiments of the application;
Fig. 31 is a schematic diagram of the hardware structure of a dominant eye determination apparatus according to an embodiment of the application.
Detailed Description of the Invention
The specific embodiments of the application are described in further detail below with reference to the accompanying drawings and embodiments. The following embodiments are intended to illustrate the application, not to limit its scope.
Those skilled in the art will understand that, in the embodiments of the application, the numbering of the following steps does not imply any order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the application.
Fig. 1 is a flowchart of a dominant eye determination method according to an embodiment of the application. The method may be implemented, for example, on a dominant eye determination apparatus. As shown in Fig. 1, the method includes:
S120: acquiring first body-sensing information of a first eye of a user;
S140: determining, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye.
The method of this embodiment acquires first body-sensing information of a first eye of a user and then determines, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye. A dominant eye determination method is thus provided, which enables the user's wearable devices to be configured automatically according to the determination result, improving the user experience.
The functions of steps S120 and S140 are described in detail below with reference to specific embodiments.
S120: acquiring first body-sensing information of a first eye of a user.
The first eye is the left eye or the right eye of the user.
The first body-sensing information may be electrooculography (EOG) information, myoelectric (EMG) information or temperature information of the first eye, or electroencephalography (EEG) information corresponding to the first eye, and may be acquired by a corresponding sensor or acquisition system. For example, the EOG information of the first eye may be acquired by at least one EOG sensor, the myoelectric information of the first eye may be acquired by at least one myoelectric sensor, the temperature information of the first eye may be acquired by at least one temperature sensor, and the EEG information corresponding to the first eye may be acquired by an EEG sensor.
The myoelectric information of the first eye may be the myoelectric information of the muscles corresponding to the first eye; the temperature information of the first eye may be the temperature of the eyeball of the first eye; when the first eye is the left eye, its corresponding EEG information may be the EEG information corresponding to the FP1 region of the brain, and when the first eye is the right eye, its corresponding EEG information may be the EEG information corresponding to the FP2 region of the brain.
S140: determining, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye.
A) The first body-sensing information may be the EOG information of the first eye, i.e. first EOG information. Step S140 is then:
S140a: determining, according to the first EOG information and a reference information, whether the first eye is the dominant eye.
The inventor has found in research that the rotation of the human eyeball essentially stays within a range of 0 to 60 degrees. When the user's eyeball rotates, a potential difference arises between the retina and the cornea of the eyeball, and an EOG sensor can record this potential difference through electrodes. When the eye-movement deflection angle is within 30 degrees, the potential difference is linearly related to the deflection angle; when the deflection angle is between 30 and 60 degrees, the potential difference follows a sinusoidal relationship with the deflection angle.
In addition, the inventor has found in the course of research that both the eye-movement frequency and the deflection angle of a person's dominant eye are higher than those of the non-dominant eye. The inventor has also found that when, over a period of time, EOG signals are collected at the same sampling frequency from corresponding positions of the user's dominant and non-dominant eyes, the average amplitude of the dominant eye's EOG signal is clearly higher than that of the non-dominant eye's EOG signal.
Fig. 2 is a schematic comparison of the EOG signals of the user's dominant eye and non-dominant eye. The abscissa represents time and the ordinate represents the amplitude of the EOG signal; the solid curve is the EOG signal of the dominant eye and the dashed curve is that of the non-dominant eye. It can be seen that the amplitude of the dominant eye's EOG signal is generally higher than that of the non-dominant eye's EOG signal. Based on this principle, the dominant eye can be determined.
In this application, the amplitude of any kind of body-sensing information (including EOG information, EEG information and myoelectric information) refers to the amplitude of the corresponding waveform and is always a non-negative value.
In one embodiment, the reference information is second EOG information of a second eye of the user, and the method may further include:
S130a: acquiring the second EOG information of the second eye of the user as the reference information.
For example, two groups of EOG sensors may be provided to collect EOG information from the first eye and the second eye of the user simultaneously, and the EOG information collected from the second eye, i.e. the second EOG information, is used as the reference information.
In this embodiment, step S140a may determine whether the first eye is the dominant eye by comparing the average amplitude of the first EOG information with the average amplitude of the second EOG information. The average amplitude of the first EOG information is the mean of the EOG amplitudes of a plurality of sampling points in the first EOG information; similarly, the average amplitude of the second EOG information is the mean of the EOG amplitudes of a plurality of sampling points in the second EOG information. Using mean values avoids a wrong determination caused by a sampling error at a single sampling point and improves the determination accuracy. Specifically, as shown in Fig. 3, step S140a may include:
S141a: in response to the average amplitude of the first EOG information being greater than the average amplitude of the second EOG information, determining that the first eye is the dominant eye;
S142a: in response to the average amplitude of the first EOG information being less than the average amplitude of the second EOG information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined from the average amplitude of the user's left-eye EOG information and the average amplitude of the user's right-eye EOG information. Specifically, as shown in Fig. 4, step S140a may include:
S141a': in response to the average amplitude of the first EOG information being greater than the threshold, determining that the first eye is the dominant eye;
S142a': in response to the average amplitude of the first EOG information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the right-eye EOG information and left-eye EOG information of the user are collected and analysed in advance. Suppose the average amplitude of the right-eye EOG information falls into a first interval (Ro_min, Ro_max), the average amplitude of the left-eye EOG information falls into a second interval (Lo_min, Lo_max), and the right eye is the dominant eye, so that Lo_max < Ro_min. The threshold M_o can then be chosen such that Lo_max < M_o < Ro_min; that is, M_o is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first EOG information is greater than the threshold M_o, it is considered to fall into the first interval and the first eye is the dominant eye of the user; if the average amplitude of the first EOG information is less than the threshold M_o, it is considered to fall into the second interval and the first eye is not the dominant eye of the user.
In general, the average amplitude of the dominant eye's EOG information is more than 5% higher than that of the non-dominant eye's EOG information, and the threshold M_o can be set appropriately on this basis.
B) The first body-sensing information may be the temperature information of the first eye, i.e. first temperature information. Step S140 is then:
S140b: determining, according to the first temperature information and a reference information, whether the first eye is the dominant eye.
The inventor has found in research that the blood supply of the eyeball comes from the ophthalmic artery. After branching off the carotid artery, the ophthalmic artery enters the orbit through the optic canal and divides into two independent systems: the central retinal vascular system, which supplies the inner layers of the retina, and the ciliary vascular system, which supplies the parts of the eyeball not supplied by the central retinal artery. The inventor has found that, as a result of natural evolution, the vessels of the central retinal vascular system and the ciliary vascular system of the dominant eye are thicker than those of the non-dominant eye and carry a larger blood supply, so that the temperature of the dominant eye is higher than that of the non-dominant eye.
In addition, the inventor has found that the eye-movement frequency and deflection angle of the dominant eye are clearly higher than those of the non-dominant eye, and eye movement generates heat, which also makes the temperature of the dominant eye higher than that of the non-dominant eye.
Fig. 5 is a schematic comparison of the temperature information of the user's dominant eye and non-dominant eye. The abscissa represents time and the ordinate represents eye temperature; the first curve 510 is the temperature signal of the dominant eye and the second curve 520 is that of the non-dominant eye. It can be seen that the temperature of the dominant eye is generally higher than that of the non-dominant eye. Based on this principle, the dominant eye can be determined.
In one embodiment, the reference information is the temperature information of a second eye of the user, and the method may further include:
S130b: acquiring second temperature information of the second eye of the user as the reference information.
For example, two groups of temperature sensors may be provided to collect temperature information from the first eye and the second eye of the user simultaneously, and the temperature information of the second eye, i.e. the second temperature information, is used as the reference information.
In this embodiment, step S140b may determine whether the first eye is the dominant eye by comparing the mean value of the first temperature information with the mean value of the second temperature information. The mean value of the first temperature information is the mean of the temperature values of a plurality of sampling points in the first temperature information; similarly, the mean value of the second temperature information is the mean of the temperature values of a plurality of sampling points in the second temperature information. Using mean values avoids a wrong determination caused by a sampling error at a single sampling point and improves the determination accuracy. Specifically, referring to Fig. 6, step S140b may include:
S141b: in response to the mean value of the first temperature information being greater than the mean value of the second temperature information, determining that the first eye is the dominant eye;
S142b: in response to the mean value of the first temperature information being less than the mean value of the second temperature information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined from the right-eye temperature information and the left-eye temperature information of the user. Specifically, referring to Fig. 7, step S140b may include:
S141b': in response to the mean value of the first temperature information being greater than the threshold, determining that the first eye is the dominant eye;
S142b': in response to the mean value of the first temperature information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the left-eye temperature information and right-eye temperature information of the user are collected and analysed in advance. Suppose the mean value of the left-eye temperature information falls into a first interval (Lt_min, Lt_max), the mean value of the right-eye temperature information falls into a second interval (Rt_min, Rt_max), and the right eye is the dominant eye, so that Lt_max < Rt_min. The threshold M_t can then be chosen such that Lt_max < M_t < Rt_min; that is, M_t is a value lying between the first interval and the second interval.
Therefore, if the mean value of the first temperature information is greater than the threshold M_t, it is considered to fall into the second interval and the first eye is the dominant eye of the user; if the mean value of the first temperature information is less than the threshold M_t, it is considered to fall into the first interval and the first eye is not the dominant eye of the user.
In general, the temperature of the user's dominant eye is 0.1 to 1.2 degrees Celsius higher than that of the non-dominant eye, and the threshold M_t can be set appropriately according to this temperature difference.
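Because the stated temperature gap can be as small as 0.1 degrees Celsius, a sketch of the threshold test may also want to refuse a decision when the measured value sits within sensor noise of M_t. The 0.05 degree noise margin and the "undecided" return value below are assumptions made for illustration, not part of the described method.

    from statistics import mean

    def is_dominant_by_temperature(first_eye_temps, threshold_c, noise_margin_c=0.05):
        # S141b'/S142b' with an added plausibility check: if the averaged
        # eyeball temperature is closer to the threshold M_t than the assumed
        # sensor noise margin, return None (undecided) instead of forcing a
        # dominant / non-dominant answer.
        avg = mean(first_eye_temps)
        if abs(avg - threshold_c) < noise_margin_c:
            return None
        return avg > threshold_c

    # Example: eyeball temperatures (degrees Celsius) sampled over a short window.
    print(is_dominant_by_temperature([36.42, 36.45, 36.44], threshold_c=36.30))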
C) The first body-sensing information may be the myoelectric (EMG) information of the first eye, i.e. first myoelectric information. Step S140 is then:
S140c: determining, according to the first myoelectric information and a reference information, whether the first eye is the dominant eye.
As noted above, the inventor has found in research that the eye-movement frequency and deflection angle of a person's dominant eye are clearly higher than those of the non-dominant eye. In other words, the contraction frequency and amplitude of the muscles controlling the dominant eye are higher than those of the muscles controlling the non-dominant eye.
The inventor has also found that when a muscle contracts under different loads, the amplitude of the myoelectric information is proportional to the muscle force: the greater the tension produced by the muscle, the larger the amplitude of the myoelectric information. Furthermore, when a muscle contracts at an intensity below 40% MVC (maximum voluntary contraction), the muscle force is linearly related to the myoelectric amplitude; when the muscle works above 60% MVC, the muscle force is also linearly related to the myoelectric amplitude, but with a larger slope. Between 40% and 60% MVC the linear relationship no longer holds, but the two remain positively related.
Fig. 8 is a schematic comparison of the myoelectric information obtained by detecting, over a period of time, the muscles controlling the dominant eye and the muscles controlling the non-dominant eye. The abscissa represents time and the ordinate represents the amplitude of the EMG signal; the solid curve is the EMG signal of the dominant eye and the dashed curve is that of the non-dominant eye. It can be seen that the amplitude of the dominant eye's EMG signal is generally higher than that of the non-dominant eye's EMG signal. Based on this principle, the dominant eye can be determined.
In one embodiment, the reference information is second myoelectric information of a second eye of the user, and the method may further include:
S130c: acquiring the second myoelectric information of the second eye of the user as the reference information.
For example, two groups of myoelectric sensors may be provided to collect myoelectric information from the first eye and the second eye of the user simultaneously, and the myoelectric information of the second eye, i.e. the second myoelectric information, is used as the reference information.
Each eye has six muscles that control the rotation of the eyeball: the superior and inferior rectus, the medial and lateral rectus, and the superior and inferior oblique muscles, which are innervated by the oculomotor, trochlear and abducens nerves. The medial and lateral rectus muscles turn the eyeball inwards or outwards; when the superior or inferior rectus contracts, the eyeball turns up or down and also rotates inwards; the superior oblique mainly rotates the eyeball inwards while also turning it down and out; the inferior oblique mainly rotates the eyeball outwards while also turning it up and out. The myoelectric information may be obtained by detecting all or part of these six muscles, as sketched below.
In this embodiment, step S140c may determine whether the first eye is the dominant eye by comparing the average amplitude of the first myoelectric information with the average amplitude of the second myoelectric information. The average amplitude of the first myoelectric information is the mean of the myoelectric amplitudes of a plurality of sampling points in the first myoelectric information; similarly, the average amplitude of the second myoelectric information is the mean of the myoelectric amplitudes of a plurality of sampling points in the second myoelectric information. Using mean values avoids a wrong determination caused by a sampling error at a single sampling point and improves the determination accuracy. Specifically, referring to Fig. 9, step S140c may include:
S141c: in response to the average amplitude of the first myoelectric information being greater than the average amplitude of the second myoelectric information, determining that the first eye is the dominant eye;
S142c: in response to the average amplitude of the first myoelectric information being less than the average amplitude of the second myoelectric information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined from the average amplitude of the right-eye myoelectric information and the average amplitude of the left-eye myoelectric information of the user. Specifically, referring to Fig. 10, step S140c may include:
S141c': in response to the average amplitude of the first myoelectric information being greater than the threshold, determining that the first eye is the dominant eye;
S142c': in response to the average amplitude of the first myoelectric information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the right-eye myoelectric information and left-eye myoelectric information of the user are collected and analysed in advance. Suppose the average amplitude of the right-eye myoelectric information falls into a first interval (Rm_min, Rm_max), the average amplitude of the left-eye myoelectric information falls into a second interval (Lm_min, Lm_max), and the right eye is the dominant eye, so that Lm_max < Rm_min. The threshold M_m can then be chosen such that Lm_max < M_m < Rm_min; that is, M_m is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first myoelectric information is greater than the threshold M_m, it is considered to fall into the first interval and the first eye is the dominant eye of the user; if the average amplitude of the first myoelectric information is less than the threshold M_m, it is considered to fall into the second interval and the first eye is not the dominant eye of the user.
In general, the average amplitude of the dominant eye's myoelectric information is more than 5% higher than that of the non-dominant eye's myoelectric information, and the threshold M_m can be set appropriately on this basis.
D) The first body-sensing information may be the EEG information corresponding to the first eye, i.e. first EEG information. Step S140 is then:
S140d: determining, according to the first EEG information and a reference information, whether the first eye is the dominant eye.
As noted above, when the user's eyeball rotates, a potential difference arises between the retina and the cornea of the eyeball. The inventor has found in research that this potential difference is simultaneously reflected in the EEG signals, for example in the EEG signals corresponding to the FP1 and FP2 regions of the brain.
Fig. 11 is a schematic comparison of the EEG signal corresponding to the user's left eye and the EEG signal corresponding to the right eye, where the left-eye EEG signal is collected at the FP1 location and the right-eye EEG signal at the FP2 location. The abscissa represents time and the ordinate represents the amplitude of the EEG signal; the solid curve is the EEG signal of the dominant eye and the dashed curve is that of the non-dominant eye. It can be seen that the amplitude of the dominant eye's EEG signal is generally higher than that of the non-dominant eye's EEG signal. Based on this principle, the dominant eye can be determined.
In one embodiment, the reference information is second EEG information corresponding to a second eye of the user, and the method may further include:
S130d: acquiring the second EEG information corresponding to the second eye of the user as the reference information.
For example, two groups of EEG sensors may be provided at the FP1 and FP2 regions of the brain to collect the EEG information corresponding to the first eye and the second eye simultaneously, and the EEG information corresponding to the second eye, i.e. the second EEG information, is used as the reference information.
In this embodiment, step S140d may determine whether the first eye is the dominant eye by comparing the average amplitude of the first EEG information with the average amplitude of the second EEG information. The average amplitude of the first EEG information is the mean of the EEG amplitudes of a plurality of sampling points in the first EEG information; similarly, the average amplitude of the second EEG information is the mean of the EEG amplitudes of a plurality of sampling points in the second EEG information. Using mean values avoids a wrong determination caused by a sampling error at a single sampling point and improves the determination accuracy. Specifically, referring to Fig. 12, step S140d may include:
S141d: in response to the average amplitude of the first EEG information being greater than the average amplitude of the second EEG information, determining that the first eye is the dominant eye;
S142d: in response to the average amplitude of the first EEG information being less than the average amplitude of the second EEG information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined from the average amplitude of the left-eye EEG information and the average amplitude of the right-eye EEG information of the user. Specifically, referring to Fig. 13, step S140d may include:
S141d': in response to the average amplitude of the first EEG information being greater than the threshold, determining that the first eye is the dominant eye;
S142d': in response to the average amplitude of the first EEG information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the left-eye EEG information and right-eye EEG information of the user are collected and analysed in advance. Suppose the average amplitude of the right-eye EEG information falls into a first interval (Re_min, Re_max), the average amplitude of the left-eye EEG information falls into a second interval (Le_min, Le_max), and the right eye is the dominant eye, so that Le_max < Re_min. The threshold M_e can then be chosen such that Le_max < M_e < Re_min; that is, M_e is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first EEG information is greater than the threshold M_e, it is considered to fall into the first interval and the first eye is the dominant eye; if the average amplitude of the first EEG information is less than the threshold M_e, it is considered to fall into the second interval and the first eye is not the dominant eye.
In general, the average amplitude of the EEG information corresponding to the dominant eye is more than 5% higher than that corresponding to the non-dominant eye, and the threshold M_e can be set appropriately on this basis.
Referring to Fig. 14, in one embodiment the method may further include:
S150: performing an operation according to the determination result.
For example, if the determination result shows that the first eye is the dominant eye and the user is playing a shooting game, the user may be prompted to aim with the first eye, improving the immersive interactive experience; or, if the user is watching a 3D film, different autostereoscopic 3D views may be presented to the user's dominant eye and non-dominant eye, improving the visual experience.
Referring to Fig. 15, in one embodiment the method may further include:
S160: receiving input information from the user.
The user may provide the input by means such as voice, buttons or gestures.
Referring to Fig. 16, in one embodiment the input information is dominant eye information, i.e. information indicating which eye is the dominant eye, and the method may further include:
S170: the input information being dominant eye information, determining, according to the input information and the determination result, whether the first eye is the left eye or the right eye.
For example, if the input information indicates that the user's right eye is the dominant eye and the determination result shows that the first eye is the dominant eye, it can be determined that the first eye is the right eye.
Once it is known whether the first eye is the left eye or the right eye, further eye-specific settings can be made, for example automatically adjusting the lens power corresponding to the right eye to match the user's right-eye myopia.
Referring to Fig. 17, in one embodiment the method may further include:
S180: the input information indicating whether the first eye is the left eye or the right eye, determining, according to the input information and the determination result, whether the left eye or the right eye of the user is the dominant eye.
For example, if the input information indicates that the first eye is the left eye and the determination result shows that the first eye is the dominant eye, it can be determined that the user's left eye is the dominant eye. This information can be recorded for other applications to use.
In addition, an embodiment of the application also provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the operations of steps S120 and S140 of the method in the embodiment shown in Fig. 1.
In summary, the method of the embodiments of the application can determine, according to first body-sensing information of a first eye of a user and a reference information, whether the first eye is the user's dominant eye, and can perform a corresponding operation according to the determination result to improve the user experience.
Fig. 18 is a schematic diagram of the module structure of a dominant eye determination apparatus according to an embodiment of the invention. The dominant eye determination apparatus may be provided, as a functional module, in a wearable device such as a smart helmet, smart glasses or a smart wig, or it may itself be an independent wearable device for the user. As shown in Fig. 18, the apparatus 1800 may include:
a first acquisition module 1810, configured to acquire first body-sensing information of a first eye of a user;
a determination module 1820, configured to determine, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye.
The apparatus of this embodiment acquires first body-sensing information of a first eye of a user and then determines, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye. A dominant eye determination apparatus is thus provided, which enables the user's wearable devices to be configured automatically according to the determination result, improving the user experience.
The functions of the first acquisition module 1810 and the determination module 1820 are described in detail below with reference to specific embodiments.
The first acquisition module 1810 is configured to acquire first body-sensing information of a first eye of a user.
The first eye is the left eye or the right eye of the user.
The first body-sensing information may be EOG information, myoelectric (EMG) information or temperature information of the first eye, or EEG information corresponding to the first eye, and may be acquired by a corresponding sensor or acquisition system. For example, the EOG information of the first eye may be acquired by at least one EOG sensor, the myoelectric information of the first eye may be acquired by at least one myoelectric sensor, the temperature information of the first eye may be acquired by at least one temperature sensor, and the EEG information corresponding to the first eye may be acquired by an EEG acquisition system.
The myoelectric information of the first eye may be the myoelectric information of the muscles corresponding to the first eye; the temperature information of the first eye may be the temperature of the eyeball of the first eye; when the first eye is the left eye, its corresponding EEG information may be the EEG information corresponding to the FP1 region of the brain, and when the first eye is the right eye, its corresponding EEG information may be the EEG information corresponding to the FP2 region of the brain.
The determination module 1820 is configured to determine, according to the first body-sensing information and a reference information, whether the first eye is the dominant eye.
A) The first body-sensing information may be the EOG information of the first eye, i.e. first EOG information. The determination module 1820 is then configured to determine, according to the first EOG information and a reference information, whether the first eye is the dominant eye.
In one embodiment, the reference information is second EOG information of a second eye of the user. Referring to Fig. 19, the apparatus 1800 further includes:
a second acquisition module 1830a, configured to acquire the second EOG information of the second eye of the user as the reference information.
Accordingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first EOG information being greater than the average amplitude of the second EOG information, and to determine that the first eye is not the dominant eye in response to the average amplitude of the first EOG information being less than the average amplitude of the second EOG information.
In another embodiment, the reference information is a threshold determined from the average amplitude of the left-eye EOG information and the average amplitude of the right-eye EOG information of the user. Referring to Fig. 20, the apparatus 1800 further includes:
a first determination module 1840a, configured to determine, as the reference information, a threshold from the average amplitude of the left-eye EOG information and the average amplitude of the right-eye EOG information of the user.
Accordingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first EOG information being greater than the threshold, and to determine that the first eye is not the dominant eye in response to the average amplitude of the first EOG information being less than the threshold.
For example, the first determination module 1840a may collect the right-eye EOG information and left-eye EOG information of the user in advance and analyse them. Suppose the average amplitude of the right-eye EOG information falls into a first interval (Ro_min, Ro_max), the average amplitude of the left-eye EOG information falls into a second interval (Lo_min, Lo_max), and the right eye is the dominant eye, so that Lo_max < Ro_min. The threshold M_o can then be chosen such that Lo_max < M_o < Ro_min; that is, M_o is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first EOG information is greater than the threshold M_o, it is considered to fall into the first interval and the first eye is the dominant eye of the user; if the average amplitude of the first EOG information is less than the threshold M_o, it is considered to fall into the second interval and the first eye is not the dominant eye of the user.
B) The first body-sensing information may be the temperature information of the first eye, i.e. first temperature information. The determination module 1820 is then configured to determine, according to the first temperature information and a reference information, whether the first eye is the dominant eye.
In one embodiment, the reference information is second temperature information of a second eye of the user. Referring to Fig. 21, the apparatus 1800 further includes:
a second acquisition module 1830b, configured to acquire the second temperature information of the second eye of the user as the reference information.
Accordingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the mean value of the first temperature information being greater than the mean value of the second temperature information, and to determine that the first eye is not the dominant eye in response to the mean value of the first temperature information being less than the mean value of the second temperature information.
In another embodiment, the reference information is a threshold determined from the mean value of the left-eye temperature information and the mean value of the right-eye temperature information of the user. Referring to Fig. 22, the apparatus 1800 further includes:
a first determination module 1840b, configured to determine, as the reference information, a threshold from the mean value of the left-eye temperature information and the mean value of the right-eye temperature information of the user.
Accordingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the mean value of the first temperature information being greater than the threshold, and to determine that the first eye is not the dominant eye in response to the mean value of the first temperature information being less than the threshold.
For example, the first determination module 1840b may collect the left-eye temperature information and right-eye temperature information of the user in advance and analyse them. Suppose the mean value of the left-eye temperature information falls into a first interval (Lt_min, Lt_max), the mean value of the right-eye temperature information falls into a second interval (Rt_min, Rt_max), and the right eye is the dominant eye, so that Lt_max < Rt_min. The threshold M_t can then be chosen such that Lt_max < M_t < Rt_min; that is, M_t is a value lying between the first interval and the second interval.
Therefore, if the mean value of the first temperature information is greater than the threshold M_t, it is considered to fall into the second interval and the first eye is the dominant eye of the user; if the mean value of the first temperature information is less than the threshold M_t, it is considered to fall into the first interval and the first eye is not the dominant eye of the user.
C) described first body-sensing information can also is that the myoelectric information of described first eyes, i.e. first Myoelectric information.Described determining module 1820, for according to described first myoelectric information and a ginseng Examine information, determine whether described first eyes are leading eyes.
In one embodiment, described reference information is the second of second eyes of described user Myoelectric information, sees Figure 23, and described equipment 1800 can also include:
One second acquisition module 1830c, for obtaining the second flesh of second eyes of described user Power information is as described reference information.
Accordingly, described determining module 1820, for putting down in response to described first myoelectric information Equal range value, more than the averaged amplitude value of described second myoelectric information, determines that described first eyes are Leading eye;And, the averaged amplitude value in response to described first myoelectric information is less than described second The averaged amplitude value of myoelectric information, determines that described first eyes are not leading eyes.
In another embodiment, the reference information is a threshold determined from the averaged amplitude value of the user's left-eye myoelectric information and the averaged amplitude value of the user's right-eye myoelectric information. Referring to Figure 24, the apparatus 1800 may further include:
a first determining module 1840c, configured to determine a threshold as the reference information according to the averaged amplitude value of the user's left-eye myoelectric information and the averaged amplitude value of the user's right-eye myoelectric information.
Accordingly, the determining module 1820 is configured to determine that the first eye is the leading eye in response to the averaged amplitude value of the first myoelectric information being greater than the threshold, and to determine that the first eye is not the leading eye in response to the averaged amplitude value of the first myoelectric information being less than the threshold.
For example, the first determining module 1840c may collect the user's right-eye myoelectric information and left-eye myoelectric information in advance and analyse them. Suppose the averaged amplitude value of the right-eye myoelectric information falls into a first interval (Rm_min, Rm_max), the averaged amplitude value of the left-eye myoelectric information falls into a second interval (Lm_min, Lm_max), and the right eye is the leading eye; then Lm_max < Rm_min, and the threshold may be determined as M_m with Lm_max < M_m < Rm_min. That is, the threshold M_m is a value lying between the first interval and the second interval.
Therefore, if the averaged amplitude value of the first myoelectric information is greater than the threshold M_m, it is considered to fall into the first interval, and the first eye is the user's leading eye; if the averaged amplitude value of the first myoelectric information is less than the threshold M_m, it is considered to fall into the second interval, and the first eye is not the user's leading eye.
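For the myoelectric case, the same decision structure can be sketched as follows (again not part of the patent text). The sketch assumes, purely for illustration, that the "averaged amplitude value" is computed as the mean absolute value of the sampled signal; the function names and sample values are hypothetical.

# Minimal illustrative sketch of the myoelectric-based determination described
# above. "Averaged amplitude value" is taken here as the mean absolute value of
# the sampled signal; that choice is an assumption for illustration only.

def averaged_amplitude(samples):
    """Mean absolute value of a sampled myoelectric signal."""
    return sum(abs(s) for s in samples) / len(samples)

def leading_by_second_eye(first_eye_samples, second_eye_samples):
    """Figure 23 embodiment: compare the first eye against the second eye."""
    return averaged_amplitude(first_eye_samples) > averaged_amplitude(second_eye_samples)

def leading_by_threshold(first_eye_samples, m_m):
    """Figure 24 embodiment: compare the first eye against the threshold M_m."""
    return averaged_amplitude(first_eye_samples) > m_m

# Hypothetical signals; the right eye is assumed to be the leading eye, so its
# averaged amplitude falls into the higher interval (Rm_min, Rm_max).
right_eye_emg = [0.9, -1.1, 1.0, -0.8]   # averaged amplitude = 0.95
left_eye_emg = [0.4, -0.5, 0.3, -0.4]    # averaged amplitude = 0.40
m_m = 0.7                                 # any value with Lm_max < M_m < Rm_min

print(leading_by_second_eye(right_eye_emg, left_eye_emg))   # True: leading eye
print(leading_by_threshold(left_eye_emg, m_m))              # False: not leading eye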

Claims (21)

1. A leading eye determining method, characterised in that the method comprises:
obtaining first myoelectric information of a first eye of a user;
determining, according to the first myoelectric information and a reference information, whether the first eye is a leading eye.
2. The method of claim 1, characterised in that the reference information is a threshold determined from the averaged amplitude value of left-eye myoelectric information of the user and the averaged amplitude value of right-eye myoelectric information of the user.
3. The method of claim 2, characterised in that the determining, according to the first myoelectric information and a reference information, whether the first eye is a leading eye comprises:
determining that the first eye is the leading eye in response to the averaged amplitude value of the first myoelectric information being greater than the threshold;
determining that the first eye is not the leading eye in response to the averaged amplitude value of the first myoelectric information being less than the threshold.
4. The method of claim 1, characterised in that the method further comprises:
obtaining second myoelectric information of a second eye of the user as the reference information.
5. The method of claim 4, characterised in that the determining, according to the first myoelectric information and a reference information, whether the first eye is a leading eye comprises:
determining that the first eye is the leading eye in response to the averaged amplitude value of the first myoelectric information being greater than the averaged amplitude value of the second myoelectric information;
determining that the first eye is not the leading eye in response to the averaged amplitude value of the first myoelectric information being less than the averaged amplitude value of the second myoelectric information.
6. The method of any one of claims 1 to 5, characterised in that the method further comprises:
performing an operation according to a determination result.
7. The method of any one of claims 1 to 5, characterised in that the method further comprises:
receiving input information of the user.
8. The method of claim 7, characterised in that the input information is leading eye information, and the method further comprises:
determining, according to the input information and a determination result, whether the first eye is the left eye or the right eye.
9. The method of claim 7, characterised in that the input information indicates whether the first eye is the left eye or the right eye, and the method further comprises: determining whether the left eye or the right eye of the user is the leading eye.
10. A leading eye determining apparatus, characterised in that the apparatus comprises:
a first acquisition module, configured to obtain first myoelectric information of a first eye of a user;
a determining module, configured to determine, according to the first myoelectric information and a reference information, whether the first eye is a leading eye.
11. The apparatus of claim 10, characterised in that the apparatus further comprises:
a first determining module, configured to determine a threshold as the reference information according to the averaged amplitude value of left-eye myoelectric information of the user and the averaged amplitude value of right-eye myoelectric information of the user.
12. The apparatus of claim 11, characterised in that the determining module is configured to determine that the first eye is the leading eye in response to the averaged amplitude value of the first myoelectric information being greater than the threshold; and
to determine that the first eye is not the leading eye in response to the averaged amplitude value of the first myoelectric information being less than the threshold.
13. The apparatus of claim 10, characterised in that the apparatus further comprises:
a second acquisition module, configured to obtain second myoelectric information of a second eye of the user as the reference information.
14. The apparatus of claim 13, characterised in that the determining module is configured to determine that the first eye is the leading eye in response to the averaged amplitude value of the first myoelectric information being greater than the averaged amplitude value of the second myoelectric information; and
to determine that the first eye is not the leading eye in response to the averaged amplitude value of the first myoelectric information being less than the averaged amplitude value of the second myoelectric information.
15. The apparatus of any one of claims 10 to 14, characterised in that the apparatus further comprises:
an execution module, configured to perform an operation according to a determination result.
16. The apparatus of any one of claims 10 to 14, characterised in that the apparatus further comprises:
a receiving module, configured to receive input information of the user.
17. The apparatus of claim 16, characterised in that the input information is leading eye information, and the apparatus further comprises:
a second determining module, configured to determine, according to the input information and a determination result, whether the first eye is the left eye or the right eye.
18. The apparatus of claim 16, characterised in that the input information indicates whether the first eye is the left eye or the right eye, and the apparatus further comprises:
a third determining module, configured to determine, according to the input information and a determination result, whether the left eye or the right eye of the user is the leading eye.
19. A wearable device, characterised in that the wearable device comprises the leading eye determining apparatus of any one of claims 10 to 18.
20. A leading eye determining method, characterised in that the method comprises:
obtaining first body-sensing information of a first eye of a user;
determining, according to the first body-sensing information and a reference information, whether the first eye is a leading eye.
21. A leading eye determining apparatus, characterised in that the apparatus comprises:
a first acquisition module, configured to obtain first body-sensing information of a first eye of a user;
a determining module, configured to determine, according to the first body-sensing information and a reference information, whether the first eye is a leading eye.
CN201410643724.8A 2014-11-07 2014-11-07 Leading eye determines method and apparatus Active CN104367320B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201410643724.8A CN104367320B (en) 2014-11-07 2014-11-07 Leading eye determines method and apparatus
US15/525,040 US10646133B2 (en) 2014-11-07 2015-07-24 Dominant eye determining method and device
PCT/CN2015/085024 WO2016070653A1 (en) 2014-11-07 2015-07-24 Dominant eye determining method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410643724.8A CN104367320B (en) 2014-11-07 2014-11-07 Leading eye determines method and apparatus

Publications (2)

Publication Number Publication Date
CN104367320A CN104367320A (en) 2015-02-25
CN104367320B true CN104367320B (en) 2016-10-05

Family

ID=52546701

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410643724.8A Active CN104367320B (en) 2014-11-07 2014-11-07 Leading eye determines method and apparatus

Country Status (1)

Country Link
CN (1) CN104367320B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016070653A1 (en) * 2014-11-07 2016-05-12 Beijing Zhigu Rui Tuo Tech Co., Ltd. Dominant eye determining method and device
CN106249846B (en) * 2015-06-29 2020-03-17 北京智谷睿拓技术服务有限公司 Light intensity adjusting method and device
CN106371561A (en) * 2015-08-19 2017-02-01 北京智谷睿拓技术服务有限公司 Input information determination method and device
CN106371560B (en) * 2015-08-19 2020-06-02 北京智谷睿拓技术服务有限公司 Method and apparatus for determining blowing and suction air

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100466964C (en) * 2007-03-20 2009-03-11 重庆大学 System and method for separating binocular vision induced potentials
US8862217B2 (en) * 2008-07-09 2014-10-14 Laurence M. McKinley Optic function monitoring process and apparatus
JP2012114544A (en) * 2010-11-22 2012-06-14 Jvc Kenwood Corp Video encoder
JP2012114545A (en) * 2010-11-22 2012-06-14 Jvc Kenwood Corp Video playback device
FR2972339B1 (en) * 2011-03-11 2013-04-19 Essilor Int METHOD FOR DETERMINING THE DIRECTING EYE
FR2972911B1 (en) * 2011-03-25 2013-04-05 Essilor Int DEVICE AND METHOD FOR DETERMINING THE DIRECTING EYE OF A PATIENT

Also Published As

Publication number Publication date
CN104367320A (en) 2015-02-25

Similar Documents

Publication Publication Date Title
CN104367320B (en) Leading eye determines method and apparatus
Leeb et al. Multimodal fusion of muscle and brain signals for a hybrid-BCI
Kim et al. Decoding three-dimensional trajectory of executed and imagined arm movements from electroencephalogram signals
CN105938397B (en) Mixing brain-computer interface method based on stable state of motion visual evoked potential Yu default stimuli responsive
US11490809B2 (en) Ocular parameter-based head impact measurement using a face shield
Meng et al. Three-dimensional brain–computer interface control through simultaneous overt spatial attentional and motor imagery tasks
Riechmann et al. Using a cVEP-based brain-computer interface to control a virtual agent
EP3440494A1 (en) Methods and systems for obtaining. analyzing, and generating vision performance data and modifying media based on the data
CN100466964C (en) System and method for separating binocular vision induced potentials
Larson et al. Electrooculography based electronic communication device for individuals with ALS
CN107260506B (en) 3D vision training system, intelligent terminal and head-mounted device based on eye movement
US10646133B2 (en) Dominant eye determining method and device
Yousefi et al. Exploiting error-related potentials in cognitive task based BCI
Xing et al. Reading the mind: the potential of electroencephalography in brain computer interfaces
Thomas et al. Investigating brief motor imagery for an ERD/ERS based BCI
CN104375644A (en) Method and device for determining dominant eye
CN104360740A (en) Method and equipment for determining dominant eye
US11723580B2 (en) Objective EEG quantitative measurement method for amblyopia
Schwarz et al. Combining frequency and time-domain EEG features for classification of self-paced reach-and-grasp actions
Lin et al. Neural Correlation of EEG and Eye Movement in Natural Grasping Intention Estimation
Ogoshi et al. Mu rhythm suppression during the imagination of observed action
Sun et al. Design of Chinese spelling system based on ERP
Dietrich et al. Towards EEG-based eye-tracking for interaction design in head-mounted devices
CN110913810A (en) Method for acquiring and processing artificial eye image
CN104360739A (en) Method and equipment for determining dominant eye

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant