CN104367320A - Method and device for determining dominant eye - Google Patents

Method and device for determining dominant eye

Info

Publication number
CN104367320A
CN104367320A (application CN201410643724.8A)
Authority
CN
China
Prior art keywords
eyes
information
eye
leading
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410643724.8A
Other languages
Chinese (zh)
Other versions
CN104367320B (en)
Inventor
刘浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201410643724.8A (granted as CN104367320B)
Publication of CN104367320A
Priority to US15/525,040 (granted as US10646133B2)
Priority to PCT/CN2015/085024 (published as WO2016070653A1)
Application granted
Publication of CN104367320B
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/398 Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1101 Detecting tremor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb, using a particular sensing technique
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/389 Electromyography [EMG]

Abstract

The invention provides a method and device for determining a dominant eye, and relates to the field of wearable devices. The method includes the steps of acquiring first eye electrical information of a first eye of a user, and determining, according to the first eye electrical information and reference information, whether the first eye is the dominant eye. With the method and device, other devices worn by the user can be automatically configured according to the determination result, thereby improving the user experience.

Description

Dominant eye determination method and device
Technical field
The present application relates to the field of wearable devices, and in particular to a dominant eye determination method and device.
Background
In recent years, with the development of wearable devices, smart watches, smart bracelets, smart glasses and the like have gradually entered people's lives, greatly enriching and facilitating them. Because of their small size, wearable devices generally have limited interaction capabilities. Users therefore expect them to have good self-configuration capability, so as to reduce the number of setting operations required of the user.
When a person looks at an object, the two eyes do not play identical roles: one eye usually dominates to some extent and serves as the main bearer of localization and fusion. This eye is called the dominant eye. The dominant eye is one of the more common lateralized functional characteristics of human beings. The result of a dominant eye determination can be used to improve the user's gaming experience, viewing experience and so on; for example, in a shooting game, aiming with the dominant eye can improve the user's immersive interactive experience.
If the dominant eye of the user can be determined by a wearable device and used as an input to the device itself or to other devices, the setting operations required of the user can be reduced and the user experience improved.
Summary of the invention
An object of the present application is to provide a dominant eye determination method and device.
According to a first aspect of at least one embodiment of the present application, a dominant eye determination method is provided, the method comprising:
acquiring first eye electrical information of a first eye of a user;
determining, according to the first eye electrical information and reference information, whether the first eye is the dominant eye.
With reference to any possible implementation of the first aspect, in a second possible implementation, the reference information is a threshold determined according to the average amplitude of the left eye electrical information and the average amplitude of the right eye electrical information of the user.
With reference to any possible implementation of the first aspect, in a third possible implementation, determining, according to the first eye electrical information and reference information, whether the first eye is the dominant eye comprises:
in response to the average amplitude of the first eye electrical information being greater than the threshold, determining that the first eye is the dominant eye;
in response to the average amplitude of the first eye electrical information being less than the threshold, determining that the first eye is not the dominant eye.
With reference to any possible implementation of the first aspect, in a fourth possible implementation, the method further comprises: acquiring second eye electrical information of a second eye of the user as the reference information.
With reference to any possible implementation of the first aspect, in a fifth possible implementation, determining, according to the first eye electrical information and reference information, whether the first eye is the dominant eye comprises:
in response to the average amplitude of the first eye electrical information being greater than the average amplitude of the second eye electrical information, determining that the first eye is the dominant eye;
in response to the average amplitude of the first eye electrical information being less than the average amplitude of the second eye electrical information, determining that the first eye is not the dominant eye.
With reference to any possible implementation of the first aspect, in a sixth possible implementation, the method further comprises: performing an operation according to the determination result.
With reference to any possible implementation of the first aspect, in a seventh possible implementation, the method further comprises: receiving input information of the user.
With reference to any possible implementation of the first aspect, in an eighth possible implementation, the method further comprises:
when the input information is dominant eye information, determining, according to the input information and the determination result, whether the first eye is the left eye or the right eye.
With reference to any possible implementation of the first aspect, in a ninth possible implementation, the method further comprises:
when the input information comprises whether the first eye is the left eye or the right eye, determining, according to the input information and the determination result, whether the left eye or the right eye of the user is the dominant eye.
According to a second aspect of at least one embodiment of the present application, a dominant eye determination device is provided, the device comprising:
a first acquisition module, configured to acquire first eye electrical information of a first eye of a user;
a determination module, configured to determine, according to the first eye electrical information and reference information, whether the first eye is the dominant eye.
With reference to any possible implementation of the second aspect, in a second possible implementation, the device further comprises:
a first determination module, configured to determine a threshold as the reference information according to the average amplitude of the left eye electrical information and the average amplitude of the right eye electrical information of the user.
With reference to any possible implementation of the second aspect, in a third possible implementation, the determination module is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first eye electrical information being greater than the threshold; and,
to determine that the first eye is not the dominant eye in response to the average amplitude of the first eye electrical information being less than the threshold.
With reference to any possible implementation of the second aspect, in a fourth possible implementation, the device further comprises:
a second acquisition module, configured to acquire second eye electrical information of a second eye of the user as the reference information.
With reference to any possible implementation of the second aspect, in a fifth possible implementation, the determination module is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first eye electrical information being greater than the average amplitude of the second eye electrical information; and,
to determine that the first eye is not the dominant eye in response to the average amplitude of the first eye electrical information being less than the average amplitude of the second eye electrical information.
With reference to any possible implementation of the second aspect, in a sixth possible implementation, the device further comprises:
an execution module, configured to perform an operation according to the determination result.
With reference to any possible implementation of the second aspect, in a seventh possible implementation, the device further comprises:
a receiving module, configured to receive input information of the user.
With reference to any possible implementation of the second aspect, in an eighth possible implementation, the input information is dominant eye information, and the device further comprises:
a second determination module, configured to determine, according to the input information and the determination result, whether the first eye is the left eye or the right eye.
With reference to any possible implementation of the second aspect, in a ninth possible implementation, the input information comprises whether the first eye is the left eye or the right eye, and the device further comprises:
a third determination module, configured to determine, according to the input information and the determination result, whether the left eye or the right eye of the user is the dominant eye.
According to a third aspect of at least one embodiment of the present application, a dominant eye determination method is provided, the method comprising:
acquiring first body sensing information of a first eye of a user;
determining, according to the first body sensing information and reference information, whether the first eye is the dominant eye.
According to a fourth aspect of at least one embodiment of the present application, a dominant eye determination device is provided, the device comprising:
a first acquisition module, configured to acquire first body sensing information of a first eye of a user;
a determination module, configured to determine, according to the first body sensing information and reference information, whether the first eye is the dominant eye.
The dominant eye determination method and device of the embodiments of the present application acquire first eye electrical information of a first eye of a user and then determine, according to the first eye electrical information and reference information, whether the first eye is the dominant eye. A dominant eye determination method is thereby provided, which allows devices worn by the user to be automatically configured according to the determination result, improving the user experience.
Brief description of the drawings
Fig. 1 is a flowchart of the dominant eye determination method according to an embodiment of the present application;
Fig. 2 is a schematic comparison of the EOG signals of the dominant eye and the non-dominant eye;
Fig. 3 is a detailed flowchart of step S140a according to an embodiment of the present application;
Fig. 4 is a detailed flowchart of step S140a according to another embodiment of the present application;
Fig. 5 is a schematic comparison of the temperature information of the dominant eye and the non-dominant eye;
Fig. 6 is a detailed flowchart of step S140b according to an embodiment of the present application;
Fig. 7 is a detailed flowchart of step S140b according to another embodiment of the present application;
Fig. 8 is a schematic comparison of the EMG signals of the dominant eye and the non-dominant eye;
Fig. 9 is a detailed flowchart of step S140c according to an embodiment of the present application;
Fig. 10 is a detailed flowchart of step S140c according to another embodiment of the present application;
Fig. 11 is a schematic comparison of the EEG signals corresponding to the dominant eye and to the non-dominant eye, respectively;
Fig. 12 is a detailed flowchart of step S140d according to an embodiment of the present application;
Fig. 13 is a detailed flowchart of step S140d according to another embodiment of the present application;
Fig. 14 is a flowchart of the dominant eye determination method according to an embodiment of the present application;
Fig. 15 is a flowchart of the dominant eye determination method according to another embodiment of the present application;
Fig. 16 is a flowchart of the dominant eye determination method according to another embodiment of the present application;
Fig. 17 is a flowchart of the dominant eye determination method according to another embodiment of the present application;
Fig. 18 is a schematic diagram of the module structure of the dominant eye determination device according to an embodiment of the present application;
Fig. 19 is a schematic diagram of the module structure of the dominant eye determination device according to an embodiment of the present application;
Fig. 20 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 21 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 22 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 23 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 24 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 25 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 26 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 27 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 28 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 29 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 30 is a schematic diagram of the module structure of the dominant eye determination device according to another embodiment of the present application;
Fig. 31 is a schematic diagram of the hardware structure of the dominant eye determination device according to an embodiment of the present application.
Detailed description
Specific embodiments of the present application are described in further detail below with reference to the drawings and embodiments. The following embodiments are intended to illustrate the present application, not to limit its scope.
Those skilled in the art will understand that, in the embodiments of the present application, the sequence numbers of the following steps do not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Fig. 1 is a flowchart of the dominant eye determination method according to an embodiment of the present application. The method may be implemented, for example, on a dominant eye determination device. As shown in Fig. 1, the method comprises:
S120: acquiring first body sensing information of a first eye of a user;
S140: determining, according to the first body sensing information and reference information, whether the first eye is the dominant eye.
The method of the embodiments of the present application acquires first body sensing information of an eye of a user and then determines, according to the first body sensing information and reference information, whether the first eye is the dominant eye. A dominant eye determination method is thereby provided, which allows the user's wearable device to be automatically configured according to the determination result, improving the user experience.
The functions of steps S120 and S140 are described in detail below with reference to specific embodiments.
S120: acquiring first body sensing information of a first eye of a user.
The first eye is the left eye or the right eye of the user.
The first body sensing information may be electro-oculography (EOG) information, electromyography (EMG) information or temperature information of the first eye, or electroencephalography (EEG) information corresponding to the first eye, and may be acquired by a corresponding sensor or acquisition system. For example, the eye electrical information of the first eye may be acquired by at least one eye electrical sensor, the myoelectric information of the first eye may be acquired by at least one myoelectric sensor, the temperature information of the first eye may be acquired by at least one temperature sensor, and the EEG information corresponding to the first eye may be acquired by an EEG sensor.
The myoelectric information of the first eye may be the myoelectric information of the muscles corresponding to the first eye; the temperature information of the first eye may be the temperature of the eyeball of the first eye; when the first eye is the left eye, the corresponding EEG information may be the EEG information corresponding to the FP1 region of the brain, and when the first eye is the right eye, the corresponding EEG information may be the EEG information corresponding to the FP2 region of the brain.
S140: determining, according to the first body sensing information and reference information, whether the first eye is the dominant eye.
A) The first body sensing information may be the eye electrical information of the first eye, i.e. first eye electrical information. Step S140 is then:
S140a: determining, according to the first eye electrical information and reference information, whether the first eye is the dominant eye.
The inventor found in research that human eyeball rotation is essentially within a range of 0 to 60 degrees. When the user's eyeball rotates, a potential difference is produced between the retina and the cornea of the eyeball. An eye electrical sensor can record, via electrodes, the potential difference produced by the eye movement. When the deflection angle of the eye movement is within 30 degrees, the potential difference and the eyeball deflection angle have a linear relationship; when the deflection angle of the eye movement is between 30 and 60 degrees, the potential difference and the eyeball deflection angle have a sinusoidal relationship.
In addition, the inventor found in the course of research that the eye movement frequency and deflection angle of a person's dominant eye are higher than those of the non-dominant eye. After collecting EOG signals from the dominant eye and the non-dominant eye of a user at the same positions, with the same sampling frequency and over the same period of time, the inventor found that the average amplitude of the EOG signal of the dominant eye is significantly higher than that of the non-dominant eye.
Fig. 2 is a schematic comparison of the EOG signals of the user's dominant eye and non-dominant eye: the horizontal axis represents time, the vertical axis represents the amplitude of the EOG signal, the solid curve is the EOG curve of the dominant eye and the dashed curve is the EOG curve of the non-dominant eye. It can be seen that the amplitude of the EOG signal of the dominant eye is generally higher than that of the non-dominant eye. Based on this principle, the dominant eye can be determined.
In this application, the amplitude of any of the body sensing information (including eye electrical information, EEG information and myoelectric information) refers to the amplitude of the waveform corresponding to that information, which is always non-negative.
In one embodiment, the reference information is second eye electrical information of a second eye of the user. The method may further comprise:
S130a: acquiring second eye electrical information of a second eye of the user as the reference information.
For example, two sets of eye electrical sensors may be provided to collect eye electrical information from the first eye and the second eye of the user simultaneously, and the eye electrical information collected from the second eye, i.e. the second eye electrical information, is used as the reference information.
In this embodiment, step S140a may determine whether the first eye is the dominant eye by comparing the average amplitude of the first eye electrical information with the average amplitude of the second eye electrical information. The average amplitude of the first eye electrical information is the mean of the eye electrical amplitude values corresponding to a plurality of sampling points in the first eye electrical information; similarly, the average amplitude of the second eye electrical information is the mean of the eye electrical amplitude values corresponding to a plurality of sampling points in the second eye electrical information. Using averages avoids a wrong determination caused by a sampling error at a single sampling point and improves the accuracy of the determination. Specifically, as shown in Fig. 3, step S140a may comprise:
S141a: in response to the average amplitude of the first eye electrical information being greater than the average amplitude of the second eye electrical information, determining that the first eye is the dominant eye;
S142a: in response to the average amplitude of the first eye electrical information being less than the average amplitude of the second eye electrical information, determining that the first eye is not the dominant eye.
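For illustration only (this sketch is not part of the patent text), steps S141a and S142a amount to comparing two mean amplitudes; the Python below assumes the two EOG channels are already available as lists of sampled amplitude values, and all names and numbers are invented.

```python
def mean_amplitude(samples):
    """Mean of the (non-negative) amplitude values of a sampled signal."""
    return sum(abs(s) for s in samples) / len(samples)


def first_eye_is_dominant(first_eog, second_eog):
    """S141a/S142a: the first eye is taken as dominant when the mean
    amplitude of its EOG exceeds that of the second eye."""
    return mean_amplitude(first_eog) > mean_amplitude(second_eog)


# Made-up sample values, e.g. in microvolts:
print(first_eye_is_dominant([12.1, 15.4, 9.8, 14.2], [8.3, 10.1, 7.9, 9.5]))  # True
```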
In another embodiment, the reference information may be a threshold determined according to the average amplitude of the left eye electrical information and the average amplitude of the right eye electrical information of the user. Specifically, as shown in Fig. 4, step S140a may comprise:
S141a': in response to the average amplitude of the first eye electrical information being greater than the threshold, determining that the first eye is the dominant eye;
S142a': in response to the average amplitude of the first eye electrical information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the right eye electrical information and the left eye electrical information of the user are collected in advance and analyzed. Suppose the average amplitude of the right eye electrical information falls into a first interval (R_Omin, R_Omax), the average amplitude of the left eye electrical information falls into a second interval (L_Omin, L_Omax), and the right eye is the dominant eye, so that L_Omax < R_Omin. The threshold can then be determined as M_O with L_Omax < M_O < R_Omin; that is, the threshold M_O is a value between the first interval and the second interval.
Therefore, if the average amplitude of the first eye electrical information is greater than the threshold M_O, it is considered to fall into the first interval and the first eye is the dominant eye of the user; if the average amplitude of the first eye electrical information is less than the threshold M_O, it is considered to fall into the second interval and the first eye is not the dominant eye of the user.
In general, the average amplitude of the eye electrical information of the dominant eye can be more than 5% higher than the average amplitude of the eye electrical information of the non-dominant eye, and the threshold M_O can be set accordingly.
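A minimal sketch of how the threshold M_O could be chosen from the calibration intervals described above; the patent only requires L_Omax < M_O < R_Omin (right eye dominant in this example), so taking the midpoint of the gap, and every numeric value shown, are illustrative assumptions.

```python
def calibrate_threshold(non_dominant_interval, dominant_interval):
    """Return a threshold lying strictly between the mean-amplitude interval
    of the non-dominant eye, e.g. (L_Omin, L_Omax), and that of the dominant
    eye, e.g. (R_Omin, R_Omax).  The midpoint of the gap is one valid choice."""
    _, non_dom_max = non_dominant_interval
    dom_min, _ = dominant_interval
    if not non_dom_max < dom_min:
        raise ValueError("calibration intervals must be separated")
    return (non_dom_max + dom_min) / 2.0


m_o = calibrate_threshold((3.0, 5.0), (6.0, 9.0))  # made-up units
print(m_o)                                         # 5.5

first_mean = 7.8     # made-up mean EOG amplitude of the first eye
print(first_mean > m_o)   # S141a'/S142a': True, so the first eye is dominant
```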
B) The first body sensing information may be the temperature information of the first eye, i.e. first temperature information. Step S140 is then:
S140b: determining, according to the first temperature information and reference information, whether the first eye is the dominant eye.
The inventor found in research that the blood supply of the eyeball comes from the ophthalmic artery. After branching from the internal carotid artery, the ophthalmic artery enters the orbit through the optic canal and divides into two independent systems: one is the central retinal vascular system, which supplies the inner layers of the retina; the other is the ciliary vascular system, which supplies the remaining parts of the eyeball not supplied by the central retinal vessels. The inventor found that, as a result of natural evolution, the vessels of the central retinal vascular system and the ciliary vascular system of the dominant eye are thicker than those of the non-dominant eye and the blood supply is larger, so the temperature of the dominant eye is higher than that of the non-dominant eye.
In addition, the inventor also found in the course of research that the eye movement frequency and deflection angle of a person's dominant eye are significantly higher than those of the non-dominant eye, and eye movement generates heat, which also causes the temperature of the dominant eye to be higher than that of the non-dominant eye.
Fig. 5 is a schematic comparison of the temperature information of the user's dominant eye and non-dominant eye: the horizontal axis represents time, the vertical axis represents the temperature of the eye, the first curve 510 is the temperature curve of the dominant eye and the second curve 520 is the temperature curve of the non-dominant eye. It can be seen that the temperature of the dominant eye is generally higher than that of the non-dominant eye. Based on this principle, the dominant eye can be determined.
In one embodiment, the reference information is the temperature information of a second eye of the user. The method may further comprise:
S130b: acquiring second temperature information of a second eye of the user as the reference information.
For example, two sets of temperature sensors may be provided to collect the corresponding temperature information from the first eye and the second eye of the user simultaneously, and the temperature information of the second eye, i.e. the second temperature information, is used as the reference information.
In this embodiment, step S140b may determine whether the first eye is the dominant eye by comparing the average value of the first temperature information with the average value of the second temperature information. The average value of the first temperature information is the mean of the temperature values corresponding to a plurality of sampling points in the first temperature information; similarly, the average value of the second temperature information is the mean of the temperature values corresponding to a plurality of sampling points in the second temperature information. Using averages avoids a wrong determination caused by a sampling error at a single sampling point and improves the accuracy of the determination. Specifically, referring to Fig. 6, step S140b may comprise:
S141b: in response to the average value of the first temperature information being greater than the average value of the second temperature information, determining that the first eye is the dominant eye;
S142b: in response to the average value of the first temperature information being less than the average value of the second temperature information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined according to the right eye temperature information and the left eye temperature information of the user. Specifically, referring to Fig. 7, step S140b may comprise:
S141b': in response to the average value of the first temperature information being greater than the threshold, determining that the first eye is the dominant eye;
S142b': in response to the average value of the first temperature information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the left eye temperature information and the right eye temperature information of the user are collected in advance and analyzed. Suppose the average value of the left eye temperature information falls into a first interval (L_Tmin, L_Tmax), the average value of the right eye temperature information falls into a second interval (R_Tmin, R_Tmax), and the right eye is the dominant eye, so that L_Tmax < R_Tmin. The threshold can then be determined as M_T with L_Tmax < M_T < R_Tmin; that is, the threshold M_T is a value between the first interval and the second interval.
Therefore, if the average value of the first temperature information is greater than the threshold M_T, it is considered to fall into the second interval and the first eye is the dominant eye of the user; if the average value of the first temperature information is less than the threshold M_T, it is considered to fall into the first interval and the first eye is not the dominant eye of the user.
In general, the temperature of the user's dominant eye can be 0.1 to 1.2 °C higher than the temperature of the non-dominant eye, and the threshold M_T can be set reasonably according to this temperature difference.
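A hedged sketch of the temperature variant (steps S141b'/S142b'); the readings, the threshold value and the use of a plain arithmetic mean are all illustrative assumptions rather than details from the patent.

```python
def mean_value(samples):
    """Mean of the sampled eyeball-temperature values."""
    return sum(samples) / len(samples)


def dominant_by_temperature(first_temp_samples, threshold_m_t):
    """S141b'/S142b': the first eye is taken as dominant when its mean
    eyeball temperature exceeds the calibrated threshold M_T."""
    return mean_value(first_temp_samples) > threshold_m_t


# Made-up readings in degrees Celsius; the dominant eye is described as
# running roughly 0.1 to 1.2 °C warmer, so M_T sits inside that gap.
print(dominant_by_temperature([34.62, 34.58, 34.65, 34.60], threshold_m_t=34.45))  # True
```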
C) The first body sensing information may be the myoelectric information of the first eye, i.e. first myoelectric information. Step S140 is then:
S140c: determining, according to the first myoelectric information and reference information, whether the first eye is the dominant eye.
As mentioned above, the inventor found in research that the eye movement frequency and deflection angle of a person's dominant eye are significantly higher than those of the non-dominant eye. In other words, the contraction frequency and amplitude of the muscles controlling the dominant eye are higher than those of the muscles controlling the non-dominant eye.
The inventor also found in the course of research that, when a muscle contracts under different loads, the amplitude of the myoelectric information is proportional to the muscular force: the greater the tension produced by the muscle, the greater the amplitude of the myoelectric information. Furthermore, when the muscle contracts at an intensity below 40% MVC (maximum static muscular force), the muscular force and the EMG amplitude are linearly related; at intensities above 60% MVC they are also linearly related, but with a larger slope. Between 40% and 60% MVC the relationship between muscular force and EMG amplitude is no longer linear, but remains a positive one.
Fig. 8 is a schematic comparison of the myoelectric information obtained by monitoring, over a period of time, the muscles controlling the dominant eye and the muscles controlling the non-dominant eye: the horizontal axis represents time, the vertical axis represents the amplitude of the EMG signal, the solid curve is the EMG curve of the dominant eye and the dashed curve is the EMG curve of the non-dominant eye. It can be seen that the amplitude of the EMG signal of the dominant eye is generally higher than that of the non-dominant eye. Based on this principle, the dominant eye can be determined.
In one embodiment, the reference information is second myoelectric information of a second eye of the user. The method may further comprise:
S130c: acquiring second myoelectric information of a second eye of the user as the reference information.
For example, two sets of myoelectric sensors may be provided to collect the corresponding myoelectric information from the first eye and the second eye of the user simultaneously, and the myoelectric information of the second eye, i.e. the second myoelectric information, is used as the reference information.
Each eye has six muscles controlling the rotation of the eyeball: the superior and inferior rectus, the medial and lateral rectus, and the superior and inferior oblique muscles. They are innervated by the oculomotor nerve, the trochlear nerve and the abducens nerve. The medial and lateral rectus turn the eyeball inward or outward; when the superior or inferior rectus contracts, the eyeball turns up or down and is also turned inward; the superior oblique mainly rotates the eyeball inward, while also turning it downward and outward; the inferior oblique mainly rotates the eyeball outward, while also turning it upward and outward. The myoelectric information may be obtained by monitoring all or some of these six muscles.
In this embodiment, step S140c may determine whether the first eye is the dominant eye by comparing the average amplitude of the first myoelectric information with the average amplitude of the second myoelectric information. The average amplitude of the first myoelectric information is the mean of the EMG amplitude values corresponding to a plurality of sampling points in the first myoelectric information; similarly, the average amplitude of the second myoelectric information is the mean of the EMG amplitude values corresponding to a plurality of sampling points in the second myoelectric information. Using averages avoids a wrong determination caused by a sampling error at a single sampling point and improves the accuracy of the determination. Specifically, referring to Fig. 9, step S140c may comprise:
S141c: in response to the average amplitude of the first myoelectric information being greater than the average amplitude of the second myoelectric information, determining that the first eye is the dominant eye;
S142c: in response to the average amplitude of the first myoelectric information being less than the average amplitude of the second myoelectric information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined according to the average amplitude of the right eye myoelectric information and the average amplitude of the left eye myoelectric information of the user. Specifically, referring to Fig. 10, step S140c may comprise:
S141c': in response to the average amplitude of the first myoelectric information being greater than the threshold, determining that the first eye is the dominant eye;
S142c': in response to the average amplitude of the first myoelectric information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the right eye myoelectric information and the left eye myoelectric information of the user are collected in advance and analyzed. Suppose the average amplitude of the right eye myoelectric information falls into a first interval (R_Mmin, R_Mmax), the average amplitude of the left eye myoelectric information falls into a second interval (L_Mmin, L_Mmax), and the right eye is the dominant eye, so that L_Mmax < R_Mmin. The threshold can then be determined as M_M with L_Mmax < M_M < R_Mmin; that is, the threshold M_M is a value between the first interval and the second interval.
Therefore, if the average amplitude of the first myoelectric information is greater than the threshold M_M, it is considered to fall into the first interval and the first eye is the dominant eye of the user; if the average amplitude of the first myoelectric information is less than the threshold M_M, it is considered to fall into the second interval and the first eye is not the dominant eye of the user.
In general, the average amplitude of the myoelectric information of the dominant eye can be more than 5% higher than the average amplitude of the myoelectric information of the non-dominant eye, and the threshold M_M can be set accordingly.
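The description allows the myoelectric information to come from all or some of the six extraocular muscles. The sketch below shows one possible way to pool several EMG channels into a single mean amplitude and apply the threshold M_M; the pooling strategy and all values are assumptions, not details stated in the patent.

```python
def pooled_emg_mean_amplitude(channels):
    """Pool the samples of one or more extraocular-muscle EMG channels and
    return their overall mean amplitude."""
    samples = [abs(s) for channel in channels for s in channel]
    return sum(samples) / len(samples)


def dominant_by_emg(first_eye_channels, threshold_m_m):
    """S141c'/S142c': compare the pooled mean EMG amplitude of the first
    eye's muscles with the calibrated threshold M_M."""
    return pooled_emg_mean_amplitude(first_eye_channels) > threshold_m_m


# Two made-up channels (e.g. lateral and medial rectus), arbitrary units:
channels = [[0.42, 0.55, 0.48], [0.39, 0.61, 0.52]]
print(dominant_by_emg(channels, threshold_m_m=0.45))  # True (pooled mean is 0.495)
```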
D) The first body sensing information may be the EEG information corresponding to the first eye, i.e. first EEG information.
Step S140 is then:
S140d: determining, according to the first EEG information and reference information, whether the first eye is the dominant eye.
As mentioned above, when the user's eyeball rotates, a potential difference is produced between the retina and the cornea of the eyeball. The inventor found in the course of research that this potential difference is also reflected in the EEG signals, for example in the EEG signals corresponding to the FP1 and FP2 regions of the brain.
Fig. 11 is a schematic comparison of the EEG signals corresponding to the user's left eye and right eye, where the EEG signal corresponding to the left eye is collected at position FP1 and the EEG signal corresponding to the right eye is collected at position FP2: the horizontal axis represents time, the vertical axis represents the amplitude of the EEG signal, the solid curve is the EEG curve of the dominant eye and the dashed curve is the EEG curve of the non-dominant eye. It can be seen that the amplitude of the EEG signal of the dominant eye is generally higher than that of the non-dominant eye. Based on this principle, the dominant eye can be determined.
In one embodiment, the reference information is second EEG information corresponding to a second eye of the user. The method may further comprise:
S130d: acquiring second EEG information corresponding to a second eye of the user as the reference information.
For example, two sets of EEG sensors may be provided to collect, simultaneously in the FP1 region and the FP2 region of the brain, the EEG information corresponding to the first eye and to the second eye, and the EEG information corresponding to the second eye, i.e. the second EEG information, is used as the reference information.
In this embodiment, step S140d may determine whether the first eye is the dominant eye by comparing the average amplitude of the first EEG information with the average amplitude of the second EEG information. The average amplitude of the first EEG information is the mean of the EEG amplitude values corresponding to a plurality of sampling points in the first EEG information; similarly, the average amplitude of the second EEG information is the mean of the EEG amplitude values corresponding to a plurality of sampling points in the second EEG information. Using averages avoids a wrong determination caused by a sampling error at a single sampling point and improves the accuracy of the determination. Specifically, referring to Fig. 12, step S140d may comprise:
S141d: in response to the average amplitude of the first EEG information being greater than the average amplitude of the second EEG information, determining that the first eye is the dominant eye;
S142d: in response to the average amplitude of the first EEG information being less than the average amplitude of the second EEG information, determining that the first eye is not the dominant eye.
In another embodiment, the reference information may be a threshold determined according to the average amplitude of the left eye EEG information and the average amplitude of the right eye EEG information of the user. Specifically, referring to Fig. 13, step S140d may comprise:
S141d': in response to the average amplitude of the first EEG information being greater than the threshold, determining that the first eye is the dominant eye;
S142d': in response to the average amplitude of the first EEG information being less than the threshold, determining that the first eye is not the dominant eye.
For example, the left eye EEG information and the right eye EEG information of the user are collected in advance and analyzed. Suppose the average amplitude of the right eye EEG information falls into a first interval (R_Emin, R_Emax), the average amplitude of the left eye EEG information falls into a second interval (L_Emin, L_Emax), and the right eye is the dominant eye, so that L_Emax < R_Emin. The threshold can then be determined as M_E with L_Emax < M_E < R_Emin; that is, the threshold M_E is a value between the first interval and the second interval.
Therefore, if the average amplitude of the first EEG information is greater than the threshold M_E, it is considered to fall into the first interval and the first eye is the dominant eye; if the average amplitude of the first EEG information is less than the threshold M_E, it is considered to fall into the second interval and the first eye is not the dominant eye.
In general, the average amplitude of the EEG information corresponding to the dominant eye can be more than 5% higher than the average amplitude of the EEG information corresponding to the non-dominant eye, and the threshold M_E can be set accordingly.
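Steps S140a to S140d share one decision rule: compare the mean amplitude (or mean value) of the first eye's signal with a reference that is either the second eye's signal or a calibrated scalar threshold. A compact sketch of that shared rule follows, with invented names and values.

```python
from numbers import Real


def determine_dominant(first_signal, reference):
    """Shared rule behind S140a-S140d: `reference` is either a calibrated
    scalar threshold (M_O, M_T, M_M or M_E) or the second eye's sampled
    signal, in which case the two mean amplitudes are compared."""
    first_mean = sum(abs(s) for s in first_signal) / len(first_signal)
    if isinstance(reference, Real):                        # threshold variant
        return first_mean > reference
    ref_mean = sum(abs(s) for s in reference) / len(reference)
    return first_mean > ref_mean                           # second-eye variant


print(determine_dominant([10.2, 11.5, 9.9], 8.0))              # threshold form
print(determine_dominant([10.2, 11.5, 9.9], [7.1, 6.8, 7.4]))  # two-eye form
```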
Referring to Fig. 14, in one embodiment the method may further comprise:
S150: performing an operation according to the determination result.
For example, if the determination result indicates that the first eye is the dominant eye and the user is playing a shooting game, the user may be prompted to aim with the first eye, to improve the immersive interactive experience; or, if the user is watching a 3D film, different views of an autostereoscopic 3D display may be presented to the dominant eye and the non-dominant eye of the user, to improve the viewing experience.
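A small sketch of step S150; the two example operations (shooting-game aiming prompt and autostereoscopic view routing) come from the paragraph above, but the function and setting names are invented for illustration.

```python
def perform_operation(first_eye_is_dominant, application):
    """S150: act on the determination result; the return value describes the
    (hypothetical) setting that would be applied."""
    dominant = "first eye" if first_eye_is_dominant else "second eye"
    if application == "shooting_game":
        return f"prompt the user to aim with the {dominant}"
    if application == "autostereo_3d":
        return f"route the primary view of the autostereoscopic display to the {dominant}"
    return "no operation defined for this application"


print(perform_operation(True, "shooting_game"))
```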
Referring to Fig. 15, in one embodiment the method may further comprise:
S160: receiving input information of the user.
The user may provide the input by means such as voice, buttons or gestures.
Referring to Fig. 16, in one embodiment the input information is dominant eye information, i.e. information indicating which eye is the dominant eye, and the method may further comprise:
S170: when the input information is dominant eye information, determining, according to the input information and the determination result, whether the first eye is the left eye or the right eye.
For example, if the input information indicates that the right eye of the user is the dominant eye and the determination result indicates that the first eye is the dominant eye, it can be determined that the first eye is the right eye.
After it is determined whether the first eye is the left eye or the right eye, further eye-specific settings can be made, for example automatically adjusting the lens power for the right eye to match the user's right eye myopia.
Referring to Fig. 17, in one embodiment the method may further comprise:
S180: when the input information comprises whether the first eye is the left eye or the right eye, determining, according to the input information and the determination result, whether the left eye or the right eye of the user is the dominant eye.
For example, if the input information indicates that the first eye is the left eye and the determination result indicates that the first eye is the dominant eye, it can be determined that the left eye of the user is the dominant eye; this information can be recorded for other applications to use.
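A sketch of the bookkeeping in steps S170 and S180, assuming exactly one eye is dominant and that eyes are labelled with the strings "left" and "right"; these labels and the single-dominant-eye assumption are illustrative, not stated requirements.

```python
def other(side):
    return "left" if side == "right" else "right"


def label_first_eye(dominant_eye_input, first_eye_is_dominant):
    """S170: the user states which eye is dominant; combined with the
    determination result, this tells whether the first eye is left or right."""
    return dominant_eye_input if first_eye_is_dominant else other(dominant_eye_input)


def label_dominant_eye(first_eye_side, first_eye_is_dominant):
    """S180: the user states whether the first eye is the left or right eye;
    combined with the determination result, this tells which eye is dominant."""
    return first_eye_side if first_eye_is_dominant else other(first_eye_side)


print(label_first_eye("right", True))    # "right", as in the S170 example above
print(label_dominant_eye("left", True))  # "left", as in the S180 example above
```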
In addition, an embodiment of the present application also provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the operations of steps S120 and S140 of the method in the embodiment shown in Fig. 1.
In summary, the method of the embodiments of the present application can determine whether the first eye is the dominant eye of the user according to the first body sensing information of the first eye of the user and reference information, and can perform corresponding operations according to the determination result, improving the user experience.
Fig. 18 is a schematic diagram of the module structure of the dominant eye determination device according to an embodiment of the invention. The dominant eye determination device may be provided as a functional module in a wearable device such as a smart helmet, smart glasses or a smart wig, or may itself be an independent wearable device for the user. As shown in Fig. 18, the device 1800 may comprise:
a first acquisition module 1810, configured to acquire first body sensing information of a first eye of a user;
a determination module 1820, configured to determine, according to the first body sensing information and reference information, whether the first eye is the dominant eye.
The device of the embodiments of the present application acquires first body sensing information of a first eye of a user and then determines, according to the first body sensing information and reference information, whether the first eye is the dominant eye. A dominant eye determination device is thereby provided, which allows the user's wearable device to be automatically configured according to the determination result, improving the user experience.
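For illustration only, a minimal object-oriented sketch of the module structure of device 1800 in Fig. 18; the sensor interface, the threshold-style reference and all names are assumptions, not details given in the patent.

```python
class FirstAcquisitionModule:
    """Module 1810: obtains the first body sensing information (EOG, EMG,
    temperature or EEG samples) of the user's first eye from a sensor."""
    def __init__(self, sensor):
        self.sensor = sensor  # assumed to expose read() -> list of samples

    def acquire(self):
        return self.sensor.read()


class DeterminationModule:
    """Module 1820: decides whether the first eye is the dominant eye by
    comparing the mean amplitude of the acquired signal with a scalar
    reference threshold."""
    def determine(self, first_info, reference_threshold):
        mean = sum(abs(s) for s in first_info) / len(first_info)
        return mean > reference_threshold


class DominantEyeDevice:
    """Device 1800, composed of the two modules above."""
    def __init__(self, sensor):
        self.acquisition = FirstAcquisitionModule(sensor)
        self.determination = DeterminationModule()

    def is_first_eye_dominant(self, reference_threshold):
        return self.determination.determine(self.acquisition.acquire(), reference_threshold)


class FakeSensor:
    def read(self):
        return [12.1, 15.4, 9.8]  # made-up EOG amplitudes


print(DominantEyeDevice(FakeSensor()).is_first_eye_dominant(reference_threshold=10.0))  # True
```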
The functions of the first acquisition module 1810 and the determination module 1820 are described in detail below with reference to specific embodiments.
The first acquisition module 1810 is configured to acquire first body sensing information of a first eye of a user.
The first eye is the left eye or the right eye of the user.
The first body sensing information may be electro-oculography (EOG) information, electromyography (EMG) information or temperature information of the first eye, or electroencephalography (EEG) information corresponding to the first eye, and may be acquired by a corresponding sensor or acquisition system. For example, the eye electrical information of the first eye may be acquired by at least one eye electrical sensor, the myoelectric information of the first eye may be acquired by at least one myoelectric sensor, the temperature information of the first eye may be acquired by at least one temperature sensor, and the EEG information corresponding to the first eye may be acquired by an EEG acquisition system.
The myoelectric information of the first eye may be the myoelectric information of the muscles corresponding to the first eye; the temperature information of the first eye may be the temperature of the eyeball of the first eye; when the first eye is the left eye, the corresponding EEG information may be the EEG information corresponding to the FP1 region of the brain, and when the first eye is the right eye, the corresponding EEG information may be the EEG information corresponding to the FP2 region of the brain.
The determination module 1820 is configured to determine, according to the first body sensing information and reference information, whether the first eye is the dominant eye.
A) described first body sense information can be the eye electrical information of described first eyes, i.e. First view electrical information.Described determination module 1820, for according to described First view electrical information and a reference information, determines whether described first eyes are leading eyes.
In one embodiment, described reference information is the Second Sight electrical information of second eyes of described user, and see Figure 19, described equipment 1800 also comprises:
One second acquisition module 1830a, for obtaining the Second Sight electrical information of second eyes of described user as described reference information.
Accordingly, described determination module 1820, for being greater than the averaged amplitude value of described Second Sight electrical information in response to the averaged amplitude value of described First view electrical information, determines that described first eyes are leading eyes; And the averaged amplitude value in response to described First view electrical information is less than the averaged amplitude value of described Second Sight electrical information, determine that described first eyes are not leading eyes.
In another embodiment, described reference information is the threshold value determined according to the averaged amplitude value of the left eye electrical information of described user and the averaged amplitude value of right eye electrical information.Concrete, see Figure 20, described equipment 1800 also comprises:
For the averaged amplitude value of the left eye electrical information according to described user and the averaged amplitude value of right eye electrical information, one first determination module 1840a, determines that a threshold value is as described reference information.
Accordingly, described determination module 1820, for being greater than described threshold value in response to the averaged amplitude value of described First view electrical information, determines that described first eyes are leading eyes; And the averaged amplitude value in response to described First view electrical information is less than described threshold value, determine that described first eyes are not leading eyes.
For example, the first determination module 1840a may collect the right eye electrical information and the left eye electrical information of the user in advance and analyze them. Suppose the average amplitude of the right eye electrical information falls into a first interval (R_Omin, R_Omax), the average amplitude of the left eye electrical information falls into a second interval (L_Omin, L_Omax), and the right eye is the dominant eye; then L_Omax < R_Omin, and the threshold can be determined as a value M_O satisfying L_Omax < M_O < R_Omin. That is, the threshold M_O is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first eye electrical information is greater than the threshold M_O, it is considered to fall into the first interval, and the first eye is the dominant eye of the user; if the average amplitude of the first eye electrical information is less than the threshold M_O, it is considered to fall into the second interval, and the first eye is not the dominant eye of the user.
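A sketch of how the first determination module 1840a might derive such a threshold from pre-collected calibration recordings; taking the midpoint between the two intervals is an illustrative choice, since the description only requires L_Omax < M_O < R_Omin:

```python
import numpy as np

def calibrate_threshold(left_trials, right_trials) -> float:
    """Derive a threshold lying strictly between the two eyes' amplitude intervals.

    left_trials, right_trials: lists of 1-D arrays, one array per calibration trial.
    Assumes the dominant eye's amplitudes lie entirely above the other eye's.
    """
    left_avgs = [float(np.mean(np.abs(t))) for t in left_trials]
    right_avgs = [float(np.mean(np.abs(t))) for t in right_trials]
    l_max, r_min = max(left_avgs), min(right_avgs)  # observed L_Omax and R_Omin
    if l_max >= r_min:
        raise ValueError("Calibration intervals overlap; no valid threshold exists.")
    return (l_max + r_min) / 2.0  # any value in (L_Omax, R_Omin) would do
```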
B) The first body sense information may be the temperature information of the first eye, i.e., first temperature information. In this case the determination module 1820 is configured to determine, according to the first temperature information and the reference information, whether the first eye is the dominant eye.
In one embodiment, the reference information is second temperature information of a second eye of the user. Referring to Figure 21, the device 1800 further comprises:
a second acquisition module 1830b, configured to acquire the second temperature information of the second eye of the user as the reference information.
Correspondingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the mean value of the first temperature information being greater than the mean value of the second temperature information, and to determine that the first eye is not the dominant eye in response to the mean value of the first temperature information being less than the mean value of the second temperature information.
In another embodiment, described reference information is the threshold value determined according to the meansigma methods of the left eye temperature information of described user and the meansigma methods of right eye temperature information, and see Figure 22, described equipment 1800 also comprises:
For the meansigma methods of the left eye temperature information according to described user and the meansigma methods of right eye temperature information, one first determination module 1840b, determines that a threshold value is as described reference information.
Accordingly, described determination module 1820, for being greater than described threshold value in response to the meansigma methods of described first temperature information, determines that described first eyes are leading eyes; And the meansigma methods in response to described first temperature information is less than described threshold value, determine that described first eyes are not leading eyes.
For example, the first determination module 1840b may collect the left eye temperature information and the right eye temperature information of the user in advance and analyze them. Suppose the mean value of the left eye temperature information falls into a first interval (L_Tmin, L_Tmax), the mean value of the right eye temperature information falls into a second interval (R_Tmin, R_Tmax), and the right eye is the dominant eye; then L_Tmax < R_Tmin, and the threshold can be determined as a value M_T satisfying L_Tmax < M_T < R_Tmin. That is, the threshold M_T is a value lying between the first interval and the second interval.
Therefore, if the mean value of the first temperature information is greater than the threshold M_T, it is considered to fall into the second interval, and the first eye is the dominant eye of the user; if the mean value of the first temperature information is less than the threshold M_T, it is considered to fall into the first interval, and the first eye is not the dominant eye of the user.
C) The first body sense information may also be the myoelectric information of the first eye, i.e., first myoelectric information. In this case the determination module 1820 is configured to determine, according to the first myoelectric information and the reference information, whether the first eye is the dominant eye.
In one embodiment, the reference information is second myoelectric information of a second eye of the user. Referring to Figure 23, the device 1800 may further comprise:
a second acquisition module 1830c, configured to acquire the second myoelectric information of the second eye of the user as the reference information.
Correspondingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first myoelectric information being greater than the average amplitude of the second myoelectric information, and to determine that the first eye is not the dominant eye in response to the average amplitude of the first myoelectric information being less than the average amplitude of the second myoelectric information.
In another embodiment, described reference information is the threshold value determined according to the averaged amplitude value of the left eye myoelectric information of described user and the averaged amplitude value of right eye myoelectric information, and see Figure 24, described equipment 1800 can also comprise:
For the averaged amplitude value of the left eye myoelectric information according to described user and the averaged amplitude value of right eye myoelectric information, one first determination module 1840c, determines that a threshold value is as described reference information.
Accordingly, described determination module 1820, for being greater than described threshold value in response to the averaged amplitude value of described first myoelectric information, determines that described first eyes are leading eyes; And,
Averaged amplitude value in response to described first myoelectric information is less than described threshold value, determines that described first eyes are not leading eyes.
For example, the first determination module 1840c may collect the right eye myoelectric information and the left eye myoelectric information of the user in advance and analyze them. Suppose the average amplitude of the right eye myoelectric information falls into a first interval (R_Mmin, R_Mmax), the average amplitude of the left eye myoelectric information falls into a second interval (L_Mmin, L_Mmax), and the right eye is the dominant eye; then L_Mmax < R_Mmin, and the threshold can be determined as a value M_M satisfying L_Mmax < M_M < R_Mmin. That is, the threshold M_M is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first myoelectric information is greater than the threshold M_M, it is considered to fall into the first interval, and the first eye is the dominant eye of the user; if the average amplitude of the first myoelectric information is less than the threshold M_M, it is considered to fall into the second interval, and the first eye is not the dominant eye of the user.
D) The first body sense information may also be the brain electrical information corresponding to the first eye, i.e., first brain electrical information. In this case the determination module 1820 is configured to determine, according to the first brain electrical information and the reference information, whether the first eye is the dominant eye.
In one embodiment, the reference information may be second brain electrical information corresponding to a second eye of the user. Referring to Figure 25, the device 1800 may further comprise:
a second acquisition module 1830d, configured to acquire the second brain electrical information corresponding to the second eye of the user as the reference information.
Correspondingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first brain electrical information being greater than the average amplitude of the second brain electrical information, and to determine that the first eye is not the dominant eye in response to the average amplitude of the first brain electrical information being less than the average amplitude of the second brain electrical information.
In another embodiment, the reference information may be a threshold determined according to the average amplitude of the left eye brain electrical information of the user and the average amplitude of the right eye brain electrical information of the user. Referring to Figure 26, the device 1800 may comprise:
a first determination module 1840d, configured to determine a threshold as the reference information according to the average amplitude of the left eye brain electrical information of the user and the average amplitude of the right eye brain electrical information of the user.
Correspondingly, the determination module 1820 is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first brain electrical information being greater than the threshold, and to determine that the first eye is not the dominant eye in response to the average amplitude of the first brain electrical information being less than the threshold.
For example, the first determination module 1840d may collect the left eye brain electrical information and the right eye brain electrical information of the user in advance and analyze them. Suppose the average amplitude of the right eye brain electrical information falls into a first interval (R_Emin, R_Emax), the average amplitude of the left eye brain electrical information falls into a second interval (L_Emin, L_Emax), and the right eye is the dominant eye; then L_Emax < R_Emin, and the threshold can be determined as a value M_E satisfying L_Emax < M_E < R_Emin. That is, the threshold M_E is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first brain electrical information is greater than the threshold M_E, it is considered to fall into the first interval, and the first eye is the dominant eye; if the average amplitude of the first brain electrical information is less than the threshold M_E, it is considered to fall into the second interval, and the first eye is not the dominant eye.
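A sketch combining the channel mapping given earlier (FP1 for the left eye, FP2 for the right eye) with the threshold comparison of this embodiment; the dictionary-based channel access and the mean-absolute-amplitude measure are illustrative assumptions, not requirements of the description:

```python
import numpy as np

# Channel mapping stated in the description: FP1 for the left eye, FP2 for the right eye.
EYE_TO_CHANNEL = {"left": "FP1", "right": "FP2"}

def eye_is_dominant_by_eeg(eeg_channels: dict, eye: str, threshold: float) -> bool:
    """Decide dominance from the EEG channel corresponding to the given eye.

    eeg_channels: mapping from channel name ("FP1", "FP2") to a 1-D sample array.
    eye: "left" or "right".
    threshold: the pre-determined value M_E lying between the two amplitude intervals.
    """
    samples = np.asarray(eeg_channels[EYE_TO_CHANNEL[eye]])
    return float(np.mean(np.abs(samples))) > threshold
```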
Referring to Figure 27, in one embodiment, the device 1800 may further comprise:
an execution module 1850, configured to perform an operation according to the determination result.
For example, if the determination result shows that the first eye is the dominant eye and the user is playing a shooting game, the execution module 1850 may prompt the user to aim with the first eye, so as to improve the user's immersive interactive experience; or, if the user is watching a 3D film, the execution module 1850 may perform autostereoscopic 3D display with different view angles for the user's dominant eye and non-dominant eye, so as to improve the user's visual experience.
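A sketch of such result-dependent behaviour in the execution module 1850; the application identifiers and prompt texts are illustrative and not specified by the description:

```python
def act_on_result(first_eye_is_dominant: bool, application: str) -> str:
    """Choose an operation from the determination result, mirroring the two examples above."""
    if application == "shooting_game":
        return ("Prompt the user to aim with the first eye."
                if first_eye_is_dominant else "No aiming prompt needed.")
    if application == "3d_film":
        # Render different view angles for the dominant and the non-dominant eye.
        return ("Send the dominant-eye view to the first eye."
                if first_eye_is_dominant
                else "Send the non-dominant-eye view to the first eye.")
    return "No special handling."
```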
Referring to Figure 28, in one embodiment, the device 1800 may further comprise:
a receiver module 1860, configured to receive input information of the user.
The user may provide the input by means such as voice, buttons, or gestures.
Referring to Figure 29, in one embodiment, the input information is dominant eye information, and the device 1800 may further comprise:
a second determination module 1870, configured to determine, according to the input information and the determination result, whether the first eye is the left eye or the right eye.
For example, if the input information shows that the user's right eye is the dominant eye and the determination result shows that the first eye is the dominant eye, it can be determined that the first eye is the right eye.
After it is determined whether the first eye is the left eye or the right eye, further left/right-specific settings can be made, for example automatically adjusting the lens power corresponding to the right eye to match the user's right-eye myopia degree.
Referring to Figure 30, in one embodiment, the input information indicates whether the first eye is the left eye or the right eye, and the device 1800 may further comprise:
a third determination module 1880, configured to determine, according to the input information and the determination result, whether the left eye or the right eye of the user is the dominant eye.
For example, if the input information shows that the first eye is the left eye and the determination result shows that the first eye is the dominant eye, it can be determined that the user's left eye is the dominant eye; this information can be recorded for other applications to call.
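Minimal sketches of the logic of the second determination module 1870 and the third determination module 1880, assuming the user input and the determination result are available as the simple values shown and that exactly one eye is dominant (both assumptions made for illustration):

```python
def which_eye_is_first(dominant_eye_input: str, first_eye_is_dominant: bool) -> str:
    """Second determination module 1870: infer whether the first eye is "left" or "right"
    from the user's input naming the dominant eye and the determination result."""
    other = "left" if dominant_eye_input == "right" else "right"
    return dominant_eye_input if first_eye_is_dominant else other

def which_eye_is_dominant(first_eye_side: str, first_eye_is_dominant: bool) -> str:
    """Third determination module 1880: infer which eye is dominant from the user's
    input naming the first eye's side and the determination result."""
    other = "left" if first_eye_side == "right" else "right"
    return first_eye_side if first_eye_is_dominant else other
```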
One application scenario of the dominant eye determining method and device of the embodiments of the present application may be as follows: a user wears a smart helmet and plays a shooting game; during the first minute of the game, the smart helmet collects, at the FP1 and FP2 regions of the user's brain, the EEG signals corresponding to the user's two eyes, and determines from the analysis result that the user's left eye is the dominant eye; the user is then prompted to aim with the left eye during the game, and by aiming with the left eye as prompted significantly improves shooting accuracy, improving the user experience.
The hardware structure of the dominant eye determining device of an embodiment of the present application is shown in Figure 31. The specific embodiments of the present application do not limit the specific implementation of the dominant eye determining device. Referring to Figure 31, the device 3100 may comprise:
a processor 3110, a communication interface (Communications Interface) 3120, a memory 3130, and a communication bus 3140, wherein:
the processor 3110, the communication interface 3120, and the memory 3130 communicate with one another through the communication bus 3140;
the communication interface 3120 is configured to communicate with other network elements.
The processor 3110 is configured to execute a program 3132, and specifically may perform the relevant steps of the method embodiment shown in Fig. 1 above.
In particular, the program 3132 may include program code, and the program code includes computer operation instructions.
The processor 3110 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 3130 is configured to store the program 3132. The memory 3130 may include a high-speed RAM memory, and may also include a non-volatile memory, for example at least one disk memory. Specifically, the program 3132 may perform the following steps:
obtaining first body sense information of a first eye of a user;
determining, according to the first body sense information and reference information, whether the first eye is the dominant eye.
For the specific implementation of each step in the program 3132, reference may be made to the corresponding steps and modules in the embodiments above, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the device and modules described above may refer to the corresponding processes in the foregoing method embodiments, and are not repeated here.
Those of ordinary skill in the art may appreciate that the units and method steps of the examples described in the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered as going beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that contributes to the prior art in essence, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a controller, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a portable hard drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are only intended to illustrate the present application and not to limit it; those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present application, and therefore all equivalent technical solutions also fall within the scope of the present application, whose scope of patent protection should be defined by the claims.

Claims (10)

1. A dominant eye determining method, characterized in that the method comprises:
obtaining first eye electrical information of a first eye of a user; and
determining, according to the first eye electrical information and reference information, whether the first eye is a dominant eye.
2. The method according to claim 1, characterized in that the reference information is a threshold determined according to an average amplitude of left eye electrical information of the user and an average amplitude of right eye electrical information of the user.
3. The method according to claim 2, characterized in that determining, according to the first eye electrical information and the reference information, whether the first eye is a dominant eye comprises:
in response to the average amplitude of the first eye electrical information being greater than the threshold, determining that the first eye is the dominant eye; and
in response to the average amplitude of the first eye electrical information being less than the threshold, determining that the first eye is not the dominant eye.
4. A dominant eye determining device, characterized in that the device comprises:
a first acquisition module, configured to obtain first eye electrical information of a first eye of a user; and
a determination module, configured to determine, according to the first eye electrical information and reference information, whether the first eye is a dominant eye.
5. The device according to claim 4, characterized in that the device further comprises:
a first determination module, configured to determine a threshold as the reference information according to an average amplitude of left eye electrical information of the user and an average amplitude of right eye electrical information of the user.
6. The device according to claim 5, characterized in that the determination module is configured to determine that the first eye is the dominant eye in response to the average amplitude of the first eye electrical information being greater than the threshold; and
to determine that the first eye is not the dominant eye in response to the average amplitude of the first eye electrical information being less than the threshold.
7. The device according to claim 4, characterized in that the device further comprises:
a second acquisition module, configured to obtain second eye electrical information of a second eye of the user as the reference information.
8. A wearable device, characterized in that the wearable device comprises the dominant eye determining device according to any one of claims 4 to 7.
9. A dominant eye determining method, characterized in that the method comprises:
obtaining first body sense information of a first eye of a user; and
determining, according to the first body sense information and reference information, whether the first eye is a dominant eye.
10. A dominant eye determining device, characterized in that the device comprises:
a first acquisition module, configured to obtain first body sense information of a first eye of a user; and
a determination module, configured to determine, according to the first body sense information and reference information, whether the first eye is a dominant eye.
CN201410643724.8A 2014-11-07 2014-11-07 Leading eye determines method and apparatus Active CN104367320B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201410643724.8A CN104367320B (en) 2014-11-07 2014-11-07 Leading eye determines method and apparatus
US15/525,040 US10646133B2 (en) 2014-11-07 2015-07-24 Dominant eye determining method and device
PCT/CN2015/085024 WO2016070653A1 (en) 2014-11-07 2015-07-24 Dominant eye determining method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410643724.8A CN104367320B (en) 2014-11-07 2014-11-07 Leading eye determines method and apparatus

Publications (2)

Publication Number Publication Date
CN104367320A true CN104367320A (en) 2015-02-25
CN104367320B CN104367320B (en) 2016-10-05

Family

ID=52546701

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410643724.8A Active CN104367320B (en) 2014-11-07 2014-11-07 Leading eye determines method and apparatus

Country Status (1)

Country Link
CN (1) CN104367320B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101019760A (en) * 2007-03-20 2007-08-22 重庆大学 System and method for separating binocular vision induced potentials
CN102264277A (en) * 2008-07-09 2011-11-30 劳伦斯·M·麦金利 Optic function monitoring process and apparatus
JP2012114545A (en) * 2010-11-22 2012-06-14 Jvc Kenwood Corp Video playback device
JP2012114544A (en) * 2010-11-22 2012-06-14 Jvc Kenwood Corp Video encoder
WO2012123658A1 (en) * 2011-03-11 2012-09-20 Essilor International (Compagnie Générale d'Optique) Method for determining the dominant eye
WO2012131182A1 (en) * 2011-03-25 2012-10-04 Essilor International (Compagnie Generale D'optique) Device and process for determining the dominant eye of a patient

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
管永清 et al.: "Visual Evoked Potentials and the Dominant Eye" (视觉诱发电位与优势眼), Modern Journal of Electrophysiology (现代电生理学杂志) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016070653A1 (en) * 2014-11-07 2016-05-12 Beijing Zhigu Rui Tuo Tech Co., Ltd. Dominant eye determining method and device
US10646133B2 (en) 2014-11-07 2020-05-12 Beijing Zhigu Rui Tuo Tech Co., Ltd Dominant eye determining method and device
CN106249846A (en) * 2015-06-29 2016-12-21 北京智谷睿拓技术服务有限公司 Light intensity control method and equipment
CN106249846B (en) * 2015-06-29 2020-03-17 北京智谷睿拓技术服务有限公司 Light intensity adjusting method and device
CN106371561A (en) * 2015-08-19 2017-02-01 北京智谷睿拓技术服务有限公司 Input information determination method and device
CN106371560A (en) * 2015-08-19 2017-02-01 北京智谷睿拓技术服务有限公司 Blowing and sucking determination method and device
CN106371560B (en) * 2015-08-19 2020-06-02 北京智谷睿拓技术服务有限公司 Method and apparatus for determining blowing and suction air

Also Published As

Publication number Publication date
CN104367320B (en) 2016-10-05

Similar Documents

Publication Publication Date Title
US10660570B2 (en) Attention, comprehension, and drowsiness monitoring via head mounted device supporting augmented and mixed reality experiences
US20170105622A1 (en) Monitoring pulse transmissions using radar
WO2020205744A1 (en) Systems and methods for control schemes based on neuromuscular data
CN104367320A (en) Method and device for determining dominant eye
US11331045B1 (en) Systems and methods for mitigating neuromuscular signal artifacts
WO2015196918A1 (en) Methods and apparatuses for electrooculogram detection, and corresponding portable devices
CN104182041B (en) Blink type determines method and blink type determination device
CN104360745A (en) Dominant limb determination method and dominant limb determination device
US10646133B2 (en) Dominant eye determining method and device
CN110023816A (en) The system for distinguishing mood or psychological condition
CN108829239A (en) Control method, device and the terminal of terminal
CN104049752A (en) Interaction method based on human body and interaction device based on human body
CN104375644A (en) Method and device for determining dominant eye
CN104360740A (en) Method and equipment for determining dominant eye
CN110178102A (en) Estimation in display
CN104407703A (en) Dominant limb determination method and apparatus
CN104407704A (en) Dominant limb determination method and apparatus
WO2017016941A1 (en) Wearable device, method and computer program product
JP6768597B2 (en) Dialogue system, control method of dialogue system, and device
CN108491792A (en) Office scene human-computer interaction Activity recognition method based on electro-ocular signal
CN104360739A (en) Method and equipment for determining dominant eye
WO2021045637A1 (en) Virtual reality interactive system for neuro-meditation and neuro-concentration training
CN104461017A (en) Interactive method, interactive device and user equipment
CN111625098B (en) Intelligent virtual avatar interaction method and device based on multi-channel information fusion
CN106055109B (en) A kind of brain-computer interface stimulus sequence generation method based on body-sensing electro photoluminescence

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant