CN106125926A - Information processing method and electronic device - Google Patents
- Publication number
- CN106125926A CN106125926A CN201610460487.0A CN201610460487A CN106125926A CN 106125926 A CN106125926 A CN 106125926A CN 201610460487 A CN201610460487 A CN 201610460487A CN 106125926 A CN106125926 A CN 106125926A
- Authority
- CN
- China
- Prior art keywords
- moving objects
- eyes
- degree
- information
- association
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Abstract
The invention discloses an information processing method and an electronic device, including: obtaining eye motion information of a user's eyes; obtaining M pieces of object motion information for M moving objects on a display unit, M being an integer greater than or equal to 1; and determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information. The technical solution provided by the present invention is used to solve the prior-art technical problem that eye-tracking accuracy is relatively low.
Description
Technical field
The present invention relates to the field of electronic technology, and in particular to an information processing method and an electronic device.
Background art
With the development of computer technology and the growth of consumer demand, more and more electronic products have come to market, and the techniques for controlling these products have multiplied as well, for example control by mouse or trackpad. To make electronic products more convenient to use, eye-movement control has emerged in the prior art.
At present, prior-art eye-movement control requires the user to first perform calibration against the display screen; without calibration, tracking accuracy is relatively low.
It can thus be seen that eye-tracking technology in the prior art suffers from the technical problem of relatively low accuracy.
Summary of the invention
The embodiments of the present invention provide an information processing method and an electronic device, which are used to solve the prior-art technical problem that eye-tracking accuracy is relatively low, so as to achieve the technical effect of improving the accuracy of eye-tracking technology.
In one aspect, an embodiment of the present application provides an information processing method, comprising the following steps:
obtaining eye motion information of a user's eyes;
obtaining M pieces of object motion information for M moving objects on a display unit, M being an integer greater than or equal to 1;
determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
Optionally, the obtaining of the eye motion information of the user's eyes includes:
obtaining a first motion direction of the user's eyes;
correspondingly, the obtaining of the M pieces of object motion information for the M moving objects on the display unit includes:
obtaining M motion directions of the M moving objects on the display unit.
Optionally, the determining of the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information includes:
calculating the angles between the first motion direction and the M motion directions, obtaining M angles in total;
determining, from the M angles, a first angle that is smaller than a preset angle;
taking the moving object corresponding to the first angle as the first moving object.
Optionally, the obtaining of the eye motion information of the user's eyes includes:
obtaining a first motion trajectory of the user's eyes;
correspondingly, the obtaining of the M pieces of object motion information for the M moving objects on the display unit includes:
obtaining M motion trajectories of the M moving objects on the display unit.
Optionally, the determining of the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information includes:
calculating the degrees of association between the first motion trajectory and the M motion trajectories, obtaining M degrees of association in total;
determining, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
taking the moving object corresponding to the first degree of association as the first moving object.
Optionally, when the first moving object is determined from the M moving objects, the method further includes:
moving the cursor to a position corresponding to the first moving object.
Optionally, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the method further includes:
obtaining second eye motion information of the user's eyes and/or head motion information of the user's head;
performing a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
In another aspect, an embodiment of the present application further provides an electronic device, including:
a housing;
a display unit;
an eye-movement monitoring component, disposed in the housing and connected to the display unit, configured to obtain eye motion information of a user's eyes;
a processing apparatus, disposed in the housing and connected to the eye-movement monitoring component, configured to obtain M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1, and to determine a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
Optionally, the eye-movement monitoring component is configured to:
obtain a first motion direction of the user's eyes;
correspondingly, the processing apparatus is configured to:
obtain M motion directions of the M moving objects on the display unit.
Optionally, the processing apparatus is configured to:
calculate the angles between the first motion direction and the M motion directions, obtaining M angles in total;
determine, from the M angles, a first angle that is smaller than a preset angle;
take the moving object corresponding to the first angle as the first moving object.
Optionally, the eye-movement monitoring component is configured to:
obtain a first motion trajectory of the user's eyes;
correspondingly, the processing apparatus is configured to:
obtain M motion trajectories of the M moving objects on the display unit.
Optionally, the processing apparatus is configured to:
calculate the degrees of association between the first motion trajectory and the M motion trajectories, obtaining M degrees of association in total;
determine, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
take the moving object corresponding to the first degree of association as the first moving object.
Optionally, when the first moving object is determined from the M moving objects, the processing apparatus is further configured to:
move the cursor to the position corresponding to the first moving object.
Optionally, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the electronic device further includes:
a sensing apparatus, configured to:
obtain second eye motion information of the user's eyes and/or head motion information of the user's head;
the processing apparatus is further configured to:
perform a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
Through one or more of the above embodiments of the present invention, at least the following technical effects can be achieved:
First, in the technical solution of the embodiment of the present application, the eye motion information of the user's eyes is obtained; M pieces of object motion information for M moving objects on the display unit are obtained, M being an integer greater than or equal to 1; and the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information. That is, unlike eye-movement control in the prior art, there is no need to first calibrate against the display screen (without which tracking accuracy would otherwise suffer); in this technical solution, the first moving object is determined from the motion information of the eyes and the M pieces of object motion information of the M moving objects on the display unit, and a corresponding operation is then performed on that moving object. Tracking accuracy is thus ensured even without calibration, which effectively solves the prior-art technical problem that eye-tracking accuracy is relatively low and achieves the technical effect of improving eye-tracking accuracy.
Second, in the technical solution of the embodiment of the present application, when the first moving object is determined from the M moving objects, the cursor is moved to the position corresponding to the first moving object. This technical solution gives the user the experience of the cursor following the eye movement, further achieving the technical effect of improving the user experience.
Third, in the technical solution of the embodiment of the present application, second eye motion information of the user's eyes and/or head motion information of the user's head is obtained, and a corresponding operation is performed on the first moving object according to the second eye motion information and/or the head motion information. That is, in this technical solution, after the first moving object is determined, the corresponding operation is performed on it only in response to the eye motion information of the user's eyes and/or the head motion information of the user's head, thereby avoiding accidental operation and further achieving the technical effect of improving the user experience.
Brief description of the drawings
Fig. 1 is an implementation flowchart of an information processing method provided by Embodiment 1 of the present application;
Fig. 2 is a schematic diagram of an eye motion trajectory and the motion trajectories of moving objects on the display unit in the information processing method provided by Embodiment 1 of the present application;
Fig. 3 is a schematic structural diagram of an electronic device provided by Embodiment 2 of the present application;
Fig. 4 is a schematic structural diagram of an electronic device provided by Embodiment 2 of the present application, embodied as a head-mounted device.
Detailed description of the invention
The technical solutions provided by the embodiments of the present application are used to solve the prior-art technical problem that eye-tracking accuracy is relatively low, so as to achieve the technical effect of improving the accuracy of eye-tracking technology.
The general idea of the technical solutions in the embodiments of the present application for solving the above technical problem is as follows:
obtaining eye motion information of a user's eyes;
obtaining M pieces of object motion information for M moving objects on a display unit, M being an integer greater than or equal to 1;
determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
In the above technical solution, the eye motion information of the user's eyes is obtained; M pieces of object motion information for M moving objects on the display unit are obtained, M being an integer greater than or equal to 1; and the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information. That is, unlike eye-movement control in the prior art, there is no need to first calibrate against the display screen (without which tracking accuracy would otherwise suffer); the first moving object is determined from the motion information of the eyes and the M pieces of object motion information of the M moving objects on the display unit, and a corresponding operation is then performed on that moving object. Tracking accuracy is thus ensured even without calibration, which effectively solves the prior-art technical problem that eye-tracking accuracy is relatively low and achieves the technical effect of improving eye-tracking accuracy.
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those of ordinary skill in the art may still derive other drawings from the provided drawings without creative effort.
Embodiment one
Referring to Fig. 1, an information processing method provided by Embodiment 1 of the present application includes:
S101: obtaining eye motion information of a user's eyes;
S102: obtaining M pieces of object motion information for M moving objects on a display unit, M being an integer greater than or equal to 1;
S103: determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
The information processing method provided by this embodiment may be applied to an electronic device such as a head-mounted device with a camera, a tablet computer, a smartphone, or a notebook computer, or another electronic device; the examples are not enumerated one by one here.
In the embodiment of the present application, step S101 is performed first: obtaining the eye motion information of the user's eyes.
In the embodiment of the present application, the eye movement information may be obtained by an eye-tracking device in the electronic device; alternatively, a face image may be captured by a camera on the electronic device, the eye image detected from the face image by a face-recognition model, and the eye image then processed by grayscale conversion, edge detection, and the like to obtain the eye movement information. In the embodiment of the present application, the eye movement information includes, but is not limited to: fixation point, fixation count, saccade distance, or pupil size. Specifically, eye movement has three basic modes: fixation, saccade, and smooth pursuit.
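As an illustrative aside (not part of the patent's disclosure), the grayscale-and-threshold preprocessing mentioned above can be sketched as follows: the centroid of the dark pixels in a grayscale eye image gives a rough per-frame pupil-center estimate, and successive centroids across frames yield the eye motion information. The threshold value and the list-of-rows image format are assumptions made for the sketch.

```python
def pupil_center(gray, threshold=60):
    """Centroid (row, col) of pixels darker than `threshold`, or None
    if no pixel is dark enough (e.g. the eye is closed)."""
    dark = [(r, c)
            for r, row in enumerate(gray)
            for c, v in enumerate(row)
            if v < threshold]
    if not dark:
        return None
    n = len(dark)
    return (sum(r for r, _ in dark) / n, sum(c for _, c in dark) / n)

# A 4x5 synthetic "eye image": bright sclera with a dark 2x2 pupil patch.
img = [
    [200, 200, 200, 200, 200],
    [200,  30,  30, 200, 200],
    [200,  30,  30, 200, 200],
    [200, 200, 200, 200, 200],
]
print(pupil_center(img))  # → (1.5, 1.5)
```

Tracking the centroid across frames (rather than within one frame) is what turns this localization step into a motion direction or motion trajectory.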
The motion information of the eyes is then derived from the eye movement information obtained. In the embodiment of the present application, the motion information of the eyes falls into two specific cases, each of which is described in detail below.
In the first case, the specific implementation of step S101 includes the following steps:
obtaining a first motion direction of the user's eyes.
Correspondingly, for step S102 (obtaining M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1), the specific implementation includes the following steps:
obtaining M motion directions of the M moving objects on the display unit.
In specific implementation, the first motion direction of the user's eyes is obtained from the user's eye movement information, for example, relative to the display unit: moving rightward from the left side of the display unit; or moving leftward from the right side of the display unit; or moving downward from the top edge of the display unit; and so on.
Correspondingly, in the embodiment of the present application, while the eye motion direction of the user's eyes is obtained, the M motion directions of the M moving objects on the display unit are also obtained.
In the embodiment of the present application, the M moving objects may, for example, be as follows: 3 birds are shown on the display unit, and by tracking a particular bird the user can obtain that bird's details; or the user is playing a game in which a frog eats flies, 4 flies are shown on the display unit, and by gazing at a particular fly the user makes the frog eat the fly gazed at; or the user is performing a route inquiry in Baidu Maps, different target locations are shown on the display unit, and by gazing at target locations, for example target location A and target location B, a navigation route from A to B can be determined automatically.
In specific implementation, taking M = 2 as an example, the motion direction of the first moving object may specifically be rightward from the left side of the display unit, and the motion direction of the second moving object may be 30 degrees east of north relative to the center coordinate of the display unit; or the motion direction of the first moving object may be 30 degrees west of south relative to the center coordinate of the display unit, and the second moving object may move downward from the top edge of the display unit; or the directions may be others, which are not specifically limited in the embodiment of the present application.
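For illustration only (the patent does not prescribe any particular representation), a direction such as "30 degrees east of north" can be converted into a unit vector in screen coordinates, which makes the angle comparison of step S103 straightforward. The axis convention (north = up = +y, east = right = +x) is an assumption of this sketch.

```python
import math

def bearing_to_vector(deg_east_of_north):
    """Convert a screen-space compass bearing, measured in degrees from
    'north' (up) toward 'east' (right), into a unit vector (x, y)."""
    rad = math.radians(deg_east_of_north)
    return (math.sin(rad), math.cos(rad))

print(bearing_to_vector(30.0))  # roughly (0.5, 0.866)
```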
In the embodiment of the present application, when the eye motion information is specifically a motion direction, the specific implementation of step S103 (determining the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information) includes the following steps:
calculating the angles between the first motion direction and the M motion directions, obtaining M angles in total;
determining, from the M angles, a first angle that is smaller than a preset angle;
taking the moving object corresponding to the first angle as the first moving object.
In the embodiment of the present application, taking M = 2 as an example, the motion direction of the first moving object is rightward from the left side of the display unit, and the motion direction of the second moving object is 30 degrees east of north relative to the center coordinate of the display unit; the first motion direction of the eyes is rightward from the left side of the display unit. The preset angle may be 5 degrees, 10 degrees, or 15 degrees, or another angle; those of ordinary skill in the art may set it according to actual needs, and it is not specifically limited in the embodiment of the present application.
In specific implementation, taking a preset angle of 10 degrees as an example: the first angle, between the first motion direction and the motion direction of the first moving object, is 0 degrees; the second angle, between the first motion direction and the motion direction of the second moving object, is 30 degrees. Since the first angle is smaller than the preset angle of 10 degrees and the second angle is greater than the preset angle of 10 degrees, the first moving object, corresponding to the first angle, is determined to be the first moving object.
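The angle-comparison selection of step S103 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: motion directions are assumed to be given as 2-D vectors, and the sketch returns the index of the first object whose direction deviates from the eye's direction by less than the preset angle.

```python
import math

def angle_between(d1, d2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    n1 = math.hypot(d1[0], d1[1])
    n2 = math.hypot(d2[0], d2[1])
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))

def select_by_direction(eye_dir, object_dirs, preset_angle=10.0):
    """Index of the first moving object whose motion direction deviates
    from the eye's motion direction by less than preset_angle, or None
    if no object qualifies."""
    for i, d in enumerate(object_dirs):
        if angle_between(eye_dir, d) < preset_angle:
            return i
    return None

# M = 2, as in the worked example: object 1 moves rightward, object 2 at
# 30 degrees east of north (north = up, east = right).
eye = (1.0, 0.0)
objs = [(1.0, 0.0),
        (math.sin(math.radians(30)), math.cos(math.radians(30)))]
print(select_by_direction(eye, objs))  # → 0
```

With the 10-degree preset of the example, only the first object (angle 0 degrees) qualifies, so index 0 is selected, matching the determination above.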
In the second case, the specific implementation of step S101 includes the following steps:
obtaining a first motion trajectory of the user's eyes.
Correspondingly, for step S102 (obtaining M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1), the specific implementation includes the following steps:
obtaining M motion trajectories of the M moving objects on the display unit.
In specific implementation, the eye-tracking device in the electronic device may obtain the motion direction of the eyes, the changing posture of the motion, the motion trend, and the like, so as to determine the motion trajectory of the eyes, for example, relative to the display screen of the electronic device: moving rightward 5 centimeters from the left side of the display screen and then upward 5 centimeters; or moving downward 5 centimeters from the top edge of the display screen and then rightward 5 centimeters; or another motion trajectory; the examples are not enumerated one by one here.
Correspondingly, in the embodiment of the present application, while the eye motion trajectory of the user's eyes is obtained, the M motion trajectories of the M moving objects on the display unit are also obtained. In the embodiment of the present application, the M moving objects may specifically be: 3 birds shown on the display unit; or, when the user is playing the game in which a frog eats flies, 4 flies shown on the display unit; or other objects shown on the display unit, which are not specifically limited in the embodiment of the present application.
In specific implementation, taking M = 2 as an example, the motion trajectory of the first moving object is rightward 5 centimeters from the left side of the display screen and then downward 5 centimeters; the motion trajectory of the second moving object is downward 5 centimeters from the top edge of the display screen and then rightward 5 centimeters; or the trajectories may be others; the examples are not enumerated one by one here.
Correspondingly, in the embodiment of the present application, when the motion information of the eyes is specifically a motion trajectory of the eyes, the specific implementation of step S103 (determining the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information) includes the following steps:
calculating the degrees of association between the first motion trajectory and the M motion trajectories, obtaining M degrees of association in total;
determining, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
taking the moving object corresponding to the first degree of association as the first moving object.
In the embodiment of the present application, taking M = 2 as an example, the motion trajectory of the first moving object is rightward 5 centimeters from the left side of the display screen and then downward 5 centimeters; the motion trajectory of the second moving object is downward 5 centimeters from the top edge of the display screen and then rightward 5 centimeters; the first motion trajectory is rightward 5 centimeters from the left side of the display screen and then downward 5 centimeters. The preset degree of association may be 50%, 70%, or 80%, or another value; those of ordinary skill in the art may set it according to actual needs, and it is not specifically limited in the embodiment of the present application.
In specific implementation, to calculate the degrees of association between the first motion trajectory and the M motion trajectories, feature points may be extracted from the first motion trajectory and from the motion trajectories of the first and second moving objects, and the extracted feature points then compared. In the embodiment of the present application, a feature point is a point that reflects the shape of a motion trajectory, for example the starting point, a turning point, a peak point, or the end point of the trajectory.
In specific implementation, 4 feature points of the first motion trajectory, 4 feature points of the first moving object's trajectory, and 4 feature points of the second moving object's trajectory are extracted; after these feature points are extracted, it is determined whether the extracted feature points match. In the embodiment of the present application, determining whether two feature points match means determining whether the positional deviation between them lies within a preset range.
In specific implementation, taking a preset degree of association of 80% as an example: the 4 feature points of the first motion trajectory match the 4 feature points of the first moving object's trajectory completely, indicating that the first degree of association, between the first motion trajectory and the first moving object's trajectory, is 100%; only 2 of the 4 feature points of the first motion trajectory match the 4 feature points of the second moving object's trajectory, indicating that the second degree of association, between the first motion trajectory and the second moving object's trajectory, is 50%. Since the first degree of association is greater than the preset degree of association of 80% and the second degree of association is smaller than the preset degree of association of 80%, the first moving object, corresponding to the first degree of association, is determined to be the first moving object; a specific schematic diagram is shown in Fig. 2.
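The feature-point matching described above can be sketched as follows. This is an illustrative reading of the patent's description, under two assumptions: trajectories are given as ordered lists of feature-point coordinates (start, turning points, end), and "positional deviation within a preset range" means a Euclidean-distance tolerance.

```python
import math

def association_degree(traj_a, traj_b, tolerance=0.5):
    """Fraction of corresponding feature points whose positional
    deviation lies within `tolerance` (same units as the points)."""
    if len(traj_a) != len(traj_b):
        return 0.0
    matched = sum(
        1 for (xa, ya), (xb, yb) in zip(traj_a, traj_b)
        if math.hypot(xa - xb, ya - yb) <= tolerance
    )
    return matched / len(traj_a)

def select_by_trajectory(eye_traj, object_trajs, preset=0.8):
    """Index of the object whose trajectory's degree of association with
    the eye trajectory exceeds the preset degree, or None."""
    degrees = [association_degree(eye_traj, t) for t in object_trajs]
    best = max(range(len(degrees)), key=degrees.__getitem__)
    return best if degrees[best] > preset else None

# Feature points (in cm) for the worked example: the eyes and object 1
# both move right 5 cm then down 5 cm; object 2 moves down 5 cm then
# right 5 cm.
eye = [(0, 0), (5, 0), (5, -5)]
obj1 = [(0, 0), (5, 0), (5, -5)]
obj2 = [(0, 0), (0, -5), (5, -5)]
print(select_by_trajectory(eye, [obj1, obj2]))  # → 0
```

Here object 1 matches on all feature points (degree 100%) while object 2 matches only on the shared start and end points, so object 1 is selected, consistent with the example above.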
In the embodiment of the present application, after the first moving object is determined from the M moving objects, the method further includes:
moving the cursor to the position corresponding to the first moving object.
Further, to make operating the electronic terminal more natural and to realize eye-movement control of the electronic device, when the first moving object is determined, the cursor on the display unit is also moved to the position corresponding to the first moving object, so that the user experiences the cursor following the eye movement, thereby providing the user with a better experience.
Further, in the embodiment of the present application, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the method further includes:
obtaining second eye motion information of the user's eyes and/or head motion information of the user's head;
performing a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
In the embodiment of the present application, the user's eye actions, such as blinking, and/or head actions, such as shaking or nodding the head, may be recognized by a gravity sensor or the camera; these actions correspond to different instructions, and the different instructions realize different operations. Specifically, for example: when the moving object is determined to be the first moving object, bird A, and the user's head action is detected to be a nod, this indicates that the user wants to obtain the details of bird A, so an instruction is generated to show the details of bird A on the display unit; or, if the user's head action is recognized as turning the head to the left, bird A is shot at. In this embodiment, the head actions and their corresponding operations may also be made into a table; when a head action is recognized, the table is looked up to find the operation corresponding to that head action.
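The look-up table of head actions and operations can be sketched as a simple dictionary. The action and operation names below are hypothetical; the patent itself only gives nodding (show details) and turning the head left (shoot) as examples.

```python
# Hypothetical action/operation names; the patent only requires that a
# recognized head (or eye) action be looked up to find its operation.
ACTION_TABLE = {
    "nod": "show_details",   # e.g. show the details of bird A
    "turn_left": "shoot",    # e.g. shoot at bird A
    "blink": "confirm",
}

def operation_for(action):
    # Unrecognized actions map to no operation, which helps avoid the
    # accidental operation the patent seeks to prevent.
    return ACTION_TABLE.get(action)

print(operation_for("nod"))    # → show_details
print(operation_for("shake"))  # → None
```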
In the embodiment of the present application, after the corresponding moving object is determined, the corresponding operation is performed on that moving object only after a confirming action of the user's eyes or head, thereby avoiding accidental operation by the user and further achieving the technical effect of improving the user experience.
Embodiment two
Based on the same inventive concept, an embodiment of the present application further provides an electronic device; referring to Fig. 3, it includes:
a housing 30;
a display unit 31;
an eye-movement monitoring component 32, disposed in the housing 30 and connected to the display unit 31, configured to obtain eye motion information of a user's eyes;
a processing apparatus 33, disposed in the housing 30 and connected to the eye-movement monitoring component, configured to obtain M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1, and to determine a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
In the embodiment of the present application, the electronic device may specifically be a head-mounted device, a smartphone, a tablet computer, a notebook computer, or the like, or another electronic device, which is not specifically limited in the embodiment of the present application.
In the embodiment of the present application, the eye-movement monitoring component may specifically be a camera, and the processing apparatus may specifically be one central processing unit CPU1, or two central processing units CPU1 and CPU2, or another number of central processing units; those of ordinary skill in the art may decide according to actual needs, and this is not specifically limited in the embodiment of the present application.
Optionally, described eye moves monitoring component 32 and is used for:
Obtain first direction of motion of the eyes of user;
Accordingly, described processing means 33 is used for:
Obtain M the direction of motion of M Moving Objects on display unit.
Optionally, described processing means 33 is used for:
Calculate the angle between described first direction of motion and described M the direction of motion, obtain M angle altogether;
From described M angle, determine that angle is less than the first angle of default angle;
Using the Moving Objects corresponding with described first angle as described first Moving Objects.
Optionally, described eye moves monitoring component 32 and is used for:
Obtain the first movement locus of the eyes of user;
Accordingly, described processing means 33 is used for:
Obtain M movement locus of M Moving Objects on display unit.
Optionally, described processing means 33 is used for:
Calculate the degree of association between described first movement locus and described M movement locus, obtain M the degree of association altogether;
From described M the degree of association, determine that the degree of association is more than first degree of association of the default degree of association;
Using the Moving Objects corresponding with described first degree of association as the first Moving Objects.
Optionally, when determining the first Moving Objects from described M Moving Objects, described processing means 33 is also used
In:
Move the cursor on the position corresponding with described first Moving Objects.
Optionally, after the first moving object is determined from the M moving objects based on the eye movement information and the M pieces of object movement information, the electronic device further includes:
a sensing device 34, configured to:
obtain second eye movement information of the eyes of the user and/or head movement information of the head of the user.
The processing means 33 is further configured to:
perform a corresponding operation on the first moving object according to the second eye movement information and/or the head movement information.
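The confirm-by-gesture step above might be dispatched as in the sketch below. The gesture vocabulary (`blink`, `nod`, `shake`) and the operation names are illustrative assumptions; the patent leaves the concrete second eye/head movements and the resulting operations open.

```python
def operate_on_object(selected, eye_gesture=None, head_gesture=None):
    """Map a second eye gesture and/or head gesture to an operation on the
    already-selected first moving object. Gesture names are hypothetical."""
    operations = {
        "blink": "activate",   # e.g. open / click the tracked object
        "nod": "activate",
        "shake": "dismiss",    # e.g. cancel the selection
    }
    op = operations.get(eye_gesture) or operations.get(head_gesture)
    if op is None:
        return None  # unrecognized gesture: do nothing, avoiding misoperation
    return (op, selected)
```

Requiring an explicit second gesture before acting is what lets the scheme avoid misoperation: merely looking at an object selects it, but never triggers an operation on it.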
In the embodiments of the present application, taking a head-mounted device as a specific example of the electronic device, referring to Fig. 4, the head-mounted device includes a structural component 400 comprising a nose support 401 and ear mounts 402 for wearing the electronic terminal on the body of the user. The eye-movement monitoring component 32 may be located on the structural component 400, for example at the nose support 401, or anywhere on the display unit 31 from which the eye movement of the user can be monitored. When the electronic device is worn on the body of the user, the eye movement of the user is monitored by the eye-movement monitoring component 32. Further, the sensing device 34 for recognizing the eye movement and/or head movement of the user may also be located on the structural component 400; the sensing device may specifically be a gravity sensor or a camera, which those of ordinary skill in the art may select according to actual needs, and no particular limitation is imposed in the embodiments of the present application. Therefore, in the embodiments of the present application, when the user wears the electronic device on the body, the electronic device can be conveniently controlled by eye movement and/or head movement, with high control efficiency and little risk of misoperation.
Through one or more of the above embodiments of the present invention, at least the following technical effects can be achieved:
First, with the technical solution in the embodiments of the present application, the eye movement information of the eyes of the user is obtained; M pieces of object movement information of M moving objects on the display unit are obtained, M being an integer greater than or equal to 1; and the first moving object is determined from the M moving objects based on the eye movement information and the M pieces of object movement information. Unlike eye-movement control in the prior art, which first requires calibration of the display screen and whose tracking accuracy suffers without it, this technical solution determines the first moving object from the movement information of the eyes together with the M pieces of object movement information of the M moving objects on the display unit, and then operates on that moving object accordingly. Tracking accuracy is thus ensured even without calibration, which effectively solves the technical problem of low accuracy in prior-art eye-tracking technology and achieves the technical effect of improving eye-tracking accuracy.
Second, with the technical solution in the embodiments of the present application, when the first moving object is determined from the M moving objects, a cursor is moved to a position corresponding to the first moving object. This gives the user the experience of a cursor that moves with the eyes, further achieving the technical effect of improving the user experience.
Third, with the technical solution in the embodiments of the present application, second eye movement information of the eyes of the user and/or head movement information of the head is obtained, and a corresponding operation is performed on the first moving object according to the second eye movement information and/or the head movement information. That is, in this technical solution, after the first moving object is determined, it is operated on through the eye movement information of the eyes and/or the head movement information of the head of the user, thereby avoiding misoperation and further achieving the technical effect of improving the user experience.
Those skilled in the art should appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Specifically, the computer program instructions corresponding to the information processing method in the embodiments of the present application may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive. When the computer program instructions corresponding to the information processing method in the storage medium are read or executed by an electronic device, the following steps are performed:
obtaining eye movement information of the eyes of a user;
obtaining M pieces of object movement information of M moving objects on a display unit, M being an integer greater than or equal to 1;
determining a first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information.
Optionally, the computer instruction stored in the storage medium corresponding to the step of obtaining the eye movement information of the eyes of the user, in the course of being executed, specifically includes the following step:
obtaining a first movement direction of the eyes of the user.
Correspondingly, the computer instruction stored in the storage medium corresponding to the step of obtaining the M pieces of object movement information of the M moving objects on the display unit, in the course of being executed, specifically includes the following step:
obtaining M movement directions of the M moving objects on the display unit.
Optionally, the computer instruction stored in the storage medium corresponding to the step of determining the first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information, in the course of being executed, specifically includes the following steps:
calculating the angles between the first movement direction and the M movement directions, obtaining M angles in total;
determining, from the M angles, a first angle smaller than a preset angle;
taking the moving object corresponding to the first angle as the first moving object.
Optionally, the computer instruction stored in the storage medium corresponding to the step of obtaining the eye movement information of the eyes of the user, in the course of being executed, specifically includes the following step:
obtaining a first movement trajectory of the eyes of the user.
Correspondingly, the computer instruction stored in the storage medium corresponding to the step of obtaining the M pieces of object movement information of the M moving objects on the display unit, in the course of being executed, specifically includes the following step:
obtaining M movement trajectories of the M moving objects on the display unit.
Optionally, the computer instruction stored in the storage medium corresponding to the step of determining the first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information, in the course of being executed, specifically includes the following steps:
calculating the degree of association between the first movement trajectory and each of the M movement trajectories, obtaining M degrees of association in total;
determining, from the M degrees of association, a first degree of association greater than a preset degree of association;
taking the moving object corresponding to the first degree of association as the first moving object.
Optionally, the storage medium further stores another computer instruction, which is executed when the computer instruction corresponding to the step of determining the first moving object from the M moving objects is executed; this other computer instruction, in the course of being executed, specifically includes the following step:
moving a cursor to a position corresponding to the first moving object.
Optionally, the storage medium further stores another computer instruction, which is executed after the computer instruction corresponding to the step of determining the first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information is executed; this other computer instruction, in the course of being executed, specifically includes the following steps:
obtaining second eye movement information of the eyes of the user and/or head movement information of the head of the user;
performing a corresponding operation on the first moving object according to the second eye movement information and/or the head movement information.
To make the purposes, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings of the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, may make other changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art may make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to comprise these changes and modifications.
Claims (14)
1. An information processing method, comprising:
obtaining eye movement information of the eyes of a user;
obtaining M pieces of object movement information of M moving objects on a display unit, M being an integer greater than or equal to 1;
determining a first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information.
2. The method according to claim 1, characterised in that the obtaining of the eye movement information of the eyes of the user comprises:
obtaining a first movement direction of the eyes of the user;
and correspondingly, the obtaining of the M pieces of object movement information of the M moving objects on the display unit comprises:
obtaining M movement directions of the M moving objects on the display unit.
3. The method according to claim 2, characterised in that the determining of the first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information comprises:
calculating the angles between the first movement direction and the M movement directions, obtaining M angles in total;
determining, from the M angles, a first angle smaller than a preset angle;
taking the moving object corresponding to the first angle as the first moving object.
4. The method according to claim 1, characterised in that the obtaining of the eye movement information of the eyes of the user comprises:
obtaining a first movement trajectory of the eyes of the user;
and correspondingly, the obtaining of the M pieces of object movement information of the M moving objects on the display unit comprises:
obtaining M movement trajectories of the M moving objects on the display unit.
5. The method according to claim 4, characterised in that the determining of the first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information comprises:
calculating the degree of association between the first movement trajectory and each of the M movement trajectories, obtaining M degrees of association in total;
determining, from the M degrees of association, a first degree of association greater than a preset degree of association;
taking the moving object corresponding to the first degree of association as the first moving object.
6. The method according to any one of claims 1-5, characterised in that, when the first moving object is determined from the M moving objects, the method further comprises:
moving a cursor to a position corresponding to the first moving object.
7. The method according to claim 6, characterised in that, after the determining of the first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information, the method further comprises:
obtaining second eye movement information of the eyes of the user and/or head movement information of the head of the user;
performing a corresponding operation on the first moving object according to the second eye movement information and/or the head movement information.
8. An electronic device, comprising:
a housing;
a display unit;
an eye-movement monitoring component, arranged in the housing and connected to the display unit, configured to obtain eye movement information of the eyes of a user;
a processing means, arranged in the housing and connected to the eye-movement monitoring component, configured to obtain M pieces of object movement information of M moving objects on the display unit, M being an integer greater than or equal to 1, and to determine a first moving object from the M moving objects based on the eye movement information and the M pieces of object movement information.
9. The electronic device according to claim 8, characterised in that the eye-movement monitoring component is configured to:
obtain a first movement direction of the eyes of the user;
and correspondingly, the processing means is configured to:
obtain M movement directions of the M moving objects on the display unit.
10. The electronic device according to claim 9, characterised in that the processing means is configured to:
calculate the angles between the first movement direction and the M movement directions, obtaining M angles in total;
determine, from the M angles, a first angle smaller than a preset angle;
take the moving object corresponding to the first angle as the first moving object.
11. The electronic device according to claim 8, characterised in that the eye-movement monitoring component is configured to:
obtain a first movement trajectory of the eyes of the user;
and correspondingly, the processing means is configured to:
obtain M movement trajectories of the M moving objects on the display unit.
12. The electronic device according to claim 11, characterised in that the processing means is configured to:
calculate the degree of association between the first movement trajectory and each of the M movement trajectories, obtaining M degrees of association in total;
determine, from the M degrees of association, a first degree of association greater than a preset degree of association;
take the moving object corresponding to the first degree of association as the first moving object.
13. The electronic device according to any one of claims 8-12, characterised in that, when the first moving object is determined from the M moving objects, the processing means is further configured to:
move a cursor to a position corresponding to the first moving object.
14. The electronic device according to claim 13, characterised in that, after the first moving object is determined from the M moving objects based on the eye movement information and the M pieces of object movement information, the electronic device further comprises:
a sensing device, configured to:
obtain second eye movement information of the eyes of the user and/or head movement information of the head of the user;
and the processing means is further configured to:
perform a corresponding operation on the first moving object according to the second eye movement information and/or the head movement information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610460487.0A CN106125926B (en) | 2016-06-22 | 2016-06-22 | A kind of information processing method and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106125926A true CN106125926A (en) | 2016-11-16 |
CN106125926B CN106125926B (en) | 2019-10-29 |
Family
ID=57269144
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610460487.0A Active CN106125926B (en) | 2016-06-22 | 2016-06-22 | A kind of information processing method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106125926B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110310006A1 (en) * | 2008-12-22 | 2011-12-22 | Timothy James Henry Edwards | Automatic Calibration Of A Gaze Direction Algorithm From User Behavior |
CN104750232A (en) * | 2013-12-28 | 2015-07-01 | 华为技术有限公司 | Eye tracking method and eye tracking device |
CN105247447A (en) * | 2013-02-14 | 2016-01-13 | 眼球控制技术有限公司 | Systems and methods of eye tracking calibration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||