CN101807110B - Pupil positioning method and system - Google Patents


Info

Publication number
CN101807110B
CN101807110B · CN2009100067693A · CN200910006769A
Authority
CN
China
Prior art keywords
angle
screen
value
distance
zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2009100067693A
Other languages
Chinese (zh)
Other versions
CN101807110A (en)
Inventor
赵新民 (Zhao Xinmin)
Current Assignee
Utechzone Co Ltd
Original Assignee
Utechzone Co Ltd
Priority date
Filing date
Publication date
Application filed by Utechzone Co Ltd filed Critical Utechzone Co Ltd
Priority to CN2009100067693A priority Critical patent/CN101807110B/en
Publication of CN101807110A publication Critical patent/CN101807110A/en
Application granted granted Critical
Publication of CN101807110B publication Critical patent/CN101807110B/en


Landscapes

  • Eye Examination Apparatus (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a pupil positioning method and system. The method comprises the following steps: capturing an image of a user's eye and locating a pupil center and a glint (light-reflecting spot) in it; defining a distance value as the distance from the pupil center to the glint, and defining a datum line through the pupil center so that the vector from the pupil center to the glint defines an angle value; building an angle-distance distribution map from the distance values and angle values obtained while gazing at different positions; and using that map to convert the values obtained while gazing at any point of the screen into the corresponding screen position. The method and system control the positioning of a computer pointer accurately and effectively without requiring the user to wear any extra equipment and without complex computation.

Description

Pupil positioning method and system
Technical field
The present invention relates to a pupil positioning method and system, particularly to a pupil positioning method and system by which a computer can be controlled with the eyes.
Background technology
Eye-tracking technologies are currently divided, according to whether they contact the human eye, into contact and non-contact types. Contact eye-tracking techniques include the search-coil method and the electro-oculogram method; non-contact techniques are mainly vision-based and can be further divided into head-mounted (head-mount) and free-head types.
In contact eye tracking, the search-coil method has the user wear a soft contact lens containing an induction coil. When the user rotates the eyeball, the lens rotates with it, and the coil generates an induced electromotive force due to the change in magnetic flux; the magnitude of this electromotive force represents the deflection angle of the eyeball. The drawbacks of this method are that it is easily affected by the condition of the user's eye, such as eye secretions, and that the double-layer soft lens impairs the user's vision. The electro-oculogram (EOG) method attaches multiple electrodes around the eyes and uses them to detect the voltage differences produced as the eyeball rotates, thereby judging its up-down and left-right angles. Its drawbacks are that the skin resistance at the attached electrodes varies with keratin secretion, making the acquired electrical signals unstable, and that only large eyeball movements can be recorded, not small angular changes.
In head-mounted eye tracking, the user must wear glasses fitted with a miniature camera. Because the relative distance between the eye and the camera must remain fixed, any shift of the face changes that distance and causes inaccurate judgments; the glasses must therefore be strapped to the head to fix the relative position of the miniature camera and the eyes, which is both inconvenient and uncomfortable for the user.
In free-head eye tracking, there are eye trackers abroad that combine a screen with two CCD cameras; in Taiwan, Lin Chen-sheng and others have published well-known studies using a screen and a single CCD camera. However, such free-head techniques involve relatively complex computation, and although a two-CCD-camera eye tracker can position the pointer accurately, it is very expensive and requires two CCD cameras.
The shortcoming of prior eye-control techniques, whether contact or non-contact, is that accurate positioning is required, and accurate positioning demands expensive software and hardware, preventing eye-control technology from becoming affordable to the general public.
Summary of the invention
To overcome the problem that existing eye-control techniques require expensive equipment to achieve high accuracy, the present invention provides a pupil positioning method that can position accurately with program-based judgment alone.
The technical solution adopted by the pupil positioning method of the present invention is a method realizable in program software. While a user gazes at a screen, the method captures an image of the user's eye and locates a pupil center and a glint in the eye image; defines a distance value as the distance from the pupil center to the glint, and defines a datum line through the pupil center so that the vector from the pupil center to the glint defines an angle value; builds an angle-distance distribution map from the distance values and angle values obtained while gazing at different positions; and then maps the distance value and angle value obtained while gazing at any position to a coordinate point in the angle-distance distribution map according to a predetermined scheme, thereby converting any gazed screen point into the corresponding screen position and locating the position on the screen the user is watching. The user only needs to install computer software implementing the method to achieve high positioning accuracy.
To overcome the need of existing eye-control techniques for expensive equipment, the present invention further provides a pupil positioning system that can position accurately with only simple devices.
The technical solution adopted by the pupil positioning system of the present invention comprises a light source emitting a light beam to illuminate a user's eye, a camera capturing an image of the user's eye, and a computing module; the computing module has a feature locating unit, a distance-angle processing unit, and a coordinate conversion unit.
The feature locating unit obtains the eye image and locates a pupil center and a glint. The distance-angle processing unit defines a distance value as the distance from the pupil center to the glint, and defines a datum line through the pupil center so that the vector from the pupil center to the glint defines an angle value. The coordinate conversion unit, using the angle-distance distribution map built from the distance values and angle values of different gaze positions, maps the distance value and angle value of any gaze position to a coordinate point according to a predetermined scheme, thereby converting any gazed screen point into the corresponding screen position.
Thus the user only needs the aforesaid light source, camera, and computing module, with no other expensive equipment, so hardware cost is reduced.
The overall beneficial effects of the present invention are that no extra gear needs to be worn on the body, which reduces cost, and that the angle-distance distribution map built from different gaze positions converts any gazed screen point into the corresponding screen position, so the positioning function can be controlled accurately and effectively without complex computation.
Description of drawings
Fig. 1 is a schematic diagram of the preferred embodiment of the pupil positioning system of the present invention;
Fig. 2 is a schematic diagram showing how the pupil center and the glint define a distance value and an angle value;
Fig. 3 is a flowchart of the preferred embodiment of the pupil positioning method of the present invention;
Fig. 4 is a flowchart of the training stage in the embodiment of the grouping correspondence method used by the pupil positioning system of the present invention;
Fig. 5 is a schematic diagram showing the screen divided into groups of regions, of which only one group is displayed at a time;
Fig. 6 is a schematic diagram illustrating the localization of the pupil center and the glint;
Fig. 7 is a schematic diagram illustrating the distance values and angle values and the distance-angle differences among the groups of screen regions;
Fig. 8 is a schematic diagram of the distance-angle distribution map for regions "1" to "16", in which a distance value Li and an angle value θi correspond to a coordinate point;
Fig. 9 is a flowchart of the application stage in the embodiment of the grouping correspondence method used by the pupil positioning system of the present invention;
Fig. 10 is a schematic diagram showing that the distance-angle point sets obtained for the corresponding regions form an irregular fan-shaped distribution;
Fig. 11 is a schematic diagram showing a fan shape drawn from a few distance-angle points, converted by an un-warping operation into a more regular rectangle, whose positions are then mapped to the coordinates of the screen;
Fig. 12 is a flowchart of the training stage in the embodiment of the interpolation correspondence method used by the pupil positioning system of the present invention; and
Fig. 13 is a flowchart of the application stage in the embodiment of the interpolation correspondence method used by the pupil positioning system of the present invention.
Embodiment
The present invention is described in detail below with reference to the drawings and an embodiment:
Before the detailed description, note that in the following description like components are denoted by the same reference numerals.
One, system architecture and method concept
Referring to Fig. 1, the preferred embodiment of the pupil positioning system 1 of the present invention comprises a light source 11, a camera 12, a computing module 13, and a screen 14; the computing module 13 comprises a training unit 130, a feature locating unit 131, a distance-angle processing unit 132, and a coordinate conversion unit 133.
In this preferred embodiment, the computing module 13 is a computer program recorded on a storage medium that causes a computer to execute the pupil positioning method of the present invention, which proceeds according to the following steps:
Referring to Figs. 1 to 3, the training unit 130 displays one target region on the screen 14 at a time for the user to gaze at; after steps 301 and 302 are executed, the next target region is displayed, until a predetermined number of target regions have all been displayed. The feature locating unit 131 obtains an eye image 401 of a user 4 and locates a pupil center 41 and a glint 42, i.e. a light-reflecting spot (step 301); the eye image 401 is captured by the camera 12, and the glint 42 is produced by the light beam emitted by the light source 11 reflecting off the eye.
Next, the distance-angle processing unit 132 defines a distance value L as the distance from the pupil center 41 to the glint 42, and defines a datum line (the horizontal axis) through the pupil center 41 so that the vector 43 from the pupil center 41 to the glint 42 defines an angle value θ (step 302).
Then the coordinate conversion unit 133 builds an angle-distance distribution map from the distance values L and angle values θ obtained at the different gaze positions (step 303). After steps 301 to 303 of the training stage are finished, the application stage begins: when the user gazes at any position on the screen 14, the distance value and angle value of that position are mapped to a coordinate point in the angle-distance distribution map according to a predetermined scheme (step 304), so the position of the screen 14 corresponding to any gazed point can be obtained.
Note that when the angle value θ defined by the present invention is negative, it is converted into an angle between 0 and 180 degrees according to the following formula:
if Angle < 0 then
    Angle = 180 + Angle;
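As a minimal Python sketch of how steps 301 and 302 might compute the distance value L and angle value θ (the function name is illustrative, and `atan2` is used instead of a plain arctangent for robustness, an implementation choice not taken from the patent):

```python
import math

def distance_angle(pupil, glint):
    """Distance value L and angle value theta between the pupil center and
    the glint, using the horizontal axis through the pupil center as the
    datum line; negative angles are folded into 0..180 degrees as in the
    patent's normalization formula."""
    x1, y1 = pupil
    x2, y2 = glint
    dist = math.hypot(x1 - x2, y1 - y2)                  # distance value L
    angle = math.degrees(math.atan2(y1 - y2, x1 - x2))   # angle value theta
    if angle < 0:
        angle = 180 + angle
    return dist, angle
```

For example, a glint at (1, 1) relative to a pupil center at the origin yields L = √2 and θ = 45°.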
Two preferred embodiments of converting the distance value L and angle value θ into the position of the pointer 101 on the screen 14 are introduced below: one predetermined scheme is the grouping correspondence method, and the other is the interpolation correspondence method.
Two, embodiment of the grouping correspondence method
The pupil positioning method of the present invention can be divided into a training stage (Fig. 4) and an application stage (Fig. 9); the flow of each stage is detailed as follows:
Referring to Fig. 4, the training-stage flow comprises the following steps:
With reference to Fig. 1, the training unit 130 divides the screen 14 into groups of regions in advance and displays only one group at a time for the user to gaze at; at each display it controls the feature locating unit 131 to perform localization and instructs the distance-angle processing unit 132 to obtain the distance-angle pair of that group of regions, and when it judges that all regions have been displayed, the distance-angle distribution map of the groups of regions is obtained.
As shown in Fig. 5, the screen 14 is divided into 4 × 4 = 16 groups of regions, only one of which is displayed at a time. When the training-stage flow begins, a count value I = 1 is set first (step 201), and one region I at a time is displayed for the user to gaze at (step 202).
Next, an eye image 401 of the user is captured (step 203), and a pupil center 41 and a glint 42 are located from the eye image 401 (step 204). As shown in Fig. 6, the pupil center 41 is located by processing the eye image 401 with adaptive thresholding (dynamic binarization) followed by a morphological closing operation, after which the position of the pupil center 41 can be computed.
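A rough pure-numpy sketch of this localization step; the fixed threshold factor and the 3×3 structuring element are assumptions standing in for the adaptive thresholding and morphology routines a real implementation would use:

```python
import numpy as np

def _neighborhood(mask, combine):
    """Combine each pixel with its 3x3 neighborhood (np.logical_or gives a
    dilation, np.logical_and an erosion)."""
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out = combine(out, np.roll(np.roll(mask, dy, axis=0), dx, axis=1))
    return out

def locate_pupil(gray):
    """Binarize the darkest pixels, apply a morphological closing
    (dilation followed by erosion), and return the blob centroid."""
    mask = gray < 0.5 * gray.mean()               # crude dynamic binarization
    mask = _neighborhood(mask, np.logical_or)     # dilate
    mask = _neighborhood(mask, np.logical_and)    # erode -> closing
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())     # pupil-center estimate
```

On a synthetic frame containing a single dark disk, the returned centroid falls at the disk's center to within a pixel.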
Then the pupil center and the glint are used to define a distance value L and an angle value θ (step 205); the count value is incremented to I + 1 (step 206); and it is judged whether I = 16 (step 207). If not, steps 202 to 207 are repeated; when I = 16 is judged, the distance-angle distribution maps of all groups of regions have been obtained (step 208).
How the distance value L and angle value θ are defined can be reviewed in the comments to Fig. 2 above; the distance-angle differences among the groups of regions of the screen 14 are explained as follows:
Referring to Fig. 7(a), the screen 14 first displays region "1", and a distance value L1 and an angle value θ1 are obtained; referring to Fig. 7(b), the screen 14 displays region "2", giving a distance value L2 and an angle value θ2; referring to Fig. 7(c), region "3" gives L3 and θ3; and referring to Fig. 7(d), region "4" gives L4 and θ4. Note that because the line of sight moves horizontally across this row, the distance values L1, L2, L3, and L4 do not differ much, whereas the angle values θ1, θ2, θ3, and θ4 vary considerably.
When the line of sight moves down from "1", "2", "3", and "4" to the next row of digits "5", "6", "7", and "8", the pupil moves down a short distance, so the distance values obtained when gazing at "5", "6", "7", and "8" become smaller, while the corresponding angle values do not differ much from θ1, θ2, θ3, and θ4.
Referring to Fig. 8, following this principle, and because the eyeball is a sphere rather than a plane, the distance-angle distribution map of regions "1" to "16" is obtained.
Referring to Fig. 9 together with Fig. 6, the application-stage flow of the grouping correspondence method comprises the following steps:
An eye image 401 of the user is captured (step 211), and a pupil center 41 and a glint 42 are located from the eye image 401 (step 212); the pupil center and the glint are used to define a distance value Li and an angle value θi (step 213). For these steps, please refer to the explanation of Fig. 2, not repeated here.
Referring again to Fig. 8, the distance value Li and angle value θi correspond to a coordinate point 5 in the angle-distance distribution map; a minimum objective function is used to find the region nearest to the coordinate point 5 (step 214), completing the positioning function of the screen cursor (step 215).
The minimum objective function is:
Min Obj. = W1·|Dist_t - Dist_i| + W2·|Angle_t - Angle_i|, (i = 1~16)
/* Dist_i and Angle_i are the distance value and angle value of target region i; Dist_t and Angle_t are the present distance value and angle value; W1 and W2 are assigned weights used to balance the contribution of the two terms. */
Dist = sqrt[(x1 - x2)² + (y1 - y2)²]; /* distance */ Formula 1
Angle = tan⁻¹[(y1 - y2) / (x1 - x2)]; /* angle */ Formula 2
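The lookup of step 214 can be sketched directly from the objective; the template list and equal weights below are illustrative:

```python
def nearest_region(dist_t, angle_t, templates, w1=1.0, w2=1.0):
    """Return the number (1-based) of the trained region whose stored
    (distance, angle) pair minimizes
    W1*|Dist_t - Dist_i| + W2*|Angle_t - Angle_i|."""
    best, best_obj = None, float("inf")
    for i, (dist_i, angle_i) in enumerate(templates, start=1):
        obj = w1 * abs(dist_t - dist_i) + w2 * abs(angle_t - angle_i)
        if obj < best_obj:
            best, best_obj = i, obj
    return best
```

With templates for four regions, e.g. [(10, 30), (10, 60), (8, 30), (8, 60)], a measurement of (9.8, 32) selects region 1.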
As this preferred embodiment shows, the method uses the pupil-center-to-glint distances and angles obtained for the groups of regions to establish a one-to-one correspondence with the spaces of the groups of screen regions, thereby achieving the positioning function. However, the grouping correspondence method requires the user to look at more groups of regions for its positioning to improve, a shortcoming remedied by the interpolation correspondence method introduced below.
Three, embodiment of the interpolation correspondence method
Referring to Fig. 10, if the screen is divided into 8 × 8 = 64 groups of regions, the distance-angle point sets obtained for the corresponding regions form an irregular fan-shaped distribution; as a result, using interpolation to obtain the corresponding screen coordinates runs into difficulty.
Referring to Fig. 11, a fan shape 51 is drawn from a few distance-angle points and converted by an un-warping operation into a more regular rectangle 52, and the positions of the rectangle 52 are then mapped to the coordinates of the screen 14, giving a close and good correspondence.
The pupil positioning method of this embodiment is likewise divided into a training stage and an application stage; the flow of each stage is detailed as follows:
Referring to Fig. 12, the training-stage flow is similar to that of Fig. 4 and comprises the following steps:
First, a count value I = 1 is set (step 601); then one region I at a time is displayed for the user to gaze at (step 602). Next, an eye image 401 of the user is captured (step 603), and a pupil center 41 and a glint 42 are located from the eye image 401 (step 604). Then the pupil center and the glint are used to define a distance value L and an angle value θ (step 605); the count value is incremented to I + 1 (step 606); and it is judged whether I = 16 (step 607). If not, steps 602 to 607 are repeated; when I = 16 is judged, the distance-angle distribution maps of all groups of regions have been obtained (step 608).
The difference is that this method further passes the aforesaid distance-angle distribution map through an un-warping operation to obtain a regular distribution map (step 609), and then uses an affine transform technique to correct the regular distribution map (step 610).
The regularized distance-angle distribution map of step 609 is obtained from the following formulas:
u = a0 + a1·x + a2·y + a3·x² + a4·x·y + a5·y²
v = b0 + b1·x + b2·y + b3·x² + b4·x·y + b5·y²; Formula 3
[u, v]ᵀ = [[a0, a1, a2, a3, a4, a5], [b0, b1, b2, b3, b4, b5]] · [1, x, y, x², x·y, y²]ᵀ; Formula 4
The un-warping operation turns the sampling points before un-warping, (X, Y) = {(x1, y1), ..., (xn, yn)}, into the target points (XT, YT) = {(xT1, yT1), ..., (xTn, yTn)}; for example, at least six sample points before un-warping, (x1, y1), (x2, y2), (x3, y3), (x4, y4), (x5, y5), (x6, y6), are input and mapped to six corresponding target points (u1, v1), (u2, v2), (u3, v3), (u4, v4), (u5, v5), (u6, v6). Here x and y represent the distance value Li and angle value θi, and n is the number of samples.
To obtain the optimal parameters of the second-order functions, the un-warping operation of step 609 solves, by the operation of matrix inversion, for the optimal solution of the 12 coefficients a0~a5 and b0~b5. Once these 12 coefficients are found, substituting any present unknown point (X, Y) into Formula 4 yields the value of its new coordinate point (u, v).
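Assuming the coefficients are solved in the least-squares sense (equivalent to the matrix inversion above when exactly six point pairs are given), the fit of Formulas 3 and 4 might look as follows; the sample points in the usage note are made up:

```python
import numpy as np

def fit_unwarp(src, dst):
    """Fit u = a0 + a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2 (and likewise v
    with b0..b5) from six or more (x, y) -> (u, v) correspondences.
    Returns a 2x6 array: row 0 holds a0..a5, row 1 holds b0..b5."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    x, y = src[:, 0], src[:, 1]
    M = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coef, *_ = np.linalg.lstsq(M, dst, rcond=None)
    return coef.T

def unwarp(coef, x, y):
    """Apply Formula 4 to one point (x, y), returning (u, v)."""
    return coef @ np.array([1.0, x, y, x * x, x * y, y * y])
```

Fitting eight exact samples of a known second-order mapping recovers the mapping, so `unwarp` reproduces it at unseen points.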
Step 610 uses an affine transform technique to apply a moving calibration to the pupil position; the moving calibration removes the effects of image scaling, image translation, and image rotation from the coordinates, yielding normalized coordinates.
[x', y']ᵀ = [[a, b], [c, d]] · [x, y]ᵀ + [e, f]ᵀ; Formula 5
where (x', y') is the new coordinate and a, b, c, d, e, f are the coefficients of the affine transform. The affine transform is computed by inputting the three coordinate point pairs of any three corners of the screen before correction, e.g. (x1, y1), (x2, y2), (x3, y3), and the three coordinate point pairs of the same three corners after correction, (x'1, y'1), (x'2, y'2), (x'3, y'3); the six affine coefficients a~f can then be solved, and substituting any present unknown point (X, Y) into Formula 5 yields the value of its new coordinate point (x', y').
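The six affine coefficients follow from the three corner correspondences by solving a small linear system; a numpy sketch with illustrative corner coordinates:

```python
import numpy as np

def fit_affine(src, dst):
    """Solve Formula 5, [x'; y'] = [[a, b], [c, d]] [x; y] + [e; f], from
    exactly three non-collinear point pairs; returns (a, b, c, d, e, f)."""
    (x1, y1), (x2, y2), (x3, y3) = src
    M = np.array([[x1, y1, 1.0], [x2, y2, 1.0], [x3, y3, 1.0]])
    sol = np.linalg.solve(M, np.asarray(dst, float))
    a, c = sol[0]
    b, d = sol[1]
    e, f = sol[2]
    return a, b, c, d, e, f

def apply_affine(coef, x, y):
    a, b, c, d, e, f = coef
    return a * x + b * y + e, c * x + d * y + f
```

For a pure scale-by-2 plus translation of (5, -3), three corners suffice to recover the coefficients exactly, and `apply_affine` then maps any further point.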
Referring to Fig. 13, the application-stage flow of the interpolation correspondence method comprises the following steps:
An eye image 401 of the user is captured (step 611), and a pupil center 41 and a glint 42 are located from the eye image 401 (step 612); the pupil center and the glint are used to define a distance value Li and an angle value θi (step 613). For these steps, please refer to the explanation of Fig. 2, not repeated here.
Then the distance value Li and angle value θi are mapped into the corrected regular distribution map to obtain a coordinate point (i.e. X and Y), and the coordinate point is converted to the corresponding screen position by interpolation (step 614), completing the screen positioning function (step 615).
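In the simplest case the interpolation of step 614 is a linear rescaling from the corrected rectangle to screen pixels; the rectangle bounds and screen resolution below are assumptions:

```python
def to_screen(u, v, rect, screen_w, screen_h):
    """Linearly interpolate a point (u, v) inside the regularized rectangle
    rect = (u_min, u_max, v_min, v_max) to screen pixel coordinates."""
    u_min, u_max, v_min, v_max = rect
    sx = (u - u_min) / (u_max - u_min) * (screen_w - 1)
    sy = (v - v_min) / (v_max - v_min) * (screen_h - 1)
    return sx, sy
```

The center of a unit rectangle, for example, maps to the center of a 1920×1080 screen.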
Four, conclusion
From the above description it can be seen that the pupil positioning method and system of the present invention adopt a non-contact technique and have the following advantages:
1. No extra gear needs to be worn on the body, so the user need not spend a large sum of money.
2. The grouping correspondence method uses the pupil-center-to-glint distances and angles obtained for the groups of regions to establish a one-to-one correspondence with the spaces of the groups of screen regions, so the positioning function can be controlled accurately and effectively without complex computation.
3. The interpolation correspondence method draws a fan shape from a few distance-angle points, converts it by an un-warping operation into a more regular rectangle, and maps the rectangle to the screen, giving a close and good correspondence; moreover, the user needs to gaze at only a few positions in the training stage to achieve good positioning, so it is both accurate and less laborious.

Claims (12)

1. A pupil positioning method for locating, while a user gazes at a screen, the position on the screen the user is watching, characterized in that the method comprises the following steps:
(1) emitting a light beam to illuminate a user's eye and produce a glint;
(2) capturing an image of the user's eye, and locating a pupil center from the eye image;
(3) defining a distance value as the distance from the pupil center to the glint, and defining a datum line through the pupil center so that the vector from the pupil center to the glint defines an angle value;
(4) building an angle-distance distribution map from the distance values and angle values obtained while gazing at different positions; and
(5) mapping the distance value and angle value obtained while gazing at any position to a coordinate point in the angle-distance distribution map according to a predetermined scheme, thereby obtaining the screen position corresponding to any gazed screen point.
2. The pupil positioning method as claimed in claim 1, characterized in that step (4) divides the screen into groups of regions in advance and displays only one group at a time, and comprises the steps of:
(a1) executing steps (2) and (3) each time one group of regions is displayed for the user to gaze at; and
(a2) obtaining, once all regions are judged to have been displayed, the distance-angle distribution map corresponding to the groups of regions.
3. The pupil positioning method as claimed in claim 2, characterized in that the predetermined scheme of step (5) is a grouping correspondence method comprising the steps of:
(a3) executing steps (2) and (3) when the user gazes at any position on the screen, to obtain a distance value and an angle value of the position;
(a4) obtaining a coordinate point in the angle-distance distribution map from the distance value and angle value; and
(a5) using a minimum objective function to find the region nearest to the coordinate point, whereby said region is converted to the corresponding screen position.
4. The pupil positioning method as claimed in claim 3, characterized in that the minimum objective function is:
Min Obj = W1·|Dist_t - Dist_i| + W2·|Angle_t - Angle_i|; where Dist_i and Angle_i are the target distance value and angle value, Dist_t and Angle_t are the present distance value and angle value, and W1 and W2 are assigned weights.
5. The pupil positioning method as claimed in claim 1, characterized in that step (4) divides the screen into groups of regions in advance and displays only one group at a time, and comprises the steps of:
(b1) executing steps (2) and (3) each time one group of regions is displayed for the user to gaze at;
(b2) obtaining, once all regions are judged to have been displayed, a distance-angle distribution map covering the groups of regions;
(b3) passing the distance-angle distribution map through an un-warping operation to obtain a regular distribution map; and
(b4) correcting the regular distribution map with an affine transform technique.
6. The pupil positioning method as claimed in claim 5, characterized in that the predetermined scheme of step (5) is an interpolation correspondence method comprising the steps of:
(b5) executing steps (2) and (3) when the user gazes at any position on the screen, to obtain a distance value and an angle value of the position;
(b6) obtaining a coordinate point in the corrected regular distribution map from the distance value and angle value; and
(b7) converting the coordinate point to the corresponding screen position by interpolation.
7. A pupil positioning system for locating, while a user gazes at a screen, the position on the screen the user is watching, characterized in that the pupil positioning system comprises:
a light source emitting a light beam to illuminate a user's eye and produce a glint;
a camera capturing an image of the user's eye; and
a computing module having:
a feature locating unit obtaining the eye image and locating a pupil center,
a distance-angle processing unit defining a distance value as the distance from the pupil center to the glint, and defining a datum line through the pupil center so that the vector from the pupil center to the glint defines an angle value, and
a coordinate conversion unit that, using an angle-distance distribution map built from the distance values and angle values of different gaze positions, maps the distance value and angle value of any gaze position to a coordinate point according to a predetermined scheme, thereby obtaining the screen position corresponding to any gazed screen point.
8. The pupil positioning system as claimed in claim 7, characterized in that the computing module further has a training unit for dividing the screen into groups of regions in advance and displaying only one group at a time; each time one group of regions is displayed for the user to gaze at, the training unit controls the feature locating unit to perform localization and instructs the distance-angle processing unit to obtain the distance-angle pair of each group of regions.
9. The pupil positioning system as claimed in claim 8, characterized in that the predetermined scheme executed by the coordinate conversion unit is a grouping correspondence method comprising the steps of:
controlling, when the user gazes at any position on the screen, the feature locating unit to perform localization and instructing the distance-angle processing unit to obtain a distance value and an angle value;
obtaining a coordinate point in the angle-distance distribution map from the distance value and angle value; and
using a minimum objective function to find the region nearest to the coordinate point, whereby said region is converted to the corresponding screen position.
10. The pupil positioning system as claimed in claim 9, characterized in that: the formula of said minimum objective function is as follows:
Min Obj = W1·|Dist_t - Dist_i| + W2·|Angle_t - Angle_i|, where Dist_i and Angle_i are the target distance value and angle value, Dist_t and Angle_t are the current distance value and angle value, and W1 and W2 are assigned weights.
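The minimum objective function of claim 10 amounts to a weighted nearest-neighbor search over the trained regions. A minimal sketch, assuming each region is keyed by an identifier and the weights default to 1.0 (the patent leaves them assignable):

```python
def nearest_region(dist_t, angle_t, regions, w1=1.0, w2=1.0):
    """Return the region id whose trained (Dist_i, Angle_i) pair
    minimizes W1*|Dist_t - Dist_i| + W2*|Angle_t - Angle_i|.
    `regions` maps a region id to its trained (distance, angle) pair."""
    def objective(item):
        _, (dist_i, angle_i) = item
        return w1 * abs(dist_t - dist_i) + w2 * abs(angle_t - angle_i)
    region_id, _ = min(regions.items(), key=objective)
    return region_id
```

The winning region is then converted to its known screen position, which is how the group-correspondence method of claim 9 resolves a gaze point.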
11. The pupil positioning system as claimed in claim 7, characterized in that: said computing module further has a training unit for dividing the screen in advance into a plurality of groups of regions, only one group of regions being displayed at a time; the training unit further performs the steps of: each time a group of regions is displayed for the user to view, controlling the feature-positioning unit to perform positioning and instructing the distance-angle processing unit to obtain the distance and angle for each group of regions;
when it is determined that all regions have been displayed, obtaining a distance-angle distribution map containing each group of regions, and performing a rectification operation on the distance-angle distribution map to obtain a normalized distribution map; and
correcting the normalized distribution map by means of an affine transformation technique.
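One plausible reading of the affine correction in claim 11 is a least-squares fit mapping the measured distance-angle coordinates of the calibration regions onto their ideal, regularly spaced grid positions. The sketch below is an assumption about the method, not the patent's disclosed procedure, and the point lists in the example are hypothetical calibration data.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine transform taking measured calibration
    coordinates (src_pts) to their ideal grid positions (dst_pts).
    Returns the 3x2 affine matrix M such that [x, y, 1] @ M ~= (x', y')."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # Homogeneous design matrix: one row [x, y, 1] per calibration point.
    A = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

def apply_affine(M, pt):
    """Apply the fitted affine transform to a single (x, y) point."""
    x, y = pt
    return tuple(np.array([x, y, 1.0]) @ M)
```

With four or more corresponding points, the fit absorbs the scaling, rotation, and shear distortions that an affine model can represent, regularizing the distribution map for the interpolation step of claim 12.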
12. The pupil positioning system as claimed in claim 11, characterized in that: the predetermined method of said coordinate-transformation unit is an interpolation-correspondence method, which comprises the steps of:
when the user views an arbitrary position on the screen, controlling the feature-positioning unit to perform positioning and instructing the distance-angle processing unit to obtain a distance value and an angle value;
mapping the distance value and angle value to a coordinate point in the corrected normalized distribution map; and
converting the coordinate point into the corresponding position on the screen by interpolation.
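The interpolation step of claim 12 can be illustrated with bilinear interpolation inside one cell of the corrected distribution map. The cell layout below (an axis-aligned rectangle in distance-angle space with four trained corner samples) is an assumption for illustration; the patent does not specify the interpolation scheme beyond "interpolation method".

```python
def interpolate_screen_pos(pt, cell):
    """Bilinear interpolation from a (distance, angle) coordinate to a
    screen position, given the four trained corners of the enclosing
    cell. Each corner pairs a corner of the cell in distance-angle
    space with its known screen (x, y)."""
    (d0, a0), (d1, a1) = cell["min_corner_da"], cell["max_corner_da"]
    u = (pt[0] - d0) / (d1 - d0)   # fraction along the distance axis
    v = (pt[1] - a0) / (a1 - a0)   # fraction along the angle axis
    (x00, y00), (x10, y10) = cell["screen00"], cell["screen10"]
    (x01, y01), (x11, y11) = cell["screen01"], cell["screen11"]
    x = (1-u)*(1-v)*x00 + u*(1-v)*x10 + (1-u)*v*x01 + u*v*x11
    y = (1-u)*(1-v)*y00 + u*(1-v)*y10 + (1-u)*v*y01 + u*v*y11
    return x, y
```

A point halfway along both axes of the cell lands halfway between the four corner screen positions, which is the continuous refinement the interpolation-correspondence method adds over the per-region lookup of claim 9.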
CN2009100067693A 2009-02-17 2009-02-17 Pupil positioning method and system Active CN101807110B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100067693A CN101807110B (en) 2009-02-17 2009-02-17 Pupil positioning method and system


Publications (2)

Publication Number Publication Date
CN101807110A CN101807110A (en) 2010-08-18
CN101807110B true CN101807110B (en) 2012-07-04

Family

ID=42608924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100067693A Active CN101807110B (en) 2009-02-17 2009-02-17 Pupil positioning method and system

Country Status (1)

Country Link
CN (1) CN101807110B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8885877B2 (en) 2011-05-20 2014-11-11 Eyefluence, Inc. Systems and methods for identifying gaze tracking scene reference locations

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10039445B1 (en) 2004-04-01 2018-08-07 Google Llc Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US8929589B2 (en) 2011-11-07 2015-01-06 Eyefluence, Inc. Systems and methods for high-resolution gaze tracking
CN103809737A (en) 2012-11-13 2014-05-21 华为技术有限公司 Method and device for human-computer interaction
TWI473934B (en) * 2012-11-30 2015-02-21 Utechzone Co Ltd A method for inputting a password and a safe for using the eye movement password input method
CN103784112B (en) * 2013-10-10 2017-04-12 杨松 Eye movement sensing method, flexible contact, external sensing coil and system
TW201518979A (en) * 2013-11-15 2015-05-16 Utechzone Co Ltd Handheld eye-controlled ocular device, password input device and method, computer-readable recording medium and computer program product
CN104915013B (en) * 2015-07-03 2018-05-11 山东管理学院 A kind of eye tracking calibrating method based on usage history
TWI533234B (en) * 2015-07-24 2016-05-11 由田新技股份有限公司 Control method based on eye's motion and apparatus using the same
TWI617948B (en) * 2015-07-24 2018-03-11 由田新技股份有限公司 Module, method and computer readable medium for eye-tracking correction
CN108519676B (en) * 2018-04-09 2020-04-28 杭州瑞杰珑科技有限公司 Head-wearing type vision-aiding device
CN109343700B (en) * 2018-08-31 2020-10-27 深圳市沃特沃德股份有限公司 Eye movement control calibration data acquisition method and device
CN113095182A (en) * 2021-03-31 2021-07-09 广东奥珀智慧家居股份有限公司 Iris feature extraction method and system for human eye image

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006507054A (en) * 2002-11-21 2006-03-02 トビイ・テクノロジー・エイビイ Method and apparatus for detecting and tracking the eye and its gaze direction
CN1889016A (en) * 2006-07-25 2007-01-03 周辰 Eye-to-computer cursor automatic positioning controlling method and system
CN101026777A (en) * 2006-02-21 2007-08-29 台湾薄膜电晶体液晶显示器产业协会 Display device dynamic image colour excursion detecting system and detecting method
CN101344919A (en) * 2008-08-05 2009-01-14 华南理工大学 Sight tracing method and disabled assisting system using the same


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP特表2006507054A 2006.03.02


Also Published As

Publication number Publication date
CN101807110A (en) 2010-08-18

Similar Documents

Publication Publication Date Title
CN101807110B (en) Pupil positioning method and system
TWI432172B (en) Pupil location method, pupil positioning system and storage media
Plopski et al. Corneal-imaging calibration for optical see-through head-mounted displays
Itoh et al. Interaction-free calibration for optical see-through head-mounted displays based on 3d eye localization
CN102830797B (en) A kind of man-machine interaction method based on sight line judgement and system
Morimoto et al. Eye gaze tracking techniques for interactive applications
US9519640B2 (en) Intelligent translations in personal see through display
CN101872237B (en) Method and system for pupil tracing as well as correction method and module for pupil tracing
CN102547123B (en) Self-adapting sightline tracking system and method based on face recognition technology
CN104113680B (en) Gaze tracking system and method
Capurro et al. Dynamic vergence using log-polar images
CN101814129A (en) Automatically focused remote iris image acquisition device, method and recognition system
EP3490434A1 (en) Sensor fusion systems and methods for eye-tracking applications
CN101587542A (en) Field depth blending strengthening display method and system based on eye movement tracking
CN101308400A (en) Novel human-machine interaction device based on eye-motion and head motion detection
US20150130714A1 (en) Video analysis device, video analysis method, and point-of-gaze display system
CN103279188A (en) Method for operating and controlling PPT in non-contact mode based on Kinect
WO2018109570A1 (en) Smart contact lens and multimedia system including the smart contact lens
CN112232310B (en) Face recognition system and method for expression capture
EP3438882B1 (en) Eye gesture tracking
US20190073041A1 (en) Gesture Control Method for Wearable System and Wearable System
JP2022523306A (en) Eye tracking devices and methods
TW200947262A (en) Non-contact type cursor control method using human eye, pupil tracking system and storage media
CN108369744A (en) It is detected by the 3D blinkpunkts of binocular homography
CN110887638A (en) Device and method for drawing image plane of optical system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant