CN102043952A - Eye-gaze tracking method based on double light sources


Info

Publication number
CN102043952A
CN102043952A, CN 201010618752, CN201010618752A, CN102043952B
Authority
CN
China
Prior art keywords
gaze point
image
screen
pupil
error
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010618752
Other languages
Chinese (zh)
Other versions
CN102043952B (en)
Inventor
孙建德
杨彩霞
刘琚
张�杰
杨晓晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201010618752A priority Critical patent/CN102043952B/en
Publication of CN102043952A publication Critical patent/CN102043952A/en
Application granted granted Critical
Publication of CN102043952B publication Critical patent/CN102043952B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses an eye-gaze tracking method based on dual light sources, comprising the following steps: (1) image preprocessing: based on the gray-level difference between the pupil and the reflection points, preprocess the acquired face image, extract the two reflection points and the pupil region from it, and compute the coordinates of the two reflection points and the pupil center in the image coordinate system; (2) fixation-point estimation: determine the approximate position of the fixation point on the screen by treating the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the fixation point and the two infrared light sources on the screen as an approximate pair of similar triangles; and (3) fixation-point correction: obtain the final position of the fixation point. The advantages of the method are that the distance between the user and the screen need not be measured, the computational complexity is low, robustness to head movement is strong, the experiment process is natural and comfortable, and the estimation error lies within the range permissible for eye-gaze tracking applications; the method is also easy to implement.

Description

An eye-gaze tracking method based on dual light sources
Technical field
The present invention relates to an eye-gaze tracking method based on dual light sources, belonging to the technical field of video and multimedia signal processing.
Background technology
With the rapid development of computer science and industry, the number of home computer users has grown sharply, and intelligent human-computer interaction is attracting increasing attention. As an important channel through which humans obtain external information, gaze can also exert great potential in intelligent human-computer interaction and become an effective tool for it; for example, we can express our subjective intent through gaze and thereby directly control things in the outside world.
T. Hutchinson et al. first proposed the idea of interacting with a computer through gaze in 1989. Its attractive application prospects have drawn numerous researchers at home and abroad to study eye tracking from aspects such as image preprocessing, mapping algorithms and calibration. Today eye-tracking technology is increasingly mature and its applications continue to gain acceptance; from assistance for the disabled, appearance-design evaluation and game development to 3D free-viewpoint video, it brings convenience to our lives and effectively promotes the development of intelligent science and technology.
Eye-tracking methods include electrooculography (EOG), the iris-sclera boundary method, corneal reflection and the pupil-corneal reflection vector, among others; of these, the pupil-corneal reflection vector method is favored for its higher accuracy and user comfort. In 2001, D. H. Yoo et al. proposed a gaze tracking system using four light sources, based on the cross-ratio invariance algorithm, that can estimate gaze fairly accurately. However, constrained by hardware integration, execution speed and application prospects, researchers increasingly prefer to realize gaze estimation with fewer light sources. At present, at least two infrared light sources are required to achieve fairly accurate gaze estimation under free head movement, yet existing dual-light-source gaze tracking methods are computationally complex, and even when head movement is allowed its permissible range is severely restricted. Finding a simple and practical gaze-tracking mapping method therefore has important theoretical significance and great practical value.
Summary of the invention
Aiming at the problems that existing dual-light-source gaze tracking methods are computationally complex and severely restrict the range of head movement, the present invention provides a dual-light-source gaze tracking method with low computational complexity, little restriction on head movement and strong practicality.
According to the gaze characteristics of the human eye and the imaging principle of the camera, as shown in Fig. 4, the lines connecting the two light sources L_1 and L_2 on the screen with their corresponding corneal reflection points V_01 and V_02, and the line connecting the fixation point Q on the screen with the pupil center P_0, intersect at the corneal curvature center C of the eye. Furthermore, by the principle of spherical refraction, the reflection points V_01 and V_02 of the two infrared light sources on the cornea and the pupil center P_0 lie in the same radial directions as their images V_1, V_2 and P after corneal refraction; therefore using V_1, V_2 and P in place of V_01, V_02 and P_0 does not affect the fixation-point estimation result.
In establishing the mapping, because the human eyeball is approximately a sphere, the Z-axis coordinates of the reflection points V_1 and V_2 formed by the two infrared light sources and of the corneal refraction point of the pupil center differ only slightly, although they are not identical. Analysis shows that the Z-axis coordinates of these three points can be regarded as approximately equal; the error thereby introduced into the fixation-point estimate is at most 0.9887 cm on the X-axis and at most 0.7852 cm on the Y-axis, and the error exhibits a regional distribution. A second calibration can therefore be applied to reduce the error produced by this approximation, so that the error of the finally estimated fixation point lies within the range acceptable in practical applications.
The dual-light-source gaze tracking method of the present invention comprises the following concrete steps:
(1) image preprocessing: using the gray-level difference between the pupil and the reflection points, preprocess the captured face image, extract the two reflection points and the pupil region from the eye image, and compute the coordinates of the two reflection points and the pupil center in the image coordinate system, where the image coordinate system takes the upper-left corner of the image as the origin, the horizontal direction as the X-axis and the vertical direction as the Y-axis;
(2) fixation-point estimation: treat the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the fixation point and the two infrared light sources on the screen as an approximate pair of similar triangles, thereby determining the approximate position of the fixation point on the screen;
(3) fixation-point correction: use a two-stage calibration to correct the inherent deviation between the visual axis and the optical axis of the human eye and the estimation error introduced by the approximation, obtaining the final fixation-point position.
The concrete implementation of step (2), fixation-point estimation, is:
A. according to the gaze characteristics of the human eye and the imaging principle of the camera, derive the approximate similarity between the triangle formed by the pupil center and the two reflection points and the triangle formed by the fixation point and the two infrared light sources on the screen;
B. estimate the approximate position of the fixation point on the screen from the triangle similarity.
The concrete implementation of step (3), fixation-point correction, is:
A. region division: divide the whole screen into five regions according to the number and distribution of the calibration points;
B. one-point calibration: perform a single-point calibration using the calibration point at the center of the screen;
C. region localization: determine the region containing the fixation point from its position after the one-point calibration and its positional relation to the calibration points;
D. second calibration: according to the region containing the fixation point, apply a second calibration to the once-calibrated fixation point using the calibration point within that region, obtaining the final fixation-point position.
The present invention treats the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the fixation point and the two infrared light sources on the screen as an approximate pair of similar triangles to determine the approximate position of the fixation point on the screen, and then uses a two-stage calibration to correct the inherent deviation between the visual axis and the optical axis of the human eye and the estimation error caused by the algorithm's approximations, obtaining a fairly accurate fixation-point estimate. The method does not require measuring the distance between the head and the screen, has low computational complexity, is strongly robust to head movement, and is natural and comfortable for the user; it is easy to implement and convenient to integrate into practical products.
Description of drawings
Fig. 1 is the hardware diagram of a system implementing the method of the invention.
Fig. 2 is the flow chart of the method of the invention.
Fig. 3 is a schematic diagram of the image preprocessing process of the method.
Fig. 4 is a schematic diagram of camera imaging and the human eye's gaze principle.
Fig. 5 is a schematic diagram of the approximate coordinate substitution used in the method.
Fig. 6 is a schematic diagram of the two approximately similar triangles.
Fig. 7 is a schematic diagram of the calibration-point distribution and screen-region division.
Fig. 8 is a schematic diagram of the fixation-point estimation results.
Fig. 9 is a schematic diagram of the fixation-point estimation accuracy analysis.
Embodiment
The hardware system implementing the dual-light-source gaze tracking method of the present invention is shown in Fig. 1; it comprises a grayscale camera, two infrared light sources and a personal computer. The personal computer uses a 2.60 GHz Pentium Dual Core processor and a 34 × 27 cm (width × height) screen; a grayscale camera with a resolution of 694 × 1040 is mounted below the screen and remains fixed throughout the experiment; a 1 W infrared light source is mounted at each of the lower-left and lower-right corners of the display (L_1 and L_2, respectively). The subject sits 60-70 cm from the screen, and the head may move within a range of 20 × 20 × 10 cm (width × height × depth).
Fig. 2 gives the flow chart of the dual-light-source gaze tracking method of the present invention; following this flow, the concrete implementation steps are as follows:
1. Determine the coordinates of the reflection-point centers and the pupil center in the image coordinate system through image preprocessing, as shown in Fig. 3. Using the gray-level difference between the pupil and the corneal reflection points, extract the pupil and corneal reflection-point regions and obtain the coordinates of their centers in the image coordinate system: p_i(p_ix, p_iy), v_1i(v_1ix, v_1iy), v_2i(v_2ix, v_2iy). The image coordinate system takes the upper-left corner of the image as the origin, the horizontal direction as the X-axis and the vertical direction as the Y-axis.
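Step 1 above reduces to dark/bright thresholding followed by centroid computation. A minimal NumPy sketch (the threshold values, and the median-split used to separate the two glints, are illustrative assumptions rather than the patent's exact procedure):

```python
import numpy as np

def extract_centers(eye_img, pupil_thresh=40, glint_thresh=220):
    """Locate the pupil center and the two reflection-point (glint)
    centers in a grayscale eye image using the gray-level difference
    principle: the pupil is the darkest region, the glints the
    brightest. Threshold values are illustrative assumptions."""
    # Pupil center: centroid of all sufficiently dark pixels.
    ys, xs = np.nonzero(eye_img < pupil_thresh)
    pupil = (xs.mean(), ys.mean())

    # Glints: bright pixels, split into left/right clusters at the median
    # x-coordinate (the two IR sources sit at the screen's lower corners).
    ys, xs = np.nonzero(eye_img > glint_thresh)
    mid = np.median(xs)
    v1 = (xs[xs <= mid].mean(), ys[xs <= mid].mean())
    v2 = (xs[xs > mid].mean(), ys[xs > mid].mean())
    return pupil, v1, v2
```

A production system would additionally reject blinks and use connected-component analysis rather than a global median split.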
2. Fixation-point estimation. The triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the fixation point and the two infrared light sources on the screen are treated as an approximate pair of similar triangles to determine the approximate position of the fixation point on the screen. The concrete implementation steps are:
(1) According to the gaze characteristics of the human eye and the camera imaging principle shown in Fig. 4, derive the approximate similarity between the triangle formed by the pupil center and the two reflection points and the triangle formed by the fixation point and the two infrared light sources on the screen. The concrete steps are as follows:
(a) According to the camera imaging principle, convert the coordinates of the corneal reflection points and the pupil center in the face image from the camera coordinate system to the world coordinate system, with the following conversion relations:
p_i(p_ix, p_iy) → p(p_x, p_y, p_z) = p(pixelpitch_c(p_ix - c_center) + O_x, pixelpitch_r(p_iy - r_center) + O_y, -λ + O_z)
v_1i(v_1ix, v_1iy) → v_1(v_1x, v_1y, v_1z) = v_1(pixelpitch_c(v_1ix - c_center) + O_x, pixelpitch_r(v_1iy - r_center) + O_y, -λ + O_z)
v_2i(v_2ix, v_2iy) → v_2(v_2x, v_2y, v_2z) = v_2(pixelpitch_c(v_2ix - c_center) + O_x, pixelpitch_r(v_2iy - r_center) + O_y, -λ + O_z)
Here the world coordinate system takes the lower-left corner of the television screen as the origin, the axis parallel to the horizontal direction of the screen as the x-axis, the axis parallel to the vertical direction of the screen as the y-axis, and the axis perpendicular to the screen and pointing toward the user as the z-axis. pixelpitch_c, pixelpitch_r, c_center and r_center are camera intrinsic parameters; O_x, O_y and O_z are the coordinates of the camera's optical node in the world coordinate system; λ is a constant related to the camera's focal length and object distance. From the conversion relations above it can be seen that the plane formed by the points p(p_x, p_y, p_z), v_1(v_1x, v_1y, v_1z) and v_2(v_2x, v_2y, v_2z) is parallel to the screen, and that Δpv_1v_2 is similar to Δp_iv_1iv_2i.
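The conversion in step (a) is a per-axis affine map, so it can be written directly from the relations above. In the sketch below, every numeric default (pixel pitch, optical center, nodal-point offsets O, λ) is a hypothetical placeholder standing in for calibrated camera intrinsics, not a value from the patent:

```python
def image_to_world(xi, yi, pitch_c=0.00465, pitch_r=0.00465,
                   c_center=347.0, r_center=520.0,
                   O=(17.0, -2.0, 5.0), lam=1.2):
    """Map an image point (xi, yi), in pixels, to world coordinates (cm)
    following the patent's relations:
        x = pixelpitch_c * (xi - c_center) + O_x
        y = pixelpitch_r * (yi - r_center) + O_y
        z = -lambda + O_z
    Every default value here is an illustrative placeholder."""
    return (pitch_c * (xi - c_center) + O[0],
            pitch_r * (yi - r_center) + O[1],
            -lam + O[2])
```

Note that the z-coordinate is the same constant for all three converted points, which is exactly why the triangle pv_1v_2 lies in a plane parallel to the screen.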
(b) According to the camera imaging principle, the pupil center P(P_x, P_y, P_z) lies on the line pO, and the corneal reflection points V_1(V_1x, V_1y, V_1z) and V_2(V_2x, V_2y, V_2z) of the infrared light sources lie on the lines v_1O and v_2O, respectively. Strictly speaking, P_z, V_1z and V_2z are not equal, but they differ very little, so we may take P_z = V_1z = V_2z; that is, the points P and V_2 are replaced by the intersection P″ of the line OP with the plane Z = V_1z and the intersection V_2″ of the line OV_2 with that plane, as shown in Fig. 6, and ΔP″V_1V_2″ is similar to Δpv_1v_2. If Q is the intersection with the screen of the line through the corneal curvature center C and P″, then Q is the approximate fixation-point estimate, ΔQL_1L_2 is similar to ΔP″V_1V_2″, and hence ΔQL_1L_2 is similar to Δpv_1v_2 and to Δp_iv_1iv_2i.
(2) By measurement, the coordinates of the two infrared light sources are L_1(0, 0) and L_2(34, 0), so L_1 and L_2 lie on the same horizontal line. Since ΔQL_1L_2 is similar to Δp_iv_1iv_2i, as shown in Fig. 5, the properties of similar triangles yield the approximate fixation-point coordinates Q(Q_x, Q_y) on the screen from the following equations.
Q_x/(L_2x - L_1x) = (v_1ix - p_ix)/(v_1ix - v_2ix),  Q_y/(L_2x - L_1x) = (v_1iy - p_iy)/(v_1ix - v_2ix)
⇒ Q_x = (v_1ix - p_ix)/(v_1ix - v_2ix) * (L_2x - L_1x),  Q_y = (v_1iy - p_iy)/(v_1ix - v_2ix) * (L_2x - L_1x)
where L_1x and L_2x are the X-axis coordinates of the infrared light sources L_1 and L_2, respectively.
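The pair of similarity equations collapses to two multiplications once the image-plane coordinates are known. A sketch of step 2 (the 34 cm baseline matches the embodiment's screen width; the sample coordinates in the usage are hypothetical):

```python
def estimate_gaze(p, v1, v2, L1x=0.0, L2x=34.0):
    """Approximate fixation point Q (cm, screen coordinates) from the
    image-plane pupil center p and glint centers v1, v2 (each an (x, y)
    pixel pair), using the similar-triangle relation:
        Qx = (v1x - px) / (v1x - v2x) * (L2x - L1x)
        Qy = (v1y - py) / (v1x - v2x) * (L2x - L1x)
    L1x, L2x are the X-axis world coordinates of the two IR sources."""
    scale = (L2x - L1x) / (v1[0] - v2[0])  # cm per pixel along the glint baseline
    return ((v1[0] - p[0]) * scale,
            (v1[1] - p[1]) * scale)
```

For example, with pupil center (50, 45) and glints at (60, 40) and (40, 40), the estimate lands at the horizontal midpoint of the 34 cm baseline.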
3. Fixation-point correction. Because of the inherent deviation between the visual axis and the optical axis of the eye, the fixation-point position computed in step 2 must be calibrated before it accurately reflects where the eye is actually looking. A two-stage calibration is adopted; besides correcting the inherent visual-axis/optical-axis deviation, it also reduces the error produced by the approximate substitutions in the algorithm. The concrete steps are as follows:
(1) Region division. According to the number and distribution of the calibration points in the system, divide the whole screen into five regions A_0, A_1, A_2, A_3 and A_4, as shown in Fig. 7.
(2) One-point calibration. Perform a single-point calibration using the calibration point at the center of the screen. The concrete method is as follows:
Let the true coordinates of calibration point a be (x_real_a, y_real_a) and the coordinates estimated by the algorithm be (x_gaze_a, y_gaze_a). Then the differences x_error_a and y_error_a between the estimated and true values of calibration point a on the X-axis and Y-axis can be expressed as:
x_error_a=x_gaze_a-x_real_a
y_error_a=y_gaze_a-y_real_a
Suppose the coordinates of the fixation points Q_i (i = 1, 2, 3, …) estimated by this dual-light-source algorithm and of the remaining four calibration points b, c, d and e are (x_gaze_Q_i, y_gaze_Q_i) (i = 1, 2, 3, …), (x_gaze_b, y_gaze_b), (x_gaze_c, y_gaze_c), (x_gaze_d, y_gaze_d) and (x_gaze_e, y_gaze_e). Then the coordinates after the one-point calibration with the center calibration point a, namely (x_gaze1_Q_i, y_gaze1_Q_i), (x_gaze1_b, y_gaze1_b), (x_gaze1_c, y_gaze1_c), (x_gaze1_d, y_gaze1_d) and (x_gaze1_e, y_gaze1_e), can be expressed as:
x_gaze1_Q_i = x_gaze_Q_i + w_ix*x_error_a,  y_gaze1_Q_i = y_gaze_Q_i + w_iy*y_error_a
x_gaze1_b = x_gaze_b + w_bx*x_error_a,  y_gaze1_b = y_gaze_b + w_by*y_error_a
x_gaze1_c = x_gaze_c + w_cx*x_error_a,  y_gaze1_c = y_gaze_c + w_cy*y_error_a
x_gaze1_d = x_gaze_d + w_dx*x_error_a,  y_gaze1_d = y_gaze_d + w_dy*y_error_a
x_gaze1_e = x_gaze_e + w_ex*x_error_a,  y_gaze1_e = y_gaze_e + w_ey*y_error_a
where the weight coefficients w_ix, w_bx, w_cx, w_dx, w_ex, w_iy, w_by, w_cy, w_dy and w_ey are related to the distance on the X-axis or Y-axis between the corresponding point and the center calibration point.
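The one-point calibration is a weighted shift by the error measured at the center point a. A sketch that follows the formulas above literally; the unit default weights are an assumption, since the patent only states that the weights depend on each point's axis-wise distance from the center point:

```python
def one_point_calibrate(gaze, est_a, real_a, w=(1.0, 1.0)):
    """First-stage calibration: shift an estimated point by the error
    observed at center calibration point a, scaled by weights (w_x, w_y):
        x_error_a = x_gaze_a - x_real_a
        x_gaze1   = x_gaze + w_x * x_error_a   (likewise for y)
    The unit default weights are illustrative assumptions."""
    x_error_a = est_a[0] - real_a[0]
    y_error_a = est_a[1] - real_a[1]
    return (gaze[0] + w[0] * x_error_a,
            gaze[1] + w[1] * y_error_a)
```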
(3) Region localization. Determine the region containing the fixation point from its position after the one-point calibration and its positional relation to the calibration points.
(4) Second calibration. According to the region containing the fixation point, apply a second calibration to the once-calibrated fixation point using the calibration point within that region, obtaining the final fixation-point position; its coordinates can be expressed as:
x_gaze2_Q_i = x_gaze1_Q_i,  y_gaze2_Q_i = y_gaze1_Q_i  (in region A_0)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_b,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_b  (in region A_1)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_c,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_c  (in region A_2)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_d,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_d  (in region A_3)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_e,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_e  (in region A_4)
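Steps (3) and (4) can be sketched together. The region layout below (a central rectangle A_0 with four surrounding quadrants) is one plausible reading of Fig. 7, which is not reproduced here, so both the geometry and the errors-dictionary layout are assumptions:

```python
def locate_region(pt, width=34.0, height=27.0, core=0.3):
    """Step (3): assign a once-calibrated point to a screen region.
    Assumed layout: a central rectangle A0 covering `core` of each
    dimension, then quadrants A1 (lower-left), A2 (lower-right),
    A3 (upper-left), A4 (upper-right). Fig. 7's actual division
    may differ."""
    cx, cy = width / 2, height / 2
    if abs(pt[0] - cx) <= core * width / 2 and abs(pt[1] - cy) <= core * height / 2:
        return 'A0'
    if pt[0] <= cx:
        return 'A1' if pt[1] <= cy else 'A3'
    return 'A2' if pt[1] <= cy else 'A4'

def second_calibrate(gaze1, region, errors):
    """Step (4): shift the once-calibrated point by the error recorded
    at its region's calibration point (b..e); region A0 needs no shift.
    `errors` maps 'A1'..'A4' to (x_error, y_error) pairs."""
    if region == 'A0':
        return gaze1
    ex, ey = errors[region]
    return (gaze1[0] + ex, gaze1[1] + ey)
```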
This dual-light-source gaze tracking method, based on spatial similar triangles and two-stage calibration, does not require measuring the distance between the head and the screen, and its computation is simple; although approximations exist in the algorithm, the resulting errors cannot exceed their maxima of 0.9887 cm on the X-axis and 0.7852 cm on the Y-axis. Moreover, owing to the habitual characteristics of human gaze, the errors of the fixation-point estimates exhibit a regional distribution, as shown in Fig. 8. Fig. 9 gives the fixation-point estimation accuracy analysis. The estimation errors of the fixation points used in the test are listed in the following table:
Fixation point X_error (cm) Y_error (cm)
1 0.5050 0.3143
2 0.5132 0.8293
3 0.5299 0.3748
4 0.6994 0.0992
5 0.9113 0.2203
6 0.5349 0.3853
7 0.2769 0.0475
8 1.0562 0.1781
9 0.7784 0.4744
10 0.3104 0.6536
11 0.2331 0.3049
12 1.1230 0.1768
13 1.0073 0.2070
14 0.7716 0.5990
15 0.9398 0.6581
16 0.9398 0.5310
Average 0.6956 0.3784
The average errors on the X-axis and Y-axis are 0.6956 cm and 0.3784 cm, respectively, so the method of the present invention is fully suitable for practical applications.

Claims (3)

1. An eye-gaze tracking method based on dual light sources, characterized by comprising the following steps:
(1) image preprocessing: using the gray-level difference between the pupil and the reflection points, preprocess the captured face image, extract the two reflection points and the pupil region from the eye image, and compute the coordinates of the two reflection points and the pupil center in the image coordinate system, where the image coordinate system takes the upper-left corner of the image as the origin, the horizontal direction as the X-axis and the vertical direction as the Y-axis;
(2) fixation-point estimation: treat the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the fixation point and the two infrared light sources on the screen as an approximate pair of similar triangles, thereby determining the approximate position of the fixation point on the screen;
(3) fixation-point correction: use a two-stage calibration to correct the inherent deviation between the visual axis and the optical axis of the human eye and the estimation error introduced by the approximation, obtaining the final fixation-point position.
2. The eye-gaze tracking method based on dual light sources according to claim 1, characterized in that the concrete implementation of step (2), fixation-point estimation, is:
A. according to the gaze characteristics of the human eye and the imaging principle of the camera, derive the approximate similarity between the triangle formed by the pupil center and the two reflection points and the triangle formed by the fixation point and the two infrared light sources on the screen;
B. estimate the approximate position of the fixation point on the screen from the triangle similarity.
3. The eye-gaze tracking method based on dual light sources according to claim 1, characterized in that the concrete implementation of step (3), fixation-point correction, is:
A. region division: divide the whole screen into five regions according to the number and distribution of the calibration points;
B. one-point calibration: perform a single-point calibration using the calibration point at the center of the screen;
C. region localization: determine the region containing the fixation point from its position after the one-point calibration and its positional relation to the calibration points;
D. second calibration: according to the region containing the fixation point, apply a second calibration to the once-calibrated fixation point using the calibration point within that region, obtaining the final fixation-point position.
CN201010618752A 2010-12-31 2010-12-31 Eye-gaze tracking method based on double light sources Expired - Fee Related CN102043952B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010618752A CN102043952B (en) 2010-12-31 2010-12-31 Eye-gaze tracking method based on double light sources


Publications (2)

Publication Number Publication Date
CN102043952A true CN102043952A (en) 2011-05-04
CN102043952B CN102043952B (en) 2012-09-19

Family

ID=43910080



Similar Documents

Publication Publication Date Title
CN102043952B (en) Eye-gaze tracking method based on double light sources
US10958898B2 (en) Image creation device, method for image creation, image creation program, method for designing eyeglass lens and method for manufacturing eyeglass lens
CN105708467B (en) Human body actual range measures and the method for customizing of spectacle frame
Lai et al. Hybrid method for 3-D gaze tracking using glint and contour features
Villanueva et al. A novel gaze estimation system with one calibration point
CN101901485A (en) 3D free head moving type gaze tracking system
CN106796449A (en) Eye-controlling focus method and device
CN106056092A (en) Gaze estimation method for head-mounted device based on iris and pupil
US10620454B2 (en) System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images
CN103366381A (en) Sight line tracking correcting method based on space position
Model et al. User-calibration-free remote gaze estimation system
Schnieders et al. Reconstruction of display and eyes from a single image
US20220207919A1 (en) Methods, devices and systems for determining eye parameters
CN107464275A (en) Human spine center line three-dimensional reconstruction method
Mestre et al. Robust eye tracking based on multiple corneal reflections for clinical applications
Rast et al. Between-day reliability of three-dimensional motion analysis of the trunk: A comparison of marker based protocols
CN112329699A (en) Method for positioning human eye fixation point with pixel-level precision
Nagamatsu et al. Calibration-free gaze tracking using a binocular 3D eye model
Liu et al. 3D model-based gaze tracking via iris features with a single camera and a single light source
Liu et al. Iris feature-based 3-D gaze estimation method using a one-camera-one-light-source system
Lu et al. Neural 3D gaze: 3D pupil localization and gaze tracking based on anatomical eye model and neural refraction correction
CN113613546A (en) Apparatus and method for evaluating performance of vision equipment for vision task
Zhang et al. A simplified 3D gaze tracking technology with stereo vision
Nitschke Image-based eye pose and reflection analysis for advanced interaction techniques and scene understanding
Pansing et al. Optimization of illumination schemes in a head-mounted display integrated with eye tracking capabilities

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120919

Termination date: 20211231
