CN104113680A - Sight line tracking system and method - Google Patents


Info

Publication number
CN104113680A
Authority
CN
China
Prior art keywords: point, eyes, sight line, represent, center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310138541.6A
Other languages
Chinese (zh)
Other versions
CN104113680B (en)
Inventor
王西颖
高书征
金智渊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd and Samsung Electronics Co Ltd
Priority to CN201310138541.6A (granted as CN104113680B)
Priority to KR1020140015528A (published as KR20140125713A)
Priority to US14/254,008 (published as US20140313308A1)
Publication of CN104113680A
Application granted
Publication of CN104113680B
Legal status: Expired - Fee Related


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145 Illumination specially adapted for pattern recognition, e.g. using gratings

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a gaze tracking system and method. The gaze tracking system includes: a plurality of cameras; a plurality of groups of light sources corresponding to the plurality of cameras; a light source control device that controls the groups of light sources to emit light in turn so that the cameras can capture bright pupil images and dark pupil images of a user; a gaze feature detection unit that detects the positions of pupil centers and the positions of corneal glints from the bright and dark pupil images captured by the cameras; and a gaze estimation unit that calculates the three-dimensional positions of the corneal curvature centers and pupil centers from the detected pupil center and glint positions, thereby tracking the gaze directions of both eyes and determining the binocular gaze point.

Description

Gaze tracking system and method
Technical field
The present invention relates to a gaze tracking system and method, and more particularly to a high-accuracy gaze tracking system and method using a binocular gaze estimation model based on multiple cameras.
Background art
Gaze is the direction in which the eyes look; it indicates the focus of a person's attention. For several decades, gaze tracking has been an active research topic, with great application potential in the field of human-computer interaction. For example, when a user watches a display screen, a gaze tracking system can estimate the user's point of regard on the screen. Gaze can therefore serve as an advanced computer input means: it has been shown to be more efficient than traditional input devices such as the mouse, and gaze tracking systems can also be used by people with physical disabilities. In addition, interactive displays based on gaze tracking technology can be realized, in which the content shown by the display changes as the gaze moves. Meanwhile, gaze tracking is also widely used by cognitive scientists in research on human cognition and memory.
However, although a great deal of research has been devoted to gaze tracking, estimating the direction of a person's gaze in the real world using computer vision remains a difficult problem, especially under conditions such as varying illumination or large head rotation angles.
Current gaze tracking methods can be roughly divided into methods based on head-mounted devices and remote methods. Head-mounted methods require the user to wear a helmet-like device and use cameras mounted on the device to collect images of the eyes. The greatest problems with this approach are that the device is inconvenient to use and carry, placing an extra burden on the user, and that its complexity and cost limit the possibility of widespread adoption. By comparison, remote gaze tracking is a contactless approach: the user need not wear any extra equipment, so natural use is not affected. Remote gaze tracking is mainly based on the pupil-corneal reflection technique, in which infrared light sources produce reflection points (glints) on the eye; by tracking the relative positions of the pupil center and the corneal reflections, the three-dimensional position of the eyeball is estimated from the imaging path of the camera, yielding the gaze direction. This approach has the following advantages: (1) because a geometric model is used for gaze estimation, tracking accuracy is not greatly affected by head movement; (2) the three-dimensional method estimates the gaze direction in space rather than a gaze point on a screen, so tracking does not depend on screen position, size, or shape, and the gaze point on any object can be calculated. However, it also has the following shortcomings: (1) the long-focal-length camera it requires leads to a narrow field of view; (2) it cannot support simultaneous gaze estimation and tracking for multiple people; (3) its tracking accuracy leaves room for improvement.
Summary of the invention
Therefore, in order to overcome the above shortcomings of the prior art, an object of the present invention is to provide a gaze tracking system and method that tolerate free head movement, support simultaneous gaze estimation for multiple users, and improve gaze estimation accuracy.
According to an aspect of the present invention, there is provided a gaze tracking system comprising: a plurality of cameras; a plurality of groups of light sources corresponding to the plurality of cameras; a light source control device that controls the groups of light sources to emit light in turn so that the cameras capture bright pupil images and dark pupil images of a user; a gaze feature detection unit that detects the positions of the pupil centers and the positions of the glints from the bright and dark pupil images captured by the cameras; and a gaze estimation unit that calculates the three-dimensional positions of the corneal curvature centers and the pupil centers from the detected pupil center and glint positions, thereby tracking the gaze directions of both eyes to determine the binocular gaze point.
The gaze tracking system may further comprise a gaze point correction unit that corrects the binocular gaze point by performing a position calibration with a plurality of known points on the screen watched by the user.
The plurality of groups of light sources may be coaxial light sources corresponding to the plurality of cameras.
Each group of light sources may comprise several infrared LEDs arranged evenly around the camera corresponding to that group.
For a given camera, the camera captures a bright pupil image of the user when its corresponding group of light sources emits light, and captures a dark pupil image when any other group emits light.
The gaze feature detection unit may detect the position of the pupil center from the gray-level information of the eye region in the bright pupil image.
The gaze feature detection unit may segment the pupil region based on the gray values of the bright pupil image, fit an ellipse to the segmentation result, and take the center of the fitted ellipse as the pupil center, thereby obtaining the position of the pupil center.
The gaze feature detection unit may detect, among multiple glints in the dark pupil image that are close to one another and approximately equal in size, the round glint closest to the pupil center, and take it as the corneal glint, thereby obtaining the glint position.
The gaze estimation unit may calculate the three-dimensional coordinates of the left and right pupil centers and the left and right corneal curvature centers, express the left-eye and right-eye gaze lines as the lines through the pupil centers and corneal curvature centers, and determine the intersection of the two gaze lines as the binocular gaze point.
The left eye and the right eye each satisfy equations (1) to (8):
q_ij = o_j + k_q,ij (o_j - u_ij)   (1)
||q_ij - c|| = R1   (2)
(l_i - o_j) × (q_ij - o_j) · (c - o_j) = 0   (3)
(l_i - q_ij) · (q_ij - c) * ||o_j - q_ij|| = (o_j - q_ij) · (q_ij - c) * ||l_i - q_ij||   (4)
r_j = o_j + k_r,j (o_j - v_j)   (5)
||r_j - c|| = R1   (6)
(r_j - o_j) × (c - o_j) · (p - o_j) = 0   (7)
n1 * ||(r_j - c) × (p - r_j)|| * ||o_j - r_j|| = n2 * ||(r_j - c) × (o_j - r_j)|| * ||p - r_j||   (8)
where × denotes the vector cross product, · denotes the vector dot product, and * denotes scalar multiplication; q_ij denotes the corneal reflection point of each eye; o_j denotes the optical center of camera j; u_ij denotes the position of the glint in the dark pupil image of each eye; k_q,ij denotes the slope of the line connecting the corneal reflection point of each eye and the glint position in the dark pupil image of each eye; c denotes the corneal curvature center of each eye; R1 denotes the radius of the sphere containing the corneal surface; l_i denotes the position of off-axis light source i; r_j denotes the corneal refraction point of each eye; v_j denotes the position of the pupil center in the bright pupil image of each eye; k_r,j denotes the slope of the line connecting the corneal refraction point of each eye and the pupil center position in the bright pupil image of each eye; p denotes the pupil center of each eye; n1 denotes the refractive index between the aqueous humor and the cornea; and n2 denotes the refractive index between air and the cornea. The gaze estimation unit solves equations (1) to (8) for the left eye and the right eye subject to the following constraints to determine the binocular gaze point:
R = c_1 + k_1 (p_1 - c_1)
R = c_2 + k_2 (p_2 - c_2)
c_1 + k_1 (p_1 - c_1) = c_2 + k_2 (p_2 - c_2),
where R denotes the binocular gaze point; c_1 and c_2 denote the corneal curvature centers of the left and right eyes; p_1 and p_2 denote the pupil centers of the left and right eyes; and k_1 and k_2 denote the slopes along the optical axes of the left-eye and right-eye gaze lines.
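The three constraints above state that the gaze point R lies on both optical axes. In practice two estimated 3D lines rarely intersect exactly, so a minimal sketch (assuming the corneal curvature centers and pupil centers are already expressed in one 3D coordinate frame) takes the midpoint of the common perpendicular of the two axes as R:

```python
import numpy as np

def gaze_point(c1, p1, c2, p2):
    """Binocular gaze point from the two optical axes
    R = c1 + k1*(p1 - c1) and R = c2 + k2*(p2 - c2):
    solve for k1, k2 in the least-squares sense and return the
    midpoint of the common perpendicular of the two lines."""
    c1, p1, c2, p2 = (np.asarray(v, dtype=float) for v in (c1, p1, c2, p2))
    d1 = (p1 - c1) / np.linalg.norm(p1 - c1)
    d2 = (p2 - c2) / np.linalg.norm(p2 - c2)
    # Normal equations for minimizing ||(c1 + k1*d1) - (c2 + k2*d2)||^2
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    k1, k2 = np.linalg.solve(a, b)
    return 0.5 * ((c1 + k1 * d1) + (c2 + k2 * d2))
```

When the two axes do intersect, the midpoint coincides with the exact intersection; otherwise it is the point closest to both lines.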
If there are N groups of light sources, each eye satisfies N instances of equations (1) to (8), and the gaze estimation unit determines the binocular gaze point by solving a total of 16*N equations for the left and right eyes subject to the above constraints.
A plurality of known target points may be set in advance on the screen watched by the user, and their positions determined by the gaze estimation. The gaze point correction unit calculates the error between the determined position of each known target point and its actual position, calculates weights from the distances between the binocular gaze point determined by the gaze estimation unit and the determined positions of the known target points, and corrects the gaze point determined by the gaze estimation unit using an error correction model.
The gaze point correction unit may correct the binocular gaze point determined by the gaze estimation unit using the following equations:
P_processed = P_computed + Σ_{i=1,...,M} w_i × e_i,
e_i = s_i - p_i,  d_i = ||P_computed - p_i||,
where M denotes the number of known target points, P_processed denotes the corrected binocular gaze point, P_computed denotes the gaze point determined by the gaze estimation unit, w_i denotes the weight, s_i denotes the actual position of known target point i, p_i denotes the position of known target point i determined by the gaze estimation unit, e_i denotes the error between the actual and determined positions of known target point i, and d_i denotes the distance between the gaze point determined by the gaze estimation unit and the determined position of known target point i.
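The correction model can be sketched as follows. Note that the patent does not specify how the weights w_i are derived from the distances d_i; normalized inverse-distance weights are an assumption made here for illustration:

```python
import numpy as np

def correct_gaze(P_computed, targets_true, targets_measured, eps=1e-6):
    """Weighted error-correction model:
    P_processed = P_computed + sum_i w_i * e_i, with e_i = s_i - p_i.
    The weight formula is not given in the source; we assume
    normalized inverse-distance weights based on d_i = ||P_computed - p_i||."""
    P = np.asarray(P_computed, dtype=float)
    s = np.asarray(targets_true, dtype=float)    # actual target positions s_i
    p = np.asarray(targets_measured, dtype=float)  # estimated positions p_i
    e = s - p                                    # per-target errors e_i
    d = np.linalg.norm(P - p, axis=1)            # distances d_i
    w = 1.0 / (d + eps)                          # closer targets weigh more
    w /= w.sum()
    return P + (w[:, None] * e).sum(axis=0)
```

With this choice, a gaze point near a calibration target inherits almost exactly that target's measured error, while distant targets contribute little.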
According to an aspect of the present invention, there is provided a gaze tracking method comprising the steps of: (a) controlling a plurality of groups of light sources corresponding to a plurality of cameras to emit light in turn, and using the cameras to capture bright pupil images and dark pupil images of a user; (b) detecting the positions of the pupil centers and the positions of the glints from the captured bright and dark pupil images; and (c) calculating the three-dimensional positions of the corneal curvature centers and the pupil centers from the detected pupil center and glint positions, thereby tracking the gaze directions of both eyes to determine the binocular gaze point.
The gaze tracking method may further comprise the step of: (d) correcting the binocular gaze point by performing a position calibration with a plurality of known points on the screen watched by the user.
The plurality of groups of light sources may be coaxial light sources corresponding to the plurality of cameras.
Each group of light sources may comprise several infrared LEDs arranged evenly around the camera corresponding to that group.
For a given camera, the camera captures a bright pupil image of the user when its corresponding group of light sources emits light, and captures a dark pupil image when any other group emits light.
In step (b), the position of the pupil center may be detected from the gray-level information of the eye region in the bright pupil image.
In step (b), the pupil region may be segmented based on the gray values of the bright pupil image, an ellipse fitted to the segmentation result, and the center of the fitted ellipse taken as the pupil center, thereby obtaining the position of the pupil center.
In step (b), among multiple glints in the dark pupil image that are close to one another and approximately equal in size, the round glint closest to the pupil center may be detected and taken as the corneal glint, thereby obtaining the glint position.
In step (c), the three-dimensional coordinates of the left and right pupil centers and the left and right corneal curvature centers may be calculated, the left-eye and right-eye gaze lines expressed as the lines through the pupil centers and corneal curvature centers, and the intersection of the two gaze lines determined as the binocular gaze point.
The left eye and the right eye each satisfy equations (1) to (8):
q_ij = o_j + k_q,ij (o_j - u_ij)   (1)
||q_ij - c|| = R1   (2)
(l_i - o_j) × (q_ij - o_j) · (c - o_j) = 0   (3)
(l_i - q_ij) · (q_ij - c) * ||o_j - q_ij|| = (o_j - q_ij) · (q_ij - c) * ||l_i - q_ij||   (4)
r_j = o_j + k_r,j (o_j - v_j)   (5)
||r_j - c|| = R1   (6)
(r_j - o_j) × (c - o_j) · (p - o_j) = 0   (7)
n1 * ||(r_j - c) × (p - r_j)|| * ||o_j - r_j|| = n2 * ||(r_j - c) × (o_j - r_j)|| * ||p - r_j||   (8)
where × denotes the vector cross product, · denotes the vector dot product, and * denotes scalar multiplication; q_ij denotes the corneal reflection point of each eye; o_j denotes the optical center of camera j; u_ij denotes the position of the glint in the dark pupil image of each eye; k_q,ij denotes the slope of the line connecting the corneal reflection point of each eye and the glint position in the dark pupil image of each eye; c denotes the corneal curvature center of each eye; R1 denotes the radius of the sphere containing the corneal surface; l_i denotes the position of off-axis light source i; r_j denotes the corneal refraction point of each eye; v_j denotes the position of the pupil center in the bright pupil image of each eye; k_r,j denotes the slope of the line connecting the corneal refraction point of each eye and the pupil center position in the bright pupil image of each eye; p denotes the pupil center of each eye; n1 denotes the refractive index between the aqueous humor and the cornea; and n2 denotes the refractive index between air and the cornea. Equations (1) to (8) are solved for the left eye and the right eye subject to the following constraints to determine the binocular gaze point:
R = c_1 + k_1 (p_1 - c_1)
R = c_2 + k_2 (p_2 - c_2)
c_1 + k_1 (p_1 - c_1) = c_2 + k_2 (p_2 - c_2),
where R denotes the binocular gaze point; c_1 and c_2 denote the corneal curvature centers of the left and right eyes; p_1 and p_2 denote the pupil centers of the left and right eyes; and k_1 and k_2 denote the slopes along the optical axes of the left-eye and right-eye gaze lines.
If there are N groups of light sources, each eye satisfies N instances of equations (1) to (8), and a total of 16*N equations are solved for the left and right eyes subject to the above constraints to determine the binocular gaze point.
A plurality of known target points may be set in advance on the screen watched by the user, and their positions determined by steps (a) to (c). The error between the position of each known target point determined by steps (a) to (c) and its actual position is calculated, weights are calculated from the distances between the binocular gaze point determined by steps (a) to (c) and the determined positions of the known target points, and the gaze point determined by steps (a) to (c) is corrected using an error correction model.
The binocular gaze point determined by steps (a) to (c) may be corrected using the following equations:
P_processed = P_computed + Σ_{i=1,...,M} w_i × e_i,
e_i = s_i - p_i,  d_i = ||P_computed - p_i||,
where M denotes the number of known target points, P_processed denotes the corrected binocular gaze point, P_computed denotes the gaze point determined by steps (a) to (c), w_i denotes the weight, s_i denotes the actual position of known target point i, p_i denotes the position of known target point i determined by steps (a) to (c), e_i denotes the error between the actual and determined positions of known target point i, and d_i denotes the distance between the gaze point determined by steps (a) to (c) and the determined position of known target point i.
The gaze tracking system and method according to embodiments of the present invention adopt a multi-camera structure to solve the problem of free body and head movement, support simultaneous gaze estimation for multiple users, and significantly improve gaze estimation accuracy.
Other aspects and/or advantages of the present invention will be set forth in part in the description which follows, will in part be apparent from the description, or may be learned by practice of the invention.
Brief description of the drawings
The above and other objects, features, and advantages of the present invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram illustrating a gaze tracking system according to an embodiment of the present invention;
Fig. 2 is a diagram illustrating synchronization of the camera capture rate with the light source flashing according to an embodiment of the present invention;
Fig. 3 is an exemplary diagram showing bright pupil images and dark pupil images captured by a plurality of cameras;
Fig. 4 is a flowchart illustrating a gaze tracking method according to an embodiment of the present invention.
Embodiment
Embodiments of the present invention will now be described in detail with reference to the accompanying drawings. Throughout the drawings, the same reference numerals denote the same structures, features, and elements.
Fig. 1 is a block diagram illustrating the gaze tracking system according to an embodiment of the present invention.
Referring to Fig. 1, the gaze tracking system according to an embodiment of the present invention may include a plurality of cameras 10, a plurality of groups of light sources 20 corresponding to the cameras, a light source control device 30, a gaze feature detection unit 40, and a gaze estimation unit 50. Optionally, the gaze tracking system may further include a gaze point correction unit 60.
According to an embodiment of the present invention, the cameras 10 may be arranged below the screen watched by the user; however, they may also be arranged above, to the left of, or to the right of the screen. The number of cameras 10 may be two or more.
The groups of light sources 20 may be (but are not limited to) infrared light sources, and may be configured either as coaxial light sources with respect to the cameras 10 or as off-axis light sources with respect to the cameras 10. A coaxial light source can be realized by arranging the group of light sources corresponding to a camera around that camera (i.e., the center of the group coincides with the center of the camera). An off-axis light source, on the other hand, can be realized by arranging the groups of light sources along one edge of the screen watched by the user. Coaxial light sources allow a camera to capture a bright pupil image of the user, while off-axis light sources allow a camera to capture a dark pupil image. According to an embodiment of the present invention, capturing bright pupil images improves the detection accuracy of the pupil center position, while the captured dark pupil images contain the glints that the off-axis light sources form on the user's cornea. Hereinafter, the gaze tracking system according to an embodiment of the present invention is described taking coaxial light sources as an example; however, the system may also be realized with off-axis light sources. Each group of light sources may consist of several (for example, but not limited to, three) infrared LEDs. As described above, each group may be arranged around its corresponding camera, with the LEDs distributed evenly so as to form a circle concentric with the camera.
The light source control device 30 may be realized by any of various existing processors (such as, but not limited to, a single-chip microcomputer, a CPU, a DSP, or an ARM chip) and controls the groups of light sources 20 to emit light in turn. If N groups of light sources are provided, the light source control device 30 may divide one period T into N time slots and ensure that exactly one group emits light in each slot. In this case, for a given camera, the camera captures a bright pupil image of the user when its corresponding group of light sources emits light, and captures a dark pupil image when any other group emits light. The capture rate of each camera must match the flashing frequency of the light sources. Fig. 2 is a diagram illustrating the synchronization of the camera capture rate with the light source flashing according to an embodiment of the present invention. Fig. 2 shows one camera and three groups of light sources; one period T is therefore divided into three slots, only one group emits light in each slot, and the camera captures a bright pupil image in one slot and dark pupil images in the other two. For example, if the first group is the camera's coaxial light source, the camera captures a bright pupil image during the slot in which the first group emits light. In each frame of dark pupil image, one off-axis group is lit and forms one glint on the user's cornea; therefore, from several consecutive frames of dark pupil images, the positions of several corneal glints can be obtained. It is assumed here that the capture rate is fast enough that the eyeball position does not change within one period T. Hereinafter, the gaze tracking method according to an embodiment of the present invention is described taking as an example the control mode in which the light source control device 30 controls the groups of light sources 20 to emit light in turn.
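The slot-per-group timing just described can be sketched as a small schedule builder. The assumption that camera j's coaxial group is group j (one camera per group) is the simplest arrangement and is not mandated by the patent:

```python
def capture_schedule(num_groups):
    """Label each camera's frame in each time slot of period T as a
    bright- or dark-pupil capture. Slot t lights only group t, and
    camera j (whose coaxial group is assumed to be group j) sees a
    bright pupil only when its own group is lit, i.e. when t == j."""
    schedule = {}
    for t in range(num_groups):      # time slot t: only group t emits
        for j in range(num_groups):  # camera j, paired with group j
            schedule[(t, j)] = "bright" if t == j else "dark"
    return schedule
```

For three groups, each camera gets exactly one bright-pupil frame and two dark-pupil frames per period T, matching the Fig. 2 timing.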
The gaze feature detection unit 40 may detect features of the human eye from the images captured by the cameras 10. That is, it may detect the positions of the pupil centers from the bright pupil images captured by the cameras 10, and the positions of the glints from the dark pupil images. Fig. 3 is an exemplary diagram showing bright pupil images and dark pupil images captured by the cameras 10. Specifically, the pupil center position is detected from the gray-level information of the eye region in the captured bright pupil image. In the bright pupil image, the pupil region is illuminated by the coaxial light source and appears with gray values higher than those of its surroundings. The pupil region can therefore be segmented based on the gray values of the bright pupil image; an ellipse is then fitted to the segmentation result, and the center of the fitted ellipse is taken as the pupil center, yielding the pupil center position. Any of various existing image segmentation methods, such as region growing or thresholding, may be used here. Once the pupil region has been segmented, the ellipse may be fitted to the contour of the segmented region. The corneal glints are detected in the dark pupil image, where they form a marked gray-level contrast with other regions; in other words, the glints are brighter than their surroundings. However, reflections also occur on the sclera, so corneal glints must be distinguished from scleral ones. Corneal glints satisfy the following conditions: (1) the glints are close to one another and roughly equal in size; (2) the glint positions are not far from the pupil center; (3) the glints have a regular shape close to a small circle. The gaze feature detection unit 40 can therefore detect, among the glints in the dark pupil image that are close to one another and approximately equal in size, the round glint closest to the pupil center, take it as the corneal glint, and thereby obtain the glint position.
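The two detection steps above can be sketched in a few lines of numpy. The patent fits an ellipse to the segmented contour (cv2.fitEllipse would be a natural choice); this stdlib-plus-numpy sketch uses the region centroid instead, which coincides with the ellipse center for a clean elliptical blob. The threshold and blob-size limit are illustrative assumptions:

```python
import numpy as np
from collections import deque

def pupil_center(bright_img, thresh=200):
    """Segment the bright-pupil region by gray-level threshold and
    return its center (x, y). Stand-in for the patent's ellipse fit:
    the centroid equals the ellipse center for an elliptical blob."""
    ys, xs = np.nonzero(np.asarray(bright_img) >= thresh)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def _blobs(mask):
    """Yield the pixel lists of 4-connected components of a boolean mask."""
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    for y0, x0 in zip(*np.nonzero(mask)):
        if seen[y0, x0]:
            continue
        pix, queue = [], deque([(y0, x0)])
        seen[y0, x0] = True
        while queue:
            y, x = queue.popleft()
            pix.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        yield pix

def corneal_glint(dark_img, pupil_xy, thresh=200, max_area=40):
    """Among small bright blobs in the dark-pupil image, return the
    center (x, y) of the blob nearest the pupil center; larger blobs
    (e.g. scleral reflections) are rejected by the area limit."""
    mask = np.asarray(dark_img) >= thresh
    best, best_d = None, np.inf
    for pix in _blobs(mask):
        if len(pix) > max_area:
            continue
        cy = sum(p[0] for p in pix) / len(pix)
        cx = sum(p[1] for p in pix) / len(pix)
        d = float(np.hypot(cx - pupil_xy[0], cy - pupil_xy[1]))
        if d < best_d:
            best, best_d = (cx, cy), d
    return best
```

The area filter and nearest-to-pupil rule correspond to conditions (1) to (3) above; a production detector would also check roundness and the mutual proximity of the glints.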
The gaze estimation unit 50 calculates the three-dimensional positions of the corneal curvature centers and the pupil centers from the detected pupil center and glint positions, thereby tracking the gaze directions of both eyes to determine the binocular gaze point. More specifically, the gaze estimation unit 50 estimates the optical axis of each eye, i.e., the line connecting the pupil center P and the corneal curvature center C, in order to determine the gaze point. To this end, the gaze estimation unit 50 computes with a binocular gaze estimation model. Unlike existing gaze estimation models, the binocular model according to an embodiment of the present invention assumes that the optical axes of the two eyes converge at one point (the binocular gaze point), so the gaze estimation unit 50 estimates the gaze directions of both eyes simultaneously. That is, the gaze estimation unit 50 calculates the three-dimensional coordinates of the left and right pupil centers and the left and right corneal curvature centers to obtain the directions of the left-eye and right-eye gaze lines, and determines the intersection of the two gaze lines as the binocular gaze point.
Specifically, the gaze point of the eyes can be determined by solving the following equations, where the left eye and the right eye each satisfy equations (1) to (8):
q_ij = o_j + k_q,ij (o_j - u_ij)    (1)
||q_ij - c|| = R1    (2)
(l_i - o_j) × (q_ij - o_j) · (c - o_j) = 0    (3)
(l_i - q_ij) · (q_ij - c) * ||o_j - q_ij|| = (o_j - q_ij) · (q_ij - c) * ||l_i - q_ij||    (4)
r_j = o_j + k_r,j (o_j - v_j)    (5)
||r_j - c|| = R1    (6)
(r_j - o_j) × (c - o_j) · (p - o_j) = 0    (7)
n_1 * ||(r_j - c) × (p - r_j)|| * ||o_j - r_j|| = n_2 * ||(r_j - c) × (o_j - r_j)|| * ||p - r_j||    (8)
where × denotes the vector cross product, · denotes the vector dot product, and * denotes scalar multiplication; q_ij denotes the corneal light-reflection point of each eye; o_j denotes the optical center of camera j; u_ij denotes the position of the reflection spot in the dark-pupil image of each eye; k_q,ij denotes the slope of the line connecting the corneal reflection point of each eye with the reflection-spot position in its dark-pupil image; c denotes the corneal surface center point of each eye; R1 denotes the radius of the sphere containing the corneal surface; l_i denotes the position of the off-axis light source; r_j denotes the corneal light-refraction point of each eye; v_j denotes the position of the pupil center point in the bright-pupil image of each eye; k_r,j denotes the slope of the line connecting the corneal refraction point of each eye with the pupil-center position in its bright-pupil image; p denotes the pupil center point of each eye; n_1 denotes the refractive index between the aqueous humor and the cornea; and n_2 denotes the refractive index between air and the cornea. Equations (1) to (8) are solved for the left eye and the right eye subject to the following three constraints (9) to (11):
R = c_1 + k_1 (p_1 - c_1)    (9)
R = c_2 + k_2 (p_2 - c_2)    (10)
c_1 + k_1 (p_1 - c_1) = c_2 + k_2 (p_2 - c_2)    (11)
where R denotes the gaze point of the eyes; c_1 and c_2 denote the corneal surface center points of the left and right eye; p_1 and p_2 denote the pupil center points of the left and right eye; and k_1 and k_2 denote the slopes of the optical axes of the left-eye and right-eye lines of sight. That is to say, the 16 equations for the left and right eyes together are solved subject to the three constraints above. In addition, when the equations are actually solved, u_1 and n_2 can be ignored. According to the embodiment of the present invention, if N groups of light sources are provided, N instances of equations (1) to (8) are obtained for each eye. Therefore, 16*N equations for the left and right eyes together can be solved subject to the three constraints, yielding the three-dimensional coordinates of the left and right pupil center points and of the left and right corneal surface center points, and hence the gaze point of the eyes.
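Constraints (9) to (11) state that both optical axes pass through the common gaze point R. As a hedged numerical sketch (not the patent's joint solver for all 16*N equations), the numpy routine below recovers R from already-estimated corneal centers c_1, c_2 and pupil centers p_1, p_2; since noisy axes rarely intersect exactly, it returns the midpoint of the common perpendicular of the two lines.

```python
import numpy as np

def gaze_point(c1, p1, c2, p2):
    """Find the point nearest to both optical axes R = c + k (p - c)
    of constraints (9)-(11).  For skew lines this is the midpoint of
    their common perpendicular segment."""
    d1 = (p1 - c1) / np.linalg.norm(p1 - c1)   # left-axis direction
    d2 = (p2 - c2) / np.linalg.norm(p2 - c2)   # right-axis direction
    w = c1 - c2
    a, b, cc = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * cc - b * b                     # zero iff axes parallel
    k1 = (b * e - cc * d) / denom              # parameter along left axis
    k2 = (a * e - b * d) / denom               # parameter along right axis
    r1 = c1 + k1 * d1                          # closest point on left axis
    r2 = c2 + k2 * d2                          # closest point on right axis
    return 0.5 * (r1 + r2)

# Two optical axes that intersect exactly at (0, 0, 50):
c1, p1 = np.array([-3.0, 0.0, 0.0]), np.array([-2.7, 0.0, 5.0])
c2, p2 = np.array([ 3.0, 0.0, 0.0]), np.array([ 2.7, 0.0, 5.0])
print(gaze_point(c1, p1, c2, p2))   # close to (0, 0, 50)
```

When the two axes do intersect, the midpoint coincides with the intersection point; otherwise it is the natural least-squares compromise demanded by the convergence assumption.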
After the gaze estimation unit 50 determines the gaze point of the eyes, the gaze point correction unit 60 can calibrate against multiple known points on the screen watched by the user and correct the gaze point of the eyes. The main reason for the correction is that there is an angular deviation between the optical axis and the visual axis of the line of sight. Here, the optical axis of the line of sight is the line through the pupil center point and the corneal surface center point, whereas the visual axis is the line through the pupil center point and the fovea of the retina. According to the embodiment of the present invention, calibration with multiple known points on the screen can be used to correct the gaze direction, so as to locate the gaze point of the eyes accurately. First, M points (M ≥ 2) are shown on the screen in sequence, positioned so that they are spread as far apart across the screen as possible. Then the user is asked to fixate on each point for a short time (for example, but not limited to, 2 seconds); image data of the user fixating on the current point are collected, the error between the system's estimate and the actual fixation point is computed, and an error-correction vector table (VEC) is built from the positions of the fixation points. Finally, during actual estimation, the correction model is used to compensate and correct the estimated result.
Specifically, in the embodiment of the present invention, 5 target points are used to correct the gaze point of the eyes, although more or fewer points may be used. First, the gaze estimation unit 50 determines the positions of the known target points; the gaze point correction unit 60 then computes the error between the determined position of each known target point and its actual position. Thereafter, when the gaze point of the eyes is corrected, a suitable weight is computed from the distance between the gaze point determined by the gaze estimation unit 50 and each determined target-point position. Finally, the gaze point correction unit 60 corrects the gaze point determined by the gaze estimation unit 50 with the error-correction model. According to the embodiment of the present invention, the gaze point correction unit 60 can correct the gaze point of the eyes with the following equations:
P_processed = P_computed + Σ_{i=1,...,M} w_i × e_i,
e_i = (s_i - p_i),  d_i = ||P_computed - p_i||,
where M denotes the number of known target points; P_processed denotes the corrected gaze point of the eyes; P_computed denotes the gaze point determined by the gaze estimation unit 50; w_i denotes the weight; s_i denotes the actual position of a known target point; and p_i denotes the position of the known target point as determined by the gaze estimation unit 50. In addition, e_i denotes the error between the actual position of a known target point and the position determined by the gaze estimation unit 50, and d_i denotes the distance between the gaze point determined by the gaze estimation unit 50 and the position of the known target point.
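The correction model above can be sketched as follows. The weight rule is an assumption: the patent only states that w_i is computed from the distance d_i, so a normalised inverse-distance weighting is used here as one plausible choice. Because the weights sum to 1, a constant estimation bias is removed exactly regardless of the particular weighting.

```python
import numpy as np

def correct_gaze(P_computed, s, p):
    """P_processed = P_computed + sum_i w_i * e_i, with e_i = s_i - p_i
    and d_i = ||P_computed - p_i||.  Weight rule: normalised inverse
    distance (an assumption; the patent leaves it unspecified)."""
    e = s - p                                    # per-target errors e_i
    d = np.linalg.norm(P_computed - p, axis=1)   # distances d_i
    w = 1.0 / (d + 1e-9)                         # nearer targets weigh more
    w /= w.sum()                                 # weights sum to 1
    return P_computed + (w[:, None] * e).sum(axis=0)

# M = 5 screen targets whose estimates carry a constant bias of (3, -2):
s = np.array([[100.0, 100.0], [700.0, 100.0], [400.0, 300.0],
              [100.0, 500.0], [700.0, 500.0]])   # actual positions s_i
p = s - np.array([3.0, -2.0])                    # estimated positions p_i
corrected = correct_gaze(np.array([400.0, 290.0]), s, p)
print(corrected)   # approximately [403. 288.]
```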
Fig. 4 is a flowchart of the gaze tracking method according to the embodiment of the present invention.
Referring to Fig. 4, at step S401 the multiple light-source groups corresponding to the multiple cameras are controlled to emit light in turn, and the multiple cameras capture bright-pupil and dark-pupil images of the user. As described above, the light-source groups can be arranged as coaxial light sources with respect to the multiple cameras 10. At any moment one and only one group of light sources is lit; the camera corresponding to the lit group captures a bright-pupil image, while the other cameras capture dark-pupil images.
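The "one group lit per frame" rule of step S401 can be sketched as a simple round-robin schedule; camera and group indices here are illustrative, with group j assumed coaxial with camera j.

```python
def capture_schedule(num_cameras, num_frames):
    """For each frame, light exactly one light-source group: the camera
    coaxial with the lit group captures a bright-pupil image, and all
    other cameras capture dark-pupil images."""
    schedule = []
    for frame in range(num_frames):
        lit = frame % num_cameras                 # groups are lit in turn
        schedule.append({
            "frame": frame,
            "lit_group": lit,
            "bright_camera": lit,                 # coaxial camera
            "dark_cameras": [c for c in range(num_cameras) if c != lit],
        })
    return schedule

for entry in capture_schedule(3, 3):
    print(entry)
```

Cycling through the groups ensures every camera periodically obtains a bright-pupil image while the remaining cameras simultaneously supply the dark-pupil images needed for glint detection.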
At step S402, the position of the pupil center point and the position of the reflection spot are detected from the captured bright-pupil and dark-pupil images. Here, the pupil region can be segmented based on the gray values of the bright-pupil image, an ellipse fitted to the segmentation result, and the center of the fitted ellipse taken as the pupil center point, thereby obtaining its position. On the other hand, among the mutually close and roughly equal-sized reflection spots in the dark-pupil image, the round reflection spot nearest to the pupil center can be detected and taken as the corneal reflection spot, thereby obtaining the position of the reflection spot.
At step S403, the three-dimensional positions of the corneal surface center point and the pupil center point are computed from the detected positions of the pupil center point and the reflection spot, thereby tracking the gaze direction of both eyes to determine their gaze point. As described above, equations (1) to (8) can be solved for the left and right eyes subject to constraints (9) to (11), thereby determining the gaze point of the eyes.
Optionally, at step S404, calibration with multiple known points on the screen watched by the user is used to correct the gaze point of the eyes. According to the embodiment of the present invention, the error between the determined position of each known target point and its actual position can be computed, a weight calculated from the distance between the gaze point determined in steps S401 to S403 and each determined target-point position, and the gaze point determined in those steps corrected with the error-correction model. As described above, the gaze point of the eyes can be corrected with the following equations:
P_processed = P_computed + Σ_{i=1,...,M} w_i × e_i,
e_i = (s_i - p_i),  d_i = ||P_computed - p_i||,
where M denotes the number of known target points; P_processed denotes the corrected gaze point of the eyes; P_computed denotes the gaze point determined in steps S401 to S403; w_i denotes the weight; s_i denotes the actual position of a known target point; and p_i denotes the position of the known target point determined in steps S401 to S403. In addition, e_i denotes the error between the actual position of a known target point and the position determined in steps S401 to S403, and d_i denotes the distance between the gaze point determined in steps S401 to S403 and the position of the known target point.
The gaze tracking method according to the embodiment of the present invention can be embodied as computer-readable code, or as a computer program, on a computer-readable recording medium. The computer-readable recording medium is any data storage device that can store data that can thereafter be read by a computer system.
The gaze tracking system and method according to the embodiment of the present invention adopt a multi-camera structure to allow the body and head to move freely, support simultaneous gaze estimation for multiple users, and significantly improve gaze estimation accuracy.
Although the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention as defined by the claims.

Claims (17)

1. A gaze tracking system, comprising:
multiple cameras;
multiple light-source groups corresponding to the multiple cameras;
a light-source controller that controls the multiple light-source groups to emit light in turn, so that the multiple cameras capture bright-pupil and dark-pupil images of a user;
a gaze feature detection unit that detects the position of a pupil center point and the position of a reflection spot from the bright-pupil and dark-pupil images captured by the multiple cameras; and
a gaze estimation unit that computes the three-dimensional positions of a corneal surface center point and the pupil center point from the detected positions of the pupil center point and the reflection spot, thereby tracking the gaze direction of both eyes to determine their gaze point.
2. The gaze tracking system according to claim 1, further comprising:
a gaze point correction unit that calibrates against multiple known points on a screen watched by the user and corrects the gaze point of the eyes.
3. The gaze tracking system according to claim 1, wherein the multiple light-source groups are coaxial light sources corresponding to the multiple cameras.
4. The gaze tracking system according to claim 3, wherein one group of light sources comprises several infrared LEDs, the several LEDs being arranged evenly around the camera corresponding to that group.
5. The gaze tracking system according to claim 1, wherein, for a given camera, the camera captures a bright-pupil image of the user when the group of light sources corresponding to that camera emits light, and captures a dark-pupil image of the user when any other group of light sources emits light.
6. The gaze tracking system according to claim 1, wherein the gaze feature detection unit detects the position of the pupil center point according to the gray-level information of the eye region in the bright-pupil image.
7. The gaze tracking system according to claim 6, wherein the gaze feature detection unit segments the pupil region based on the gray values of the bright-pupil image, fits an ellipse to the segmentation result, and takes the center of the fitted ellipse as the pupil center point, thereby obtaining the position of the pupil center point.
8. The gaze tracking system according to claim 1, wherein the gaze feature detection unit detects, among the mutually close and roughly equal-sized reflection spots in the dark-pupil image, the round reflection spot nearest to the pupil center, takes this round reflection spot as the corneal reflection spot, and thereby obtains the position of the reflection spot.
9. The gaze tracking system according to claim 1, wherein the gaze estimation unit computes the three-dimensional coordinates of the left and right pupil center points and of the left and right corneal surface center points, represents the left-eye and right-eye lines of sight as the lines through the pupil center points and the corneal surface center points, and determines the intersection of the left-eye and right-eye lines of sight as the gaze point of the eyes.
10. The gaze tracking system according to claim 1, wherein the left eye and the right eye each satisfy equations (1) to (8):
q_ij = o_j + k_q,ij (o_j - u_ij)    (1)
||q_ij - c|| = R1    (2)
(l_i - o_j) × (q_ij - o_j) · (c - o_j) = 0    (3)
(l_i - q_ij) · (q_ij - c) * ||o_j - q_ij|| = (o_j - q_ij) · (q_ij - c) * ||l_i - q_ij||    (4)
r_j = o_j + k_r,j (o_j - v_j)    (5)
||r_j - c|| = R1    (6)
(r_j - o_j) × (c - o_j) · (p - o_j) = 0    (7)
n_1 * ||(r_j - c) × (p - r_j)|| * ||o_j - r_j|| = n_2 * ||(r_j - c) × (o_j - r_j)|| * ||p - r_j||    (8)
where × denotes the vector cross product, · denotes the vector dot product, and * denotes scalar multiplication; q_ij denotes the corneal light-reflection point of each eye; o_j denotes the optical center of camera j; u_ij denotes the position of the reflection spot in the dark-pupil image of each eye; k_q,ij denotes the slope of the line connecting the corneal reflection point of each eye with the reflection-spot position in its dark-pupil image; c denotes the corneal surface center point of each eye; R1 denotes the radius of the sphere containing the corneal surface; l_i denotes the position of the off-axis light source; r_j denotes the corneal light-refraction point of each eye; v_j denotes the position of the pupil center point in the bright-pupil image of each eye; k_r,j denotes the slope of the line connecting the corneal refraction point of each eye with the pupil-center position in its bright-pupil image; p denotes the pupil center point of each eye; n_1 denotes the refractive index between the aqueous humor and the cornea; and n_2 denotes the refractive index between air and the cornea,
wherein the gaze estimation unit determines the gaze point of the eyes by solving equations (1) to (8) for the left eye and the right eye subject to the following constraints:
R = c_1 + k_1 (p_1 - c_1)
R = c_2 + k_2 (p_2 - c_2)
c_1 + k_1 (p_1 - c_1) = c_2 + k_2 (p_2 - c_2),
where R denotes the gaze point of the eyes; c_1 and c_2 denote the corneal surface center points of the left and right eye; p_1 and p_2 denote the pupil center points of the left and right eye; and k_1 and k_2 denote the slopes of the optical axes of the left-eye and right-eye lines of sight.
11. The gaze tracking system according to claim 10, wherein, if the number of light-source groups is N, the left eye and the right eye each satisfy N instances of equations (1) to (8), and the gaze estimation unit determines the gaze point of the eyes by solving 16*N equations in total for the left and right eyes subject to the constraints.
12. The gaze tracking system according to claim 2, wherein multiple known target points are preset on the screen watched by the user, and the gaze feature detection unit determines the positions of the known target points,
wherein the gaze point correction unit computes the error between the determined position of each known target point and its actual position, calculates a weight from the distance between the gaze point determined by the gaze estimation unit and each determined target-point position, and corrects the gaze point determined by the gaze estimation unit with an error-correction model.
13. The gaze tracking system according to claim 12, wherein the gaze point correction unit corrects the gaze point determined by the gaze estimation unit with the following equations:
P_processed = P_computed + Σ_{i=1,...,M} w_i × e_i,
e_i = (s_i - p_i),  d_i = ||P_computed - p_i||,
where M denotes the number of known target points; P_processed denotes the corrected gaze point of the eyes; P_computed denotes the gaze point determined by the gaze estimation unit; w_i denotes the weight; s_i denotes the actual position of a known target point; p_i denotes the position of the known target point as determined by the gaze estimation unit; e_i denotes the error between the actual position of a known target point and the position determined by the gaze estimation unit; and d_i denotes the distance between the gaze point determined by the gaze estimation unit and the position of the known target point.
14. A gaze tracking method, comprising the following steps:
(a) controlling multiple light-source groups corresponding to multiple cameras to emit light in turn, and capturing bright-pupil and dark-pupil images of a user with the multiple cameras;
(b) detecting the position of a pupil center point and the position of a reflection spot from the captured bright-pupil and dark-pupil images;
(c) computing the three-dimensional positions of a corneal surface center point and the pupil center point from the detected positions of the pupil center point and the reflection spot, thereby tracking the gaze direction of both eyes to determine their gaze point.
15. The gaze tracking method according to claim 14, further comprising the step of:
(d) calibrating against multiple known points on a screen watched by the user and correcting the gaze point of the eyes.
16. The gaze tracking method according to claim 14, wherein, in step (b), the position of the pupil center point is detected according to the gray-level information of the eye region in the bright-pupil image.
17. The gaze tracking method according to claim 14, wherein, in step (c), the three-dimensional coordinates of the left and right pupil center points and of the left and right corneal surface center points are computed, the left-eye and right-eye lines of sight are represented as the lines through the pupil center points and the corneal surface center points, and the intersection of the left-eye and right-eye lines of sight is determined as the gaze point of the eyes.
CN201310138541.6A 2013-04-19 2013-04-19 Gaze tracking system and method Expired - Fee Related CN104113680B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201310138541.6A CN104113680B (en) 2013-04-19 2013-04-19 Gaze tracking system and method
KR1020140015528A KR20140125713A (en) 2013-04-19 2014-02-11 Apparatus and method of gaze tracking based on camera array
US14/254,008 US20140313308A1 (en) 2013-04-19 2014-04-16 Apparatus and method for tracking gaze based on camera array

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310138541.6A CN104113680B (en) 2013-04-19 2013-04-19 Gaze tracking system and method

Publications (2)

Publication Number Publication Date
CN104113680A true CN104113680A (en) 2014-10-22
CN104113680B CN104113680B (en) 2019-06-28

Family

ID=51710306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310138541.6A Expired - Fee Related CN104113680B (en) 2013-04-19 2013-04-19 Gaze tracking system and method

Country Status (2)

Country Link
KR (1) KR20140125713A (en)
CN (1) CN104113680B (en)

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138965A (en) * 2015-07-31 2015-12-09 东南大学 Near-to-eye sight tracking method and system thereof
CN106324831A (en) * 2015-06-30 2017-01-11 汤姆逊许可公司 A gaze tracking device and a head mounted device embedding said gaze tracking device
CN106339087A (en) * 2016-08-29 2017-01-18 上海青研科技有限公司 Eyeball tracking method based on multidimensional coordinate and device thereof
CN106339085A (en) * 2016-08-22 2017-01-18 华为技术有限公司 Terminal with sight-line tracking function and method and device for determining user viewpoint
CN106547341A (en) * 2015-09-21 2017-03-29 现代自动车株式会社 The method of gaze tracker and its tracing fixation
WO2017059577A1 (en) * 2015-10-09 2017-04-13 华为技术有限公司 Eyeball tracking device and auxiliary light source control method and related device thereof
CN106843468A (en) * 2016-12-27 2017-06-13 努比亚技术有限公司 A kind of man-machine interaction method in terminal and VR scenes
CN107071267A (en) * 2017-01-19 2017-08-18 西安电子科技大学 A kind of synchronous bright dark pupil image-pickup method of two-way for gaze tracking system
CN107085680A (en) * 2017-04-18 2017-08-22 宇龙计算机通信科技(深圳)有限公司 A kind of method and device of operation terminal
CN107358217A (en) * 2017-07-21 2017-11-17 北京七鑫易维信息技术有限公司 A kind of gaze estimation method and device
CN107368774A (en) * 2016-03-31 2017-11-21 富士通株式会社 Gaze detection equipment and gaze detection method
CN107515474A (en) * 2017-09-22 2017-12-26 宁波维真显示科技股份有限公司 Autostereoscopic display method, apparatus and stereoscopic display device
CN108196676A (en) * 2018-01-02 2018-06-22 联想(北京)有限公司 Track and identify method and system
CN108334191A (en) * 2017-12-29 2018-07-27 北京七鑫易维信息技术有限公司 Based on the method and apparatus of the determination blinkpunkt of eye movement analysis equipment
CN108352133A (en) * 2015-11-05 2018-07-31 诺华股份有限公司 Phantom eye
CN109471523A (en) * 2017-09-08 2019-03-15 托比股份公司 Use the eye tracks of eyeball center
CN110096130A (en) * 2018-01-29 2019-08-06 美的集团股份有限公司 Control method and device, water heater and computer readable storage medium
CN110582781A (en) * 2018-04-11 2019-12-17 视信有限责任公司 Sight tracking system and method
CN110908511A (en) * 2019-11-08 2020-03-24 Oppo广东移动通信有限公司 Method for triggering recalibration and related device
CN110989166A (en) * 2019-12-25 2020-04-10 歌尔股份有限公司 Eyeball tracking system of near-to-eye display equipment and near-to-eye display equipment
CN111027502A (en) * 2019-12-17 2020-04-17 Oppo广东移动通信有限公司 Eye image positioning method and device, electronic equipment and computer storage medium
CN111208904A (en) * 2020-01-08 2020-05-29 北京未动科技有限公司 Sight estimation equipment performance evaluation method, system and equipment
CN111208905A (en) * 2020-01-08 2020-05-29 北京未动科技有限公司 Multi-module sight tracking method and system and sight tracking equipment
CN111309141A (en) * 2018-12-11 2020-06-19 托比股份公司 Screen estimation
CN111539984A (en) * 2018-12-21 2020-08-14 托比股份公司 Continuous calibration based on pupil characteristics
CN112099615A (en) * 2019-06-17 2020-12-18 北京七鑫易维科技有限公司 Gaze information determination method and device, eyeball tracking equipment and storage medium
CN112597972A (en) * 2021-01-27 2021-04-02 张鹏 Sight tracking device, system and method
CN112970248A (en) * 2018-11-05 2021-06-15 京瓷株式会社 Three-dimensional display device, head-up display system, moving object, and program
CN113052921A (en) * 2021-05-18 2021-06-29 北京科技大学 System calibration method of three-dimensional sight tracking system
CN113190119A (en) * 2021-05-06 2021-07-30 Tcl通讯(宁波)有限公司 Mobile terminal screen lighting control method and device, mobile terminal and storage medium
CN113318435A (en) * 2021-04-27 2021-08-31 青岛小鸟看看科技有限公司 Control method and device of handle control tracker and head-mounted display equipment
CN113827244A (en) * 2020-06-24 2021-12-24 比亚迪股份有限公司 Method, system and device for detecting and monitoring sight direction of driver
CN115840301A (en) * 2021-09-18 2023-03-24 华为技术有限公司 Lens, glasses and lens adjusting method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102347359B1 (en) * 2015-01-14 2022-01-06 삼성전자주식회사 Electronic device and method for tracking gaze in electronic device
KR20230101580A (en) * 2021-12-29 2023-07-06 삼성전자주식회사 Eye tracking method, apparatus and sensor for determining sensing coverage based on eye model

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030098954A1 * 2001-04-27 2003-05-29 International Business Machines Corporation Calibration-free eye gaze tracking
US6578962B1 * 2001-04-27 2003-06-17 International Business Machines Corporation Calibration-free eye gaze tracking
CN101699510A * 2009-09-02 2010-04-28 北京科技大学 Particle filtering-based pupil tracking method in sight tracking system
CN101901485A * 2010-08-11 2010-12-01 华中科技大学 3D free head moving type gaze tracking system
CN102496005A * 2011-12-03 2012-06-13 辽宁科锐科技有限公司 Eye characteristic-based trial auxiliary study and judging analysis system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
E.D. GUESTRIN等: "General Theory of Remote Gaze Estimation Using the Pupil Center and Corneal Reflections", 《IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING》 *
ZHIWEI ZHU等: "Novel Eye Gaze Tracking Techniques Under Natural Head Movement", 《IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING》 *


Also Published As

Publication number Publication date
KR20140125713A (en) 2014-10-29
CN104113680B (en) 2019-06-28

Similar Documents

Publication Publication Date Title
CN104113680A (en) Sight line tracking system and method
US9959678B2 (en) Face and eye tracking using facial sensors within a head-mounted display
CN109558012B (en) Eyeball tracking method and device
US11537202B2 (en) Methods for generating calibration data for head-wearable devices and eye tracking system
CN107004275B (en) Method and system for determining spatial coordinates of a 3D reconstruction of at least a part of a physical object
US10636193B1 (en) Generating graphical representation of a user's face and body using a monitoring system included on a head mounted display
JP2019519859A (en) System and method for performing gaze tracking
JP2020034919A (en) Eye tracking using structured light
CN112805659A (en) Selecting depth planes for a multi-depth plane display system by user classification
EP3252566B1 (en) Face and eye tracking and facial animation using facial sensors within a head-mounted display
CN109643152B (en) Face and eye tracking and face animation using face sensors within a head-mounted display
CN105138965A (en) Near-to-eye sight tracking method and system thereof
CN110018736A (en) The object via near-eye display interface in artificial reality enhances
CN108170279A (en) The eye of aobvious equipment is moved moves exchange method with head
CN102125422A (en) Pupil center-corneal reflection (PCCR) based sight line evaluation method in sight line tracking system
WO2019154509A1 (en) Devices, systems and methods for predicting gaze-related parameters
US20170352178A1 (en) Facial animation using facial sensors within a head-mounted display
CN105589551A (en) Eye tracking method for human-computer interaction of mobile device
JP2022538669A (en) Improved eye tracking latency
US20190196221A1 (en) System and Method of Obtaining Fit and Fabrication Measurements for Eyeglasses Using Simultaneous Localization and Mapping of Camera Images
US20220207919A1 (en) Methods, devices and systems for determining eye parameters
JP7168953B2 (en) Gaze measurement device for automatic calibration, Gaze measurement method and Gaze measurement program
CN116033864A (en) Eye tracking using non-spherical cornea models
CN114424147A (en) Determining eye rotation center using one or more eye tracking cameras
US11475592B2 (en) Systems and methods for determining an ear saddle point of a user to produce specifications to fit a wearable apparatus to the user's head

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190628