WO2012077713A1 - Gaze point detection method and gaze point detection device - Google Patents
Gaze point detection method and gaze point detection device
- Publication number
- WO2012077713A1 (PCT/JP2011/078302, JP2011078302W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cameras
- subject
- vector
- point
- gaze
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
Definitions
- the present invention relates to a gaze point detection method and a gaze point detection apparatus for detecting a gaze point of a subject on a predetermined plane based on the subject's image.
- In the conventional method, the function for calculating the subject's gaze direction from the distance between the corneal reflection and the pupil center is calibrated by having the subject gaze at a total of two points whose positions are known: the camera itself and one point on the display screen.
- The line-of-sight detection method disclosed in Patent Document 3 below detects the lines of sight of both eyes simultaneously using two cameras. Even in this method, the subject must look at the camera in order to calibrate the line-of-sight detection result.
- The reason the subject must look at the camera for this correction is that, when the subject looks at the camera, the corneal reflection image of the light source should ideally coincide with the pupil center owing to the symmetry of the eyeball around its optical axis, whereas in reality the corneal reflection image is shifted from the pupil center. For this reason, in methods that calculate the subject's gaze direction as a function of this distance, the offset must be determined for each subject by having the subject look at the camera.
- JP 2005-185431 A; JP 2005-230049 A; JP 2005-198743 A
- The present invention has been made in view of this problem, and its object is to provide a gaze point detection method and a gaze point detection apparatus capable of realizing high-speed, high-precision gaze point detection while reducing the burden on the subject.
- The gaze point detection method according to the invention comprises: a face image generation step of generating face images of a subject using N cameras (N being a natural number of 2 or more) and a plurality of light sources; a vector calculation step of calculating, for each of the N cameras, a vector r representing the actual distance from the corneal reflection point, which is the reflection point on the subject's cornea of light from the light source, to the pupil center; and a line-of-sight direction calculation step of calculating the angle θ of the subject's line of sight with respect to each of the reference lines connecting the N cameras and the pupil center by the following formula (1), using a function f containing M undetermined constants (M being a natural number of 3 or more) that include at least the vector r0, the offset vector of the vector r:
θ=f(|r-r0|) …(1)
- It further comprises an undetermined-constant determination step of determining the M undetermined constants contained in the function f using a plurality of relational expressions derived based at least on the angles θ calculated for the N cameras, and a gaze point detection step of detecting the subject's gaze point from the calculated line-of-sight directions.
- The gaze point detection device according to the invention detects the gaze point of a subject based on the subject's face image, and comprises: N cameras that acquire the face image of the subject; a plurality of light sources; a control circuit that controls the cameras and the light sources; and an image processing unit that processes the image signals output from the N cameras.
- The image processing unit calculates, for each of the N cameras, a vector r representing the actual distance from the corneal reflection point, which is the reflection point on the subject's cornea of light from the light source, to the pupil center; calculates the angle θ of the subject's line of sight with respect to each of the reference lines connecting the N cameras and the pupil center by the above formula (1), using a function f containing M undetermined constants that include at least the offset vector r0 of the vector r; determines the M undetermined constants contained in the function f using a plurality of relational expressions derived based at least on the angles θ calculated for the N cameras; and detects the subject's gaze point based on the line-of-sight directions calculated by formula (1) using the M undetermined constants. The number N of cameras is set to M × 1/2 or more.
- According to this method and device, face images of the subject are generated by the N cameras and the plurality of light sources, and based on each face image the vector r from the subject's corneal reflection point to the pupil center is calculated for each of the N cameras. By applying each vector r to the function f containing the M undetermined constants, including the offset vector r0, the angle θ of the subject's line of sight with respect to the reference line is calculated for each of the N cameras.
- A plurality of relational expressions is then derived from the angles θ calculated in this way, and because the number of cameras is set to M × 1/2 or more, these relational expressions suffice to determine the M undetermined constants contained in the function f.
- the direction of the line of sight and the gazing point are detected from the face image of the subject using the determined function f.
- As a result, the automatic correction of the function for calculating the line-of-sight direction is performed with high accuracy, without having the subject gaze at a plurality of specified points and without having the subject gaze at the camera opening.
- the burden on the subject is reduced, and high-speed and high-precision gaze point detection is possible.
- the gaze point detection method and gaze point detection apparatus can realize high-speed and high-precision gaze point detection while reducing the burden on the subject.
- FIG. 1 is a perspective view showing a gaze point detection device 1 according to a preferred embodiment of the present invention.
- FIG. 2 is a plan view of the light source attached to the camera opening of FIG. 1.
- FIG. 3 is a diagram showing the positional relationship of the coordinate systems set in the gaze point detection device of FIG. 1.
- FIGS. 4 and 5 are diagrams for explaining the gaze point detection procedure performed by the gaze point detection device of FIG. 1.
- FIGS. 6(a) and 6(b) are diagrams showing the vector r observed on a camera image, and FIG. 6(c) is a diagram showing the gaze point T on the virtual viewpoint plane.
- FIG. 7 is a diagram showing the vector r observed on an image acquired by the camera of FIG. 1.
- FIG. 8 is a diagram in which the points O1, O2, and GS projected on the virtual viewpoint spherical surface S shown in FIG. 5 are further projected onto a plane.
- FIG. 9 is a diagram in which the points O1, O2, O3, and GS projected on the virtual viewpoint spherical surface S shown in FIG. 5 are further projected onto a plane.
- FIG. 10 is a diagram showing the angle θi as a vector in the projection onto a plane of the virtual viewpoint spherical surface S shown in FIG. 8.
- FIG. 11 is a diagram illustrating the vectors ri and ri′ detected on the camera images of the cameras of FIG. 1.
- FIG. 12 is a diagram showing the positional relationship between the left and right pupils of subject A and the gaze point Q on the screen of the display device 8.
- the gaze point detection device of the present invention is a device that detects a gaze point on a monitor screen of an information processing terminal such as a personal computer based on a face image of a subject.
- FIG. 1 is a perspective view showing a gazing point detection apparatus 1 according to a preferred embodiment of the present invention.
- The gaze point detection device 1 includes four cameras 2a, 2b, 2c, and 2d that capture face images of subject A; light sources 3a, 3b, 3c, and 3d attached to the openings of the respective cameras 2a, 2b, 2c, and 2d; a light-emitting circuit (control circuit) 4 that drives the light sources 3a, 3b, 3c, and 3d; a synchronization signal generator (control circuit) 5 that generates a synchronization signal fed to the cameras 2a, 2b, 2c, and 2d; a delay circuit (control circuit) 6 that delays the synchronization signal; an image processing device (image processing unit) 7, such as a personal computer, that processes the image signals generated by the cameras 2a, 2b, 2c, and 2d; and a display device 8 arranged above the cameras 2a and 2b and between the cameras 2c and 2d so as to face subject A, connected to the image processing device 7.
- the light emitting circuit 4, the synchronization signal generator 5, and the delay circuit 6 constitute a control circuit for controlling the operations of the cameras 2a, 2b, 2c, 2d and the light sources 3a, 3b, 3c, 3d.
- the cameras 2a, 2b, 2c, and 2d generate image data by capturing the face of the subject A.
- NTSC cameras, which use one of the interlaced scanning systems, are employed.
- One frame of image data, obtained at 30 frames per second, is composed of an odd field consisting of the odd-numbered horizontal pixel lines and an even field consisting of the remaining even-numbered horizontal pixel lines; the odd-field and even-field images are captured alternately at 1/60-second intervals. That is, within one frame, the pixel lines of the odd field and those of the even field are generated alternately.
- Each of the four cameras 2a, 2b, 2c, and 2d receives a vertical synchronization signal (VD signal) from the synchronization signal generator 5, delayed via the delay circuit 6, so that the shooting timings of the four cameras are shifted from one another.
- FIG. 2 shows a plan view of the light sources 3a, 3b, 3c, and 3d.
- The light sources 3a, 3b, 3c, and 3d irradiate illumination light toward the face of subject A, and each has a structure in which a plurality of light emitting elements of two types, 11 and 12, is embedded in a ring-shaped pedestal portion 10.
- The light emitting elements 11 are semiconductor light emitting elements (LEDs) with an output center wavelength of 850 nm, arranged in a ring at regular intervals along the edges of the openings 9a, 9b, 9c, and 9d on the pedestal portion 10.
- The light emitting elements 12 are semiconductor light emitting elements with an output center wavelength of 950 nm, arranged on the pedestal portion 10 in a ring at equal intervals adjacent to and outside the elements 11; that is, the distance of the elements 12 from the optical axes of the cameras 2a, 2b, 2c, and 2d is set larger than that of the elements 11.
- The light emitting elements 11 and 12 are mounted on the pedestal portion 10 so as to emit illumination light along the optical axes of the cameras 2a, 2b, 2c, and 2d.
- the arrangement of the light sources is not limited to the above-described configuration, and other arrangement configurations may be adopted as long as the camera can be regarded as a pinhole model.
- The emission timing of the light emitting elements 11 and 12 can be controlled independently by the light emitting circuit 4. Specifically, the emission timing is controlled so that the elements 11 and 12 emit light alternately in accordance with the shutter timing of the cameras 2a, 2b, 2c, and 2d, which is synchronized with the VD signal output from the synchronization signal generator 5.
- The four sets of light emitting elements 11 and the four sets of light emitting elements 12 are lit alternately in synchronization with the shooting timings of the odd and even fields of the cameras 2a, 2b, 2c, and 2d, respectively.
- As a result, a bright pupil image and a dark pupil image of the eyeball B appear in the odd and even fields generated by the cameras 2a, 2b, 2c, and 2d, respectively.
- The image processing device 7 processes the image data output from the four cameras 2a, 2b, 2c, and 2d. Specifically, it separates each frame of image data output from the cameras into an odd field and an even field; the odd-field image data (odd image data) give a bright pupil image, and the even-field image data (even image data) give a dark pupil image. Since these image data have valid pixels only in the odd or even field, the image processing device 7 interpolates the missing lines with the luminance average of the adjacent valid pixel lines, thereby generating bright pupil image data and dark pupil image data.
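- To make the field separation concrete, the following Python sketch splits one interlaced frame into bright- and dark-pupil images and interpolates the missing lines; the field-to-image assignment and line parity here are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def split_fields(frame: np.ndarray):
    """Split one interlaced NTSC frame into a bright-pupil image (odd
    field) and a dark-pupil image (even field), filling each field's
    missing lines with the average of the adjacent valid lines.
    Line parity is a convention here; swap it if your capture differs."""
    frame = frame.astype(np.float32)
    bright, dark = frame.copy(), frame.copy()
    h = frame.shape[0]
    for y in range(h):
        prev_line = frame[y - 1] if y > 0 else frame[y + 1]
        next_line = frame[y + 1] if y < h - 1 else frame[y - 1]
        interp = (prev_line + next_line) / 2.0
        if y % 2 == 0:
            bright[y] = interp   # even line is missing from the odd field
        else:
            dark[y] = interp     # odd line is missing from the even field
    return bright, dark
```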
- Using the bright pupil image data and the dark pupil image data, the image processing device 7 repeatedly detects the left and right pupils of subject A. That is, it generates a difference image between the bright pupil image data and the dark pupil image data, sets a window based on the pupil position detected in the previous pupil detection process, and searches for the pupil within that window. Specifically, the image processing device 7 binarizes the difference image with a threshold determined by the P-tile method, then performs isolated-point removal and labeling.
- From the labeled connected components, pupil candidates are selected based on shape parameters likely to indicate pupils, such as area, size, area ratio, squareness, and pupil feature amount. Further, among the connected components of the selected pupil candidates, the pair of candidates having a predetermined positional relationship is determined to be the left and right pupils, and the center coordinates of the left and right pupils in the image data are calculated.
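- As a rough illustration of this difference-image pupil search, the sketch below combines a P-tile threshold, isolated-point removal, and connected-component filtering; the OpenCV pipeline and the area and squareness bounds are illustrative stand-ins, not the patent's actual values.

```python
import cv2
import numpy as np

def ptile_threshold(img: np.ndarray, p: float) -> float:
    """P-tile method: pick the threshold so that roughly the brightest
    fraction p of the pixels becomes foreground."""
    return float(np.percentile(img, 100.0 * (1.0 - p)))

def pupil_candidates(bright: np.ndarray, dark: np.ndarray, p: float = 0.001):
    """Difference-image pupil search: binarize by the P-tile threshold,
    remove isolated points, label connected components, and filter by
    shape.  The area and squareness bounds are illustrative values."""
    diff = cv2.subtract(bright.astype(np.uint8), dark.astype(np.uint8))
    _, bw = cv2.threshold(diff, ptile_threshold(diff, p), 255, cv2.THRESH_BINARY)
    bw = cv2.morphologyEx(bw, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    n, _, stats, centroids = cv2.connectedComponentsWithStats(bw)
    found = []
    for i in range(1, n):  # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        w, h = stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]
        if 20 < area < 500 and min(w, h) / max(w, h) > 0.5:
            found.append(tuple(centroids[i]))
    return found  # pair left/right pupils by their geometric relationship
```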
- The image processing device 7 also detects, in both the bright pupil image data and the dark pupil image data, the position of the corneal reflection point, the point on the cornea of each of subject A's eyes at which light from the light source is reflected. That is, it sets a window centered on the detected pupil, creates image data in which only the window range is increased in resolution, and detects the corneal reflection from those data. Specifically, a binarization threshold is determined by the P-tile method, a binarized image is created, labeling is performed, and the portions whose area is at most a certain value are selected.
- The image processing device 7 then applies a separability filter at the center coordinates of each selected portion and obtains a feature amount as the product of separability and luminance; if this value is at or below a certain value, the portion is judged not to be a corneal reflection. Further, the image processing device 7 calculates the amount by which the corneal reflection moves between the bright pupil image data and the dark pupil image data and uses it as a difference position correction amount: it shifts one image by this amount so that the corneal reflection positions of the bright and dark pupil image data coincide, adds the luminances of the two images, and determines the luminance centroid coordinates as the corneal reflection coordinates.
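- The luminance-centroid part of this corneal reflection detection could be sketched as follows; the separability filter and the bright/dark position alignment are omitted, so this is only a simplified illustration.

```python
import numpy as np

def corneal_reflection_centroid(window: np.ndarray, p: float = 0.0005):
    """Luminance-centroid estimate of the corneal reflection inside a
    small window centred on the detected pupil.  A sketch only: the
    patent additionally uses a separability filter and an area check
    to reject false glints, which are omitted here."""
    t = np.percentile(window, 100.0 * (1.0 - p))
    ys, xs = np.nonzero(window > t)
    w = window[ys, xs].astype(np.float64)
    return (float((xs * w).sum() / w.sum()),   # x of the glint
            float((ys * w).sum() / w.sum()))   # y of the glint
```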
- the image processing device 7 calculates the three-dimensional positions of the left and right pupils of the subject A from the pupil center coordinates detected based on the image data output from the two cameras 2a and 2b. At this time, the image processing device 7 measures the three-dimensional coordinates of the left and right pupils by a stereo method.
- In the stereo method, internal parameters such as the focal length of the camera lens, the image center, and the pixel size, and external parameters such as the camera positions and orientations, are measured in advance; when an object is photographed by the multiple stereo-calibrated cameras, the position of a point in space is determined from the coordinates of that point in the images, using the internal and external parameters.
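- A minimal sketch of such a stereo measurement, assuming the projection matrices from the prior camera calibration are available, might use OpenCV's triangulation routine:

```python
import cv2
import numpy as np

def triangulate_point(P1: np.ndarray, P2: np.ndarray, uv1, uv2) -> np.ndarray:
    """Stereo measurement of one point (here the pupil centre) from two
    calibrated cameras.  P1 and P2 are the 3x4 projection matrices
    (intrinsics times extrinsics) from the prior calibration; uv1 and
    uv2 are the undistorted image coordinates of the point."""
    x1 = np.asarray(uv1, dtype=np.float64).reshape(2, 1)
    x2 = np.asarray(uv2, dtype=np.float64).reshape(2, 1)
    X = cv2.triangulatePoints(P1, P2, x1, x2)   # homogeneous 4x1 result
    return X[:3, 0] / X[3, 0]                   # world coordinates (X, Y, Z)
```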
- a coordinate system as shown in FIG. 3 is used.
- The world coordinate system (XW, YW, ZW) shown in FIG. 3 is shared by the two cameras 2a and 2b, with its origin OW located, for example, at the center of the screen of the display device 8. The camera coordinate system (X, Y, Z) has its origin C at the optical center of the cameras 2a and 2b, with the Z axis parallel to the optical axis drawn perpendicular to the image plane from the optical center.
- The image coordinate system (XG, YG) lies along the image plane on which the image sensor is placed, parallel to the X-Y plane, with its origin Ci at the intersection of the optical axis and the image plane (the image center).
- Owing to image distortion, the actual projection point (Xd, Yd) in the image coordinate system deviates from the ideal projection point (Xu, Yu). Therefore, to perform three-dimensional position measurement accurately by the stereo method, calibration data recording the correspondence between the world coordinates of a target point P and its image coordinates must be acquired in advance.
- As external parameters, the translation vector and rotation matrix of the camera coordinate system with respect to the world coordinate system, and as internal parameters, the focal length, image center coordinates, scale coefficient, lens distortion coefficients, and image sensor spacing, are acquired in advance and stored in the image processing device 7.
- With reference to the calibration data, the image processing device 7 obtains, for each of the two cameras 2a and 2b, a relational expression between the pupil center coordinates detected in the image coordinate system and the pupil center coordinates in the world coordinate system.
- From these two relational expressions, the image processing device 7 obtains the three-dimensional position coordinates of the pupil of subject A in the world coordinate system.
- the image processing apparatus 7 can obtain the three-dimensional positions of the left and right pupils of the subject A.
- the image processing device 7 detects the gaze point Q of the subject on the display device 8 using the position of the left or right corneal reflection point of the subject A and the position of the pupil center.
- a procedure for detecting the gazing point Q by the image processing apparatus 7 will be described with reference to FIGS. 4 and 5.
- a gaze point detection procedure using only the camera images of the cameras 2a and 2b will be described.
- As shown in FIG. 4, based on the detected three-dimensional pupil position P, the center of the openings 9a and 9b of the cameras 2a and 2b is taken as the origin O, and a virtual viewpoint plane X′-Y′ is set whose normal is the reference line OP connecting the origin O and the pupil P.
- The X′ axis corresponds to the line of intersection between the XW-YW plane of the world coordinate system and the virtual viewpoint plane X′-Y′.
- First, the image processing device 7 calculates the vector rG from the corneal reflection point G to the pupil center P on the image plane SG. The vector rG is then converted into a vector r at actual size, using the camera magnification obtained from the distance OP (vector calculation step).
- At this time, each camera 2a, 2b is regarded as a pinhole model, and the corneal reflection point G and the pupil center P are assumed to lie on a plane parallel to the virtual viewpoint plane X′-Y′. That is, the image processing device 7 calculates the relative coordinates of the pupil center P and the corneal reflection point G as the vector r on the plane that is parallel to the virtual viewpoint plane and contains the three-dimensional coordinates of the pupil P; the vector r represents the actual distance from the corneal reflection point G to the pupil center P.
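- A small sketch of this vector calculation step, under the pinhole-model assumption and with illustrative parameter names (pixel pitch, focal length, and the distance OP are assumptions, not the patent's notation), might look like this:

```python
import numpy as np

def actual_size_vector(pupil_px, glint_px, pixel_pitch_mm, focal_mm, distance_op_mm):
    """Scale the image-plane vector from corneal reflection G to pupil
    centre P up to its actual size near the pupil, using the pinhole
    magnification m = focal length / object distance.  Units are mm."""
    r_g = (np.asarray(pupil_px, float) - np.asarray(glint_px, float)) * pixel_pitch_mm
    m = focal_mm / distance_op_mm        # pinhole-model magnification
    return r_g / m                       # actual-size vector r
```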
- The angles θ and φ are calculated on the assumption that the vector r on the plane containing the pupil center P, when magnified onto the virtual viewpoint plane, corresponds directly to the subject's gaze point. More specifically, the angle θ of subject A's line of sight PT with respect to the reference line OP is assumed to have a linear relationship with the corrected corneal reflection-pupil center distance |r - r0|.
- The origin correction vector r0 is included in the function f1 because the actual corneal reflection-pupil center vector when subject A looks at the camera (θ = 0) is not zero.
- The gain value k and the origin correction vector r0 differ between subjects A and between the left and right eyeballs, and therefore need to be calibrated. Accordingly, the values used for k and r0 are preset initial values corrected by the parameter correction processing described later.
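- Equation (3) itself is straightforward to express in code; the following sketch assumes calibrated values of k and r0 are already available:

```python
import numpy as np

def gaze_angle(r, k, r0):
    """Equation (3): theta = k * |r - r0|, with the per-subject gain k
    and origin correction vector r0 coming from the calibration step."""
    return k * float(np.linalg.norm(np.asarray(r, float) - np.asarray(r0, float)))

def gaze_azimuth(r, r0):
    """Inclination phi of the corrected vector r' = r - r0 on the image
    plane, measured from the horizontal axis."""
    dx, dy = np.asarray(r, float) - np.asarray(r0, float)
    return float(np.arctan2(dy, dx))
```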
- Next, referring to θ1, θ2, φ1, and φ2, the angles θ and φ calculated from the camera images of the two cameras 2a and 2b, the image processing device 7 detects subject A's gaze point on the screen of the display device 8 (gaze point detection step).
- a coordinate system as shown in FIG. 5 is defined.
- Two virtual viewpoint planes H1 and H2, whose origins O1′ and O2′ correspond to the positions of the two cameras 2b and 2a, and a virtual viewpoint sphere S of arbitrary radius centered on the pupil center P, are defined. The two virtual viewpoint planes H1 and H2 are perpendicular to the straight lines PO1′ and PO2′, respectively.
- Let GS be the intersection of the virtual viewpoint sphere S with the straight line (the line of sight) passing through the pupil center P and the gaze point Q on the display screen; let O1 be the intersection of S with the straight line through the pupil center P and the origin O1′; and let O2 be the intersection of S with the straight line through the pupil center P and the origin O2′.
- The angle formed by the straight line O1′G1 and the horizontal axis of the virtual viewpoint plane H1 is φ1, and the angle formed by the straight line O2′G2 and the horizontal axis of the virtual viewpoint plane H2 is φ2.
- On the sphere, the angle formed at the point O1 between the curve O1GS and the intersection curve of the sphere S with the horizontal plane passing through O1 is equal to the angle φ1; likewise, the angle formed at the point O2 between the curve O2GS and the intersection curve of the sphere S with the horizontal plane passing through O2 is equal to the angle φ2.
- Since the points P, O1, and O1′ lie on the same straight line L1, and the points P, O2, and O2′ lie on the same straight line L2, the angle formed by the straight line L1 and the line of sight is θ1, and the angle formed by the straight line L2 and the line of sight is θ2.
- Using the relationships above, and referring to the previously known position coordinates of the origins O1′ and O2′ and to the position and orientation data of the screen of the display device 8, the image processing device 7 can calculate the gaze point. That is, from the angles φ1, φ2, θ1, and θ2 calculated from the camera images of the two cameras 2a and 2b, the relative positional relationship of the points GS, O1, and O2 on the virtual viewpoint sphere S can be obtained.
- From the known coordinates of the origins O1′ and O2′ and the already calculated coordinates of the pupil center P, the image processing device 7 can therefore uniquely determine the line of sight PGS, and it detects the gaze point Q by calculating the intersection of the line of sight PGS with the screen of the display device 8.
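- The final intersection with the display screen is a standard line-plane computation; the sketch below assumes the line of sight and the screen pose are already expressed in world coordinates:

```python
import numpy as np

def gaze_point_on_screen(pupil, gaze_dir, screen_point, screen_normal):
    """Final step of the gaze point detection step: intersect the line
    of sight (pupil centre and unit direction, world coordinates) with
    the display plane given by one point on it and its normal."""
    pupil, gaze_dir = np.asarray(pupil, float), np.asarray(gaze_dir, float)
    denom = float(np.dot(screen_normal, gaze_dir))
    if abs(denom) < 1e-9:
        raise ValueError("line of sight is parallel to the screen")
    t = float(np.dot(screen_normal, np.asarray(screen_point, float) - pupil)) / denom
    return pupil + t * gaze_dir          # gaze point Q in world coordinates
```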
- The function f1 used by the image processing device 7 in the line-of-sight direction calculation step contains the gain value k and the origin correction vector r0 as parameters.
- The gain value k is the magnification used to obtain the angle θ from the vector r, on the assumption that the magnitude of the vector (r - r0), i.e. the corneal reflection-pupil center vector r after origin correction, has a linear relationship with the angle θ indicating the line-of-sight direction. Ideally, the angle θ would simply be proportional to |r - r0| with the same constant for every subject; in reality the relationship differs from subject to subject.
- FIG. 6(c) shows the gaze point T on the virtual viewpoint plane containing the camera position O′, while FIGS. 6(a) and 6(b) and FIGS. 7(a) and 7(b) show the vector r observed on the image acquired by the camera.
- The length of the line segment O′T on the virtual viewpoint plane can be calculated from the angle θ and the distance between the pupil center P and the camera position O′. In addition, the relationship φ′ ≈ φ holds between the angle φ′ on the camera image and the angle φ on the virtual viewpoint plane.
- When the subject looks at the camera, the pupil center P and the corneal reflection point G in the camera image do not coincide; the corneal reflection point G is located to the lower right of the pupil center P.
- When the subject looks away from the camera, the position of the pupil center P deviates further from the corneal reflection G.
- A coordinate system whose origin is the pupil center P when subject A is looking at the camera is indicated by a dotted line; the vector r0 in this coordinate system is the origin correction vector.
- Since the parameters k and r0 described above differ depending on subject A, they are undetermined constants at the initial stage after the apparatus is started, and they must be determined in advance by parameter calibration in order to detect the line of sight accurately.
- a parameter correction procedure executed by the image processing apparatus 7 before the gazing point detection process will be described.
- FIG. 8 is a diagram in which the points O 1 , O 2 , G S projected on the virtual viewpoint spherical surface S shown in FIG. 5 are further projected on the plane.
- The vectors r1 and r2 are the actual-size corneal reflection-pupil center vectors calculated from the images captured by the cameras 2a and 2b when the subject looks in the direction of the point GS on the virtual viewpoint sphere S.
- The line-of-sight angles θ1 and θ2 can be calculated from the specified point; substituting them together with the vectors r1 and r2 into the above equation (6), which comprises two relational expressions, leaves the three unknown parameters k, x0, and y0 in only two relational expressions. To obtain these unknown parameters, therefore, at least three relational expressions must be established.
- By adding a third relational expression from a third camera and solving the resulting simultaneous equations, the image processing device 7 can calculate the parameters k, x0, and y0 and store them as correction values. As shown in FIG. 1, the gaze point detection device 1 is provided with four sets of cameras 2a, 2b, 2c, and 2d and light sources 3a, 3b, 3c, and 3d, but at least three sets suffice to realize the parameter correction processing.
- the parameter correction process as described above is executed for 30 frames of a camera image within a period of about 1 second, and a value obtained by averaging the parameters calculated for each frame is stored as a correction value.
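- The simultaneous equations for k, x0, and y0 could, for instance, be solved numerically; the following sketch uses least squares over the relational expressions of three or more cameras rather than an explicit algebraic solution, which is an implementation choice, not the patent's:

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_k_r0(theta, r):
    """Determine (k, x0, y0) from theta_i = k * |r_i - r0|, i = 1..N,
    N >= 3, where theta are the angles known from the geometry of the
    single specified point and r are the measured corneal reflection-
    pupil centre vectors (one 2-D vector per camera)."""
    theta, r = np.asarray(theta, float), np.asarray(r, float)
    def residual(p):
        k, x0, y0 = p
        return k * np.linalg.norm(r - np.array([x0, y0]), axis=1) - theta
    sol = least_squares(residual, x0=np.array([1.0, 0.0, 0.0]))
    return sol.x  # (k, x0, y0)
```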
- Alternatively, a function f2 in which the relationship between the vector |r′| and the angle θ is nonlinear may be used.
- The linearity of equation (3) holds up to a line-of-sight angle θ of about 20 degrees, but at around 30 degrees the nonlinearity becomes strong for many subjects A.
- In this case, the image processing device 7 has four unknown parameters, k, h, x0, and y0, so four or more relational expressions are required to realize the parameter correction processing.
- The image processing device 7 may also use an expression including a plurality of other nonlinear terms, such as the square of |r′|, and the exponent of the nonlinear term itself may be treated as an undetermined parameter. In such cases as well, the number of cameras of the gaze point detection device 1 is set to a predetermined number or more so that more relational expressions than the number of parameters requiring correction are derived.
- Furthermore, it is preferable to have the subject gaze at a specified point whose distances from the respective cameras differ from one another, rather than at a point equidistant from the cameras. For example, when the subject gazes at the right end of the display screen, the distances from the cameras differ from each other, so the nonlinear parameters are obtained accurately and the calibration accuracy is further improved.
- In the above parameter correction procedure, the angle θi was used as a scalar quantity, but the angle θi may also be treated as a vector, as described below.
- In FIG. 10, the angle θi is shown as a vector in the projection onto a plane of the virtual viewpoint sphere S shown in FIG. 8.
- In this formulation, the origin correction vectors r10 and r20 are expressed by the following equations (11) and (12).
- Against the four relational expressions obtained by considering the real and imaginary components contained in equations (11) and (12), the unknown parameters are three: the two components of the origin correction vector r0 and the reciprocal s of the gain value k.
- The image processing device 7 can therefore calculate the three parameters and determine the correction values based on camera images obtained by at least two cameras when subject A gazes at a single specified point.
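- A sketch of this vector-form correction for the linear case, treating the two-dimensional vectors as complex numbers with s = 1/k, might look like this (the derivation of the θi vectors themselves is assumed already done):

```python
def calibrate_vector_form(r1: complex, r2: complex, t1: complex, t2: complex):
    """Vector form of the correction with 2-D vectors written as complex
    numbers.  With s = 1/k, each camera gives r0 = r_i - s * theta_i, so
    equating the two origin corrections, r1 - s*t1 = r2 - s*t2, fixes the
    real scalar s; r0 then follows.  A sketch of the linear case only."""
    d_r, d_t = r1 - r2, t1 - t2
    s = (d_r * d_t.conjugate()).real / abs(d_t) ** 2   # real least-squares fit
    r0 = r1 - s * t1
    return s, r0
```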
- The image processing device 7 may also use a function in which the relationship between the vector |ri′| and the angle θi is nonlinear, for example |ri′| = s|θi| - t|θi|⁴; considering the real and imaginary components then yields four relational expressions against four unknown parameters (the two components of r0 and the parameters s and t), which can likewise be determined from the images of at least two cameras.
- In general, when the number of undetermined constants is M, the number of cameras must be at least M × 1/2 (rounded up).
- As described above, the gaze point detection device 1 is provided with the four cameras 2a, 2b, 2c, and 2d and with the light sources 3a, 3b, 3c, and 3d outside their openings.
- Relational expressions exceeding the number of unknown parameters are derived from the angles θi calculated in this way, and the parameters contained in the function are corrected using these relational expressions. The direction of the line of sight and the gaze point Q are then detected from the face image of subject A using the corrected function.
- As a result, the automatic correction of the function for calculating the line-of-sight direction is performed with high accuracy, without having subject A gaze at a plurality of specified points and without having the subject gaze at the camera opening.
- Thereby, the burden on the subject is reduced, and high-speed, high-precision gaze point detection becomes possible.
- Moreover, since the function is corrected including the origin correction vector r0, the gain value k is consequently also obtained accurately, and the line-of-sight angle θi can be calculated more accurately over the entire display screen on which the viewpoint is detected.
- If, by contrast, the correction value of the gain value k contained a large error, the deviation between the point at which subject A is actually gazing and the detected gaze point position (the gaze point detection error) would change depending on the index position, and subsequent re-correction would become difficult.
- In addition, the burden on subject A is reduced because it is not necessary to view the camera opening as the specified point, and the viewpoint detection error is reduced over the entire display screen.
- The gaze point detection device 1 and gaze point detection method of the present embodiment are also applicable to a system that supports autism diagnosis of a subject by determining the ratio between the time the subject spends looking at the eyes of a person facing them, or of a person shown on a display, and the time spent looking elsewhere.
- the present invention is not limited to the embodiment described above.
- various other modifications can be adopted as the parameter correction procedure.
- For example, by the following parameter calibration procedure, the image processing device 7 can also complete the parameter calibration while the subject looks at an arbitrary position, without having subject A gaze at a predetermined specified point.
- FIG. 12 is a diagram showing the positional relationship between the left and right pupils of the subject A and the gazing point Q on the screen of the display device 8 for explaining the parameter calibration procedure at this time.
- PL and PR denote the center coordinates of the left and right pupils, respectively.
- The gaze point Q is assumed to be the point viewed in common by both eyes of subject A; that is, the straight lines PLQ and PRQ represent the visual axes of the left and right eyes.
- The points G1′L, G2′L and G1′R, G2′R on the straight lines PLQ and PRQ represent the intersections of those straight lines with the two virtual viewpoint planes corresponding to the positions O1′ and O2′ of the cameras 2b and 2a.
- Let the line-of-sight angles θ detected for the left and right eyeballs of subject A be θ1L, θ1R, θ2L, and θ2R, and let the gain values k and origin correction vectors r0 corresponding to the left and right eyeballs be kL, kR and r0L, r0R.
- The image processing device 7 can calculate the angle θ1R using the following equation (20). The angle θ1R is also expressed by the following equation (21), using the inner product of the vectors PRQ and PRO1′; from equations (20) and (21), the relationship of equation (22) can be derived. Similarly, equations (23) to (25) are derived for the angles θ1L, θ2R, and θ2L.
- The image processing device 7 then applies the constraint condition that the intersection points Q of the lines of sight of the left and right eyes of subject A on the display screen coincide, using the camera images obtained by the four cameras 2a, 2b, 2c, and 2d.
- In this way, the image processing device 7 can calculate the eight parameters based on camera images obtained by at least two cameras and determine them as correction values.
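- A heavily hedged sketch of this constraint-based calibration is given below; the geometric angle computations of equations (20) to (25) are abstracted behind callables, so only the overall least-squares structure is illustrated, not the patent's explicit derivation:

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_binocular(r_R, r_L, angle_R, angle_L, n_cams):
    """Calibration without any specified point, using the constraint that
    the left and right lines of sight cross the screen at the same Q.
    Unknowns (8): s_R, s_L (reciprocal gains), the two components each of
    r0_R and r0_L, and the 2-D screen coordinates of Q.  angle_R and
    angle_L are callables giving, for camera i and a candidate Q, the
    geometric angle theta; they stand in for equations (20)-(25)."""
    r_R, r_L = np.asarray(r_R, float), np.asarray(r_L, float)
    def residual(p):
        sR, sL, x0R, y0R, x0L, y0L, qx, qy = p
        q = np.array([qx, qy])
        res = []
        for i in range(n_cams):   # |r_i - r0| = s * theta_i for each eye
            res.append(np.linalg.norm(r_R[i] - [x0R, y0R]) - sR * angle_R(i, q))
            res.append(np.linalg.norm(r_L[i] - [x0L, y0L]) - sL * angle_L(i, q))
        return res
    return least_squares(residual, x0=np.full(8, 0.1)).x
```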
- The image processing device 7 may also use a function in which the relationship between the vector |ri′| and the angle θi is nonlinear; in that case, using relational expressions similar to equations (15) to (17), the parameter correction can still be performed with three cameras even if the number of undetermined parameters increases by two.
- As the cameras 2a, 2b, 2c, and 2d, digital cameras such as CCD cameras or CMOS cameras may be used.
- In this case, since the function f is determined including the origin correction, the coefficient k is consequently also obtained accurately, and the line-of-sight angle θ can be calculated more accurately over the entire surface on which the viewpoint is detected.
- In the line-of-sight direction calculation step, the inclination φ of the vector r after correction by the vector r0 on the face images of the N cameras may further be calculated, and in the undetermined-constant determination step the M undetermined constants may be determined using a plurality of relational expressions derived based at least on the inclination φ and the angle θ. In this case, because the function f is corrected based on the angle of the line of sight along the image plane of the camera, line-of-sight calibration is realized with fewer cameras while the accuracy of the corrected line-of-sight direction is maintained.
- Further, in the line-of-sight direction calculation step, the angles θ corresponding to the N cameras when the subject gazes at a specified point on the predetermined plane may be calculated, and in the undetermined-constant determination step the M undetermined constants may be determined based on the position of the specified point and the angles θ.
- This makes it possible to determine the M undetermined constants by having the subject gaze at a single specified point on the predetermined screen, reducing the burden on the subject during the calibration process and enabling immediate gaze point detection.
- Also, in the vector calculation step, vectors rR and rL from the corneal reflection point to the pupil center of each of the subject's left and right eyes may be calculated based on the face images from the N cameras; in the line-of-sight direction calculation step, the angles θR and θL of the lines of sight of the left and right eyes with respect to the reference lines of the N cameras may be calculated from the respective vectors rR and rL using the function f; and in the undetermined-constant determination step, the M undetermined constants may be determined under the condition that the intersections of the lines of sight of the left and right eyes with the predetermined plane coincide.
- In this case, the function f can be corrected automatically without having the subject gaze at any specified point, further reducing the burden on the subject during the calibration process.
- The present invention relates to a gaze point detection method and a gaze point detection device that detect a subject's gaze point on a predetermined plane based on an image of the subject, and makes it possible to realize high-speed, high-precision gaze point detection while reducing the burden on the subject.
Abstract
Description
θ=f(|r-r0|) …(1)
a line-of-sight direction calculation step of calculating by the above equation (1); an undetermined-constant determination step of determining the M undetermined constants contained in the function f using a plurality of relational expressions derived based at least on the angles θ calculated for the N cameras; and a gaze point detection step of detecting the subject's gaze point based on the line-of-sight direction calculated in the line-of-sight direction calculation step using the M undetermined constants determined in the undetermined-constant determination step, wherein the number N of cameras is set to M × 1/2 or more.
θ=f(|r-r0|) …(1)
The device likewise calculates the angle θ by the above equation (1), determines the M undetermined constants contained in the function f using a plurality of relational expressions derived based at least on the angles θ calculated for the N cameras, and detects the subject's gaze point based on the line-of-sight direction calculated by equation (1) using the determined M undetermined constants, the number N of cameras being set to M × 1/2 or more.
First, the configuration of a gaze point detection device for carrying out the gaze point detection method according to the present invention will be described with reference to the drawings. The gaze point detection device of the present invention detects a gaze point on the monitor screen of an information processing terminal such as a personal computer based on a subject's face image.
Here, as shown in FIG. 4, based on the detected three-dimensional pupil position P, the center of the openings 9a, 9b of the cameras 2a, 2b is taken as the origin O, and a virtual viewpoint plane X′-Y′ is set whose normal is the reference line OP connecting the origin O and the pupil P. The X′ axis corresponds to the line of intersection between the XW-YW plane of the world coordinate system and the virtual viewpoint plane X′-Y′.
θ=f1(r)=k×|r-r0| …(3)
is used for the calculation (line-of-sight direction calculation step).
r’=r-r0 …(4)
The vector r′ is obtained by equation (4) (FIG. 7(a)). Applying the gain value k to the obtained vector r′ then yields the angle θ correctly, and the angle φ can also be obtained from the vector r′. This vector r0 is the origin correction vector.
FIG. 8 is a diagram in which the points O1, O2, GS projected on the virtual viewpoint sphere S shown in FIG. 5 are further projected onto a plane; θ1 and θ2, shown as vectors in the figure, indicate the respective line-of-sight angles. Here, the vectors r1 and r2 are the actual-size corneal reflection-pupil center vectors calculated from the images captured by the cameras 2a and 2b when the subject looks in the direction of the point GS on the virtual viewpoint sphere S. Letting r0 = (x0, y0) be the corneal reflection-pupil center vector when subject A looks at the camera-direction points O1, O2, O1′, O2′, the origin-corrected vectors r1′, r2′ of the vectors r1, r2 are given by the following equation (5):
r1’=r1-r0
r2’=r2-r0 …(5)
Furthermore, from equation (3), the relationships of the following equation (6) are also obtained:
θ1=k|r1-r0|
θ2=k|r2-r0| …(6)
θ3=k|r3-r0| …(7)
Substituting into equation (7) gives the third relational expression. By setting up simultaneous equations from the three relational expressions, the image processing device 7 can then calculate the parameters k, x0, y0 and store them as correction values. As shown in FIG. 1, the gaze point detection device 1 is provided with four sets of cameras 2a, 2b, 2c, 2d and light sources 3a, 3b, 3c, 3d, but at least three sets suffice to realize the parameter correction processing.
θ=f2(r)=k|r’|+h|r’|4 …(8)
A function f2 in which the relationship between the vector |r′| and the angle θ is nonlinear, as in equation (8), may also be used. The linearity of equation (3) holds up to a line-of-sight angle θ of about 20 degrees, but at around 30 degrees the nonlinearity becomes strong for many subjects A. In this case, the image processing device 7 has four unknown parameters, k, h, x0, y0, so four or more relational expressions are needed to realize the parameter correction processing. The image processing device 7 therefore detects the vectors ri = (xi, yi) (i = 1 to 4) from the images of the four cameras 2a, 2b, 2c, 2d captured when subject A gazes at a single specified point, and substitutes these vectors ri and the angles θi into equation (8) to derive four relational expressions. By setting up simultaneous equations from the four relational expressions, the image processing device 7 can calculate the four parameters k, h, x0, y0 and store them as correction values. That is, in this case, at least four camera-and-light-source pairs are required to perform the parameter correction.
In the above parameter correction procedure, the angle θi was used as a scalar quantity, but the angle θi may be treated as a vector as follows. In FIG. 10, the angle θi is shown as a vector in the projection onto a plane of the virtual viewpoint sphere S shown in FIG. 8, and FIGS. 11(a) and 11(b) show the vectors ri, ri′ (i = 1, 2) detected on the camera images of the two cameras 2b, 2a. What the image processing device 7 detects directly from the camera images are the vectors r1, r2; the vectors r10, r20 are the origin correction vectors corresponding to the respective camera images.
This relationship holds. Further, the origin correction vectors r10, r20 are expressed by the following equations (11) and (12).
Since the origin correction vectors r10 and r20 are considered equal in the detection results of the two cameras, the relationship r10 = r20 = r0 leads to the following two relational expressions (13) and (14). As the only unknown parameter in equations (13) and (14) is the parameter s, the image processing device 7 can obtain s from both equations and take their average as the correction value. Furthermore, by substituting the determined parameter s into equation (11), the image processing device 7 can determine the origin correction vector r0.
|ri’|=s|θi|-t|θi|4 …(15)
is used. Based on such a nonlinear function, the origin correction vectors r10, r20 are calculated as in the following equations (16) and (17).
Against the four relational expressions obtained by considering the real and imaginary components contained in equations (16) and (17), there are four unknown parameters: the two components of the origin correction vector r0 and the parameters s, t. Therefore, based on the camera images obtained by at least two cameras when subject A gazes at a single specified point, the image processing device 7 can calculate the four parameters, including the nonlinear elements, and determine them as correction values.
Furthermore, referring to FIG. 10, the following equation (19);
is obtained. The right-hand side of equation (19) is a vectorized expression of the angle O1PO2. Against the four relational expressions obtained by considering the real and imaginary components contained in equation (18), there are three unknown parameters: the two components of the origin correction vector r0 and the parameter k. Therefore, based on the camera images obtained by at least two cameras when subject A gazes at a single specified point, the image processing device 7 can calculate the three parameters and determine them as correction values.
Also, since the above function is corrected including the origin correction vector r0, the gain value k is consequently also obtained accurately, and the line-of-sight angle θi can be calculated more accurately over the entire display screen on which the viewpoint is detected.
can be used to calculate the angle θ1R. The angle θ1R is also expressed, using the inner product of the vectors PRQ and PRO1′, by the following equation (21);
and from these equations (20) and (21), the relationship of the following equation (22);
can also be derived. Similarly, the following equations (23) to (25) are derived for the angles θ1L, θ2R, θ2L.
is derived, and similarly the following equation (27) is also derived.
Furthermore, the following equation (28) also holds for the origin correction vectors.
Against the eight relational expressions obtained by considering the real and imaginary components contained in equation (28), there are eight unknown parameters in total: the four components of the two origin correction vectors r0R, r0L, the parameters sR, sL, which are the reciprocals of the gain values k, and the two-dimensional coordinates of the gaze point Q. Therefore, the image processing device 7 can calculate the eight parameters and determine them as correction values based on the camera images obtained by at least two cameras. The image processing device 7 may also use a function in which the relationship between the vector |ri′| and the angle θi is nonlinear; in that case, by using relational expressions similar to equations (15) to (17), the parameter correction can still be performed with three cameras even if the number of undetermined parameters increases by two.
θ=k|r-r0| …(2)
The angle θ may be calculated using equation (2), and in the undetermined-constant determination step the coefficient k and the vector r0 are determined. In this case, since the function f is determined including the origin correction, the coefficient k is consequently also obtained accurately, and the line-of-sight angle θ can be calculated more accurately over the entire surface on which the viewpoint is detected.
Claims (6)
- A gaze point detection method comprising:
a face image generation step of generating face images of a subject using N cameras (N being a natural number of 2 or more) and a plurality of light sources;
a vector calculation step of calculating, based on the face images from the N cameras, a vector r representing the actual distance from the corneal reflection point, which is the reflection point on the subject's cornea of light from the light sources, to the pupil center;
a line-of-sight direction calculation step of calculating, based on the respective vectors r corresponding to the N cameras, the angle θ of the subject's line of sight with respect to each of the reference lines connecting the N cameras and the pupil center by the following equation (1), using a function f containing M undetermined constants (M being a natural number of 3 or more) that include at least the vector r0, which is the offset vector of the vector r:
θ=f(|r-r0|) …(1)
an undetermined-constant determination step of determining the M undetermined constants contained in the function f using a plurality of relational expressions derived based at least on the angles θ calculated for the N cameras; and
a gaze point detection step of detecting the subject's gaze point based on the line-of-sight direction calculated in the line-of-sight direction calculation step using the M undetermined constants determined in the undetermined-constant determination step,
wherein the number N of cameras is set to M × 1/2 or more.
- The gaze point detection method according to claim 1, wherein in the line-of-sight direction calculation step, the angle θ is calculated using the following equation (2), which contains a coefficient k and the vector r0 as the undetermined constants:
θ=k|r-r0| …(2)
and in the undetermined-constant determination step, the coefficient k and the vector r0 are determined.
- The gaze point detection method according to claim 1 or 2, wherein in the line-of-sight direction calculation step, the inclination φ of the vector r after correction by the vector r0 on the face images of the N cameras is further calculated, and in the undetermined-constant determination step, the M undetermined constants are determined using a plurality of relational expressions derived based at least on the inclination φ and the angle θ.
- The gaze point detection method according to any one of claims 1 to 3, wherein in the line-of-sight direction calculation step, the angles θ corresponding to the N cameras when the subject gazes at a specified point on a predetermined plane are calculated, and in the undetermined-constant determination step, the M undetermined constants are determined based on the position of the specified point and the angles θ.
- The gaze point detection method according to any one of claims 1 to 3, wherein in the vector calculation step, vectors rR and rL from the corneal reflection point to the pupil center of each of the subject's left and right eyes are calculated based on the face images from the N cameras; in the line-of-sight direction calculation step, the angles θR and θL of the lines of sight of the subject's left and right eyes with respect to the respective reference lines of the N cameras are calculated from the respective vectors rR and rL using the function f; and in the undetermined-constant determination step, the M undetermined constants are determined based on the angles θR and θL corresponding to the N cameras, using the condition that the intersections of the lines of sight of the left and right eyes with the predetermined plane coincide.
- A gaze point detection device that detects a gaze point of a subject based on a face image of the subject, comprising:
N cameras that acquire the face image of the subject;
a plurality of light sources;
a control circuit that controls the cameras and the light sources; and
an image processing unit that processes image signals output from the N cameras,
wherein the image processing unit calculates, based on the face images from the N cameras, a vector r representing the actual distance from the corneal reflection point, which is the reflection point on the subject's cornea of light from the light sources, to the pupil center; calculates, based on the respective vectors r corresponding to the N cameras, the angle θ of the subject's line of sight with respect to each of the reference lines connecting the N cameras and the pupil center by the following equation (1), using a function f containing M undetermined constants (M being a natural number of 3 or more) that include at least the vector r0, which is the offset vector of the vector r:
θ=f(|r-r0|) …(1)
determines the M undetermined constants contained in the function f using a plurality of relational expressions derived based at least on the angles θ calculated for the N cameras; and
detects the subject's gaze point based on the line-of-sight direction calculated by equation (1) using the determined M undetermined constants,
wherein the number N of cameras is set to M × 1/2 or more.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/992,877 US9329683B2 (en) | 2010-12-08 | 2011-12-07 | Method for detecting point of gaze and device for detecting point of gaze |
JP2012547890A JP5858433B2 (ja) | 2010-12-08 | 2011-12-07 | 注視点検出方法及び注視点検出装置 |
EP11847335.4A EP2649932A4 (en) | 2010-12-08 | 2011-12-07 | Method for detecting point of gaze and device for detecting point of gaze |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-274074 | 2010-12-08 | ||
JP2010274074 | 2010-12-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012077713A1 true WO2012077713A1 (ja) | 2012-06-14 |
Family
ID=46207199
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/078302 WO2012077713A1 (ja) | 2010-12-08 | 2011-12-07 | 注視点検出方法及び注視点検出装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9329683B2 (ja) |
EP (1) | EP2649932A4 (ja) |
JP (1) | JP5858433B2 (ja) |
WO (1) | WO2012077713A1 (ja) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2778846A2 (en) * | 2013-03-15 | 2014-09-17 | Tobii Technology AB | Eye/gaze tracker and method of tracking the position of an eye and/or a gaze point of a subject |
JP2014195641A (ja) * | 2013-03-07 | 2014-10-16 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
WO2014188727A1 (ja) * | 2013-05-22 | 2014-11-27 | 国立大学法人神戸大学 | 視線計測装置、視線計測方法および視線計測プログラム |
JP2015027423A (ja) * | 2013-06-28 | 2015-02-12 | 株式会社Jvcケンウッド | 制御装置、診断支援装置、制御方法及び制御プログラム |
JP2015043963A (ja) * | 2013-07-29 | 2015-03-12 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
JP2015043966A (ja) * | 2013-07-30 | 2015-03-12 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
JP2015169959A (ja) * | 2014-03-04 | 2015-09-28 | 国立大学法人静岡大学 | 回転角度算出方法、注視点検出方法、情報入力方法、回転角度算出装置、注視点検出装置、情報入力装置、回転角度算出プログラム、注視点検出プログラム及び情報入力プログラム |
JP2015175971A (ja) * | 2014-03-14 | 2015-10-05 | 富士通株式会社 | Led光源装置、led制御装置及び端末装置 |
JP2016122380A (ja) * | 2014-12-25 | 2016-07-07 | 国立大学法人静岡大学 | 位置検出装置、位置検出方法、注視点検出装置、及び画像生成装置 |
JP2017111746A (ja) * | 2015-12-18 | 2017-06-22 | 国立大学法人静岡大学 | 視線検出装置及び視線検出方法 |
CN108737642A (zh) * | 2018-04-13 | 2018-11-02 | 维沃移动通信有限公司 | 内容的显示方法及装置 |
US10417782B2 (en) | 2014-08-22 | 2019-09-17 | National University Corporation Shizuoka University | Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5414946B2 (ja) * | 2011-06-16 | 2014-02-12 | パナソニック株式会社 | ヘッドマウントディスプレイおよびその位置ずれ調整方法 |
US8885882B1 (en) * | 2011-07-14 | 2014-11-11 | The Research Foundation For The State University Of New York | Real time eye tracking for human computer interaction |
JP5983131B2 (ja) * | 2012-07-19 | 2016-08-31 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
EP2887184A1 (en) * | 2013-12-23 | 2015-06-24 | Movea | Air pointer with improved user experience |
US9430040B2 (en) * | 2014-01-14 | 2016-08-30 | Microsoft Technology Licensing, Llc | Eye gaze detection with multiple light sources and sensors |
JP2015152939A (ja) | 2014-02-10 | 2015-08-24 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
EP3153092B1 (en) * | 2014-06-09 | 2021-08-04 | National University Corporation Shizuoka University | Pupil detection system, gaze detection system, pupil detection method, and pupil detection program |
KR101610496B1 (ko) * | 2014-08-26 | 2016-04-07 | 현대자동차주식회사 | 시선 추적 방법 및 장치 |
EP3187100A4 (en) * | 2014-08-29 | 2018-05-09 | Alps Electric Co., Ltd. | Line-of-sight detection device |
JP2016151798A (ja) * | 2015-02-16 | 2016-08-22 | ソニー株式会社 | 情報処理装置および方法、並びにプログラム |
JP6327171B2 (ja) * | 2015-02-16 | 2018-05-23 | 株式会社Jvcケンウッド | 注視点検出装置および注視点検出方法 |
US10061383B1 (en) * | 2015-09-16 | 2018-08-28 | Mirametrix Inc. | Multi-feature gaze tracking system and method |
KR101697286B1 (ko) * | 2015-11-09 | 2017-01-18 | 경북대학교 산학협력단 | 사용자 스타일링을 위한 증강현실 제공 장치 및 방법 |
JP6097377B1 (ja) * | 2015-11-27 | 2017-03-15 | 株式会社コロプラ | 画像表示方法及びプログラム |
WO2017090203A1 (ja) * | 2015-11-27 | 2017-06-01 | フォーブ インコーポレーテッド | 視線検出システム、注視点特定方法及び注視点特定プログラム |
JP6638354B2 (ja) * | 2015-12-01 | 2020-01-29 | 株式会社Jvcケンウッド | 視線検出装置及び視線検出方法 |
US10137893B2 (en) * | 2016-09-26 | 2018-11-27 | Keith J. Hanna | Combining driver alertness with advanced driver assistance systems (ADAS) |
US9898082B1 (en) * | 2016-11-01 | 2018-02-20 | Massachusetts Institute Of Technology | Methods and apparatus for eye tracking |
US10777018B2 (en) * | 2017-05-17 | 2020-09-15 | Bespoke, Inc. | Systems and methods for determining the scale of human anatomy from images |
CN107357429B (zh) * | 2017-07-10 | 2020-04-07 | 京东方科技集团股份有限公司 | 用于确定视线的方法、设备和计算机可读存储介质 |
US11042994B2 (en) | 2017-11-15 | 2021-06-22 | Toyota Research Institute, Inc. | Systems and methods for gaze tracking from arbitrary viewpoints |
US10564716B2 (en) | 2018-02-12 | 2020-02-18 | Hong Kong Applied Science and Technology Research Institute Company Limited | 3D gazing point detection by binocular homography mapping |
US10324529B1 (en) * | 2018-05-31 | 2019-06-18 | Tobii Ab | Method and system for glint/reflection identification |
TWI704473B (zh) * | 2018-11-16 | 2020-09-11 | 財團法人工業技術研究院 | 視線向量偵測方向與裝置 |
US11436756B2 (en) * | 2018-12-20 | 2022-09-06 | Microsoft Technology Licensing, Llc | Calibrating a machine vision camera |
AU2020355335A1 (en) * | 2019-09-27 | 2022-02-03 | Alcon Inc. | Patient-induced trigger of a measurement for ophthalmic diagnostic devices |
EP4033962A1 (en) * | 2019-09-27 | 2022-08-03 | Alcon Inc. | Instant eye gaze calibration systems and methods |
CN111772572B (zh) * | 2020-05-09 | 2022-12-20 | 温州医科大学 | 一种人眼Kappa角测量装置及Kappa角测量方法 |
CN113808160B (zh) * | 2021-08-05 | 2024-01-16 | 虹软科技股份有限公司 | 视线方向追踪方法和装置 |
SE545952C2 (en) * | 2022-03-31 | 2024-03-26 | Tobii Ab | Dynamic Camera Rotation Calibration |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005185431A (ja) | 2003-12-25 | 2005-07-14 | National Univ Corp Shizuoka Univ | 視線検出方法および視線検出装置 |
JP2005198743A (ja) | 2004-01-14 | 2005-07-28 | National Univ Corp Shizuoka Univ | 三次元視点計測装置 |
JP2005230049A (ja) | 2004-02-17 | 2005-09-02 | National Univ Corp Shizuoka Univ | 距離イメージセンサを用いた視線検出装置 |
WO2007113975A1 (ja) * | 2006-03-31 | 2007-10-11 | National University Corporation Shizuoka University | 視点検出装置 |
JP2008029702A (ja) * | 2006-07-31 | 2008-02-14 | National Univ Corp Shizuoka Univ | 瞳孔を検出する方法及び装置 |
JP2009297323A (ja) * | 2008-06-16 | 2009-12-24 | Kobe Univ | 視線計測装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6578962B1 (en) * | 2001-04-27 | 2003-06-17 | International Business Machines Corporation | Calibration-free eye gaze tracking |
US6659611B2 (en) * | 2001-12-28 | 2003-12-09 | International Business Machines Corporation | System and method for eye gaze tracking using corneal image mapping |
US7963652B2 (en) * | 2003-11-14 | 2011-06-21 | Queen's University At Kingston | Method and apparatus for calibration-free eye tracking |
EP1691670B1 (en) * | 2003-11-14 | 2014-07-16 | Queen's University At Kingston | Method and apparatus for calibration-free eye tracking |
WO2005063114A1 (ja) * | 2003-12-25 | 2005-07-14 | National University Corporation Shizuoka University | 視線検出方法および装置ならびに三次元視点計測装置 |
WO2006108017A2 (en) * | 2005-04-04 | 2006-10-12 | Lc Technologies, Inc. | Explicit raytracing for gimbal-based gazepoint trackers |
EP2338416B1 (en) * | 2008-09-26 | 2019-02-27 | Panasonic Intellectual Property Corporation of America | Line-of-sight direction determination device and line-of-sight direction determination method |
US8371693B2 (en) * | 2010-03-30 | 2013-02-12 | National University Corporation Shizuoka University | Autism diagnosis support apparatus |
WO2012020760A1 (ja) * | 2010-08-09 | 2012-02-16 | 国立大学法人静岡大学 | 注視点検出方法及び注視点検出装置 |
Application Events
- 2011-12-07 (JP): national phase entry JP2012547890A, granted as patent JP5858433B2, active
- 2011-12-07 (WO): PCT/JP2011/078302 filed as application WO2012077713A1
- 2011-12-07 (US): national phase entry US13/992,877, granted as patent US9329683B2, active
- 2011-12-07 (EP): application EP11847335.4, publication EP2649932A4, withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP2649932A4 |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014195641A (ja) * | 2013-03-07 | 2014-10-16 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
EP2778846A3 (en) * | 2013-03-15 | 2014-12-31 | Tobii Technology AB | Eye/gaze tracker and method of tracking the position of an eye and/or a gaze point of a subject |
EP2778846A2 (en) * | 2013-03-15 | 2014-09-17 | Tobii Technology AB | Eye/gaze tracker and method of tracking the position of an eye and/or a gaze point of a subject |
US10379609B2 (en) | 2013-05-22 | 2019-08-13 | National University Corporation Kobe University | Line-of-sight measurement device, line-of-sight measurement method and line-of-sight measurement program |
WO2014188727A1 (ja) * | 2013-05-22 | 2014-11-27 | 国立大学法人神戸大学 | 視線計測装置、視線計測方法および視線計測プログラム |
JP2015027423A (ja) * | 2013-06-28 | 2015-02-12 | 株式会社Jvcケンウッド | 制御装置、診断支援装置、制御方法及び制御プログラム |
JP2015043963A (ja) * | 2013-07-29 | 2015-03-12 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
JP2015043966A (ja) * | 2013-07-30 | 2015-03-12 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
JP2015043965A (ja) * | 2013-07-30 | 2015-03-12 | 株式会社Jvcケンウッド | 診断支援装置および診断支援方法 |
JP2015169959A (ja) * | 2014-03-04 | 2015-09-28 | 国立大学法人静岡大学 | 回転角度算出方法、注視点検出方法、情報入力方法、回転角度算出装置、注視点検出装置、情報入力装置、回転角度算出プログラム、注視点検出プログラム及び情報入力プログラム |
JP2015175971A (ja) * | 2014-03-14 | 2015-10-05 | 富士通株式会社 | Led光源装置、led制御装置及び端末装置 |
US10417782B2 (en) | 2014-08-22 | 2019-09-17 | National University Corporation Shizuoka University | Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program |
JP2016122380A (ja) * | 2014-12-25 | 2016-07-07 | 国立大学法人静岡大学 | 位置検出装置、位置検出方法、注視点検出装置、及び画像生成装置 |
JP2017111746A (ja) * | 2015-12-18 | 2017-06-22 | 国立大学法人静岡大学 | 視線検出装置及び視線検出方法 |
CN108737642A (zh) * | 2018-04-13 | 2018-11-02 | 维沃移动通信有限公司 | 内容的显示方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
JP5858433B2 (ja) | 2016-02-10 |
US20130329957A1 (en) | 2013-12-12 |
EP2649932A1 (en) | 2013-10-16 |
EP2649932A4 (en) | 2017-06-14 |
US9329683B2 (en) | 2016-05-03 |
JPWO2012077713A1 (ja) | 2014-05-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5858433B2 (ja) | 注視点検出方法及び注視点検出装置 | |
JP5915981B2 (ja) | 注視点検出方法及び注視点検出装置 | |
EP3153092B1 (en) | Pupil detection system, gaze detection system, pupil detection method, and pupil detection program | |
US8371693B2 (en) | Autism diagnosis support apparatus | |
JP6963820B2 (ja) | 視線検出装置 | |
JP5167545B2 (ja) | 視点検出装置 | |
JP6631951B2 (ja) | 視線検出装置及び視線検出方法 | |
JP2010259605A (ja) | 視線測定装置および視線測定プログラム | |
JP6324119B2 (ja) | 回転角度算出方法、注視点検出方法、情報入力方法、回転角度算出装置、注視点検出装置、情報入力装置、回転角度算出プログラム、注視点検出プログラム及び情報入力プログラム | |
US11023039B2 (en) | Visual line detection apparatus and visual line detection method | |
US11115643B2 (en) | Alignment system | |
JP6452235B2 (ja) | 顔検出方法、顔検出装置、及び顔検出プログラム | |
JP7046347B2 (ja) | 画像処理装置及び画像処理方法 | |
US10542875B2 (en) | Imaging device, endoscope apparatus, and imaging method | |
JP6346018B2 (ja) | 眼球計測システム、視線検出システム、眼球計測方法、眼球計測プログラム、視線検出方法、および視線検出プログラム | |
JP6780161B2 (ja) | 視線検出装置及び視線検出方法 | |
JP6452236B2 (ja) | 眼球識別装置及び眼球識別方法 | |
JP2015232771A (ja) | 顔検出方法、顔検出システム、および顔検出プログラム | |
JP6430813B2 (ja) | 位置検出装置、位置検出方法、注視点検出装置、及び画像生成装置 | |
JP2020081756A (ja) | 顔画像処理装置、画像観察システム、及び瞳孔検出システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11847335; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2012547890; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2011847335; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 13992877; Country of ref document: US |