WO2016027627A1 - Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face posture detection system, face posture detection method, and face posture detection program
Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face posture detection system, face posture detection method, and face posture detection program
- Publication number
- WO2016027627A1 (PCT/JP2015/071270)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pupil
- corneal reflection
- reflection position
- corneal
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- One aspect of the present invention relates to a corneal reflection position estimation system, a corneal reflection position estimation method, a corneal reflection position estimation program, a pupil detection system, a pupil detection method, a pupil detection program, a gaze detection system, a gaze detection method, a gaze detection program, a face posture detection system, a face posture detection method, and a face posture detection program.
- a technique for detecting a pupil of a subject is known. This technology can be applied to detection of looking away, detection of driver drowsiness, investigation of the degree of interest in products, data input to a computer, and the like.
- Patent Document 1 describes a pupil detection method for detecting a pupil by performing image difference processing after obtaining a bright pupil image, an unilluminated image, and a dark pupil image.
- a corneal reflection image is detected from a difference image obtained by subtracting an unilluminated image from a bright pupil image or a difference image obtained by subtracting an unilluminated image from a dark pupil image.
- In these difference images, only the light originating from the respective light sources remains, so images of ambient light (disturbance light) such as sunlight are removed and the corneal reflection can be detected easily. Therefore, the pupil can be detected by performing position correction based on corneal reflection on the two difference images and then subtracting one from the other.
- However, if the subject's head moves while the images of the subject's eyes are captured one by one, the false images caused by disturbance light also move. In that case, a false corneal reflection that cannot be distinguished from the true corneal reflection cannot be removed simply by subtracting the unilluminated image from the bright pupil image (or dark pupil image), and position correction based on corneal reflection cannot be performed. To avoid this situation, position correction based on corneal reflection would also have to be performed when subtracting the unilluminated image from the bright pupil image (or dark pupil image); however, since the unilluminated image contains no corneal reflection, its position cannot be corrected.
- A corneal reflection position estimation system according to one aspect of the present invention includes an image acquisition unit that controls a camera including a light source to continuously acquire images of a subject's eye, acquiring a plurality of pupil images captured using light from the light source and then one unilluminated image captured without using light from the light source, and an estimation unit that calculates, as reference positions, the positions of the corneal reflection or corneal sphere center corresponding to each of the plurality of pupil images, calculates a movement vector of the corneal reflection or corneal sphere center based on the plurality of reference positions, and estimates the corneal reflection position in the unilluminated image based at least on the movement vector.
- A corneal reflection position estimation method according to one aspect of the present invention is a corneal reflection position estimation method executed by a corneal reflection position estimation system including a processor, and includes an image acquisition step of controlling a camera including a light source to continuously acquire images of a subject's eye, in which a plurality of pupil images captured using light from the light source are acquired and then one unilluminated image captured without using light from the light source is acquired, and an estimation step of calculating, as reference positions, the positions of the corneal reflection or corneal sphere center corresponding to each of the plurality of pupil images, calculating a movement vector of the corneal reflection or corneal sphere center based on the plurality of reference positions, and estimating the corneal reflection position in the unilluminated image based at least on the movement vector.
- A corneal reflection position estimation program according to one aspect of the present invention causes a computer to function as an image acquisition unit that controls a camera including a light source to continuously acquire images of a subject's eye, acquiring a plurality of pupil images captured using light from the light source and then one unilluminated image captured without using light from the light source, and as an estimation unit that calculates, as reference positions, the positions of the corneal reflection or corneal sphere center corresponding to each of the plurality of pupil images, calculates a movement vector of the corneal reflection or corneal sphere center based on the plurality of reference positions, and estimates the corneal reflection position in the unilluminated image based at least on the movement vector.
- In such aspects, a plurality of reference positions (corneal reflection or corneal sphere center positions) corresponding to the plurality of pupil images obtained before the unilluminated image are calculated. Since each pupil image is photographed using light from the light source, the position of the corneal reflection or the corneal sphere center can be obtained from it. By obtaining a movement vector of the corneal reflection or corneal sphere center from the plurality of reference positions, the motion of the corneal reflection or corneal sphere center immediately before the unilluminated image is obtained can be grasped, and the corneal reflection position in the unilluminated image can be estimated from that motion.
- the corneal reflection position in an unilluminated image can be estimated.
- FIG. 1 is a perspective view showing a detection system according to a first embodiment. FIG. 2 is a plan view showing the lens portion of a camera. FIG. 3 is a diagram showing the hardware configuration of the image processing apparatus shown in FIG. 1. FIG. 4 is a block diagram showing the functional configuration of the detection system shown in FIG. 1. FIG. 5 is a diagram for explaining position correction based on corneal reflection. FIG. 6 is a diagram showing the acquisition and generation of images in the first embodiment. FIG. 7 is a diagram showing movement of the pupil. FIG. 8 is a flowchart showing the method of estimating the corneal reflection position in an unilluminated image. FIG. 9 is a diagram showing the configuration of the detection program according to the first embodiment.
- the detection system 1 is a computer system that detects the pupil of the subject. With this system, the corneal reflection position estimation method and the pupil detection method according to the present embodiment are implemented.
- the target person is a person whose pupil is to be detected, and can also be called a subject.
- the purpose of use of the detection system 1, the corneal reflection position estimation method, and the pupil detection method is not limited in any way; for example, the detection system 1 can be used for detection of looking away, detection of driver drowsiness, investigation of the degree of interest in products, data input to a computer, and the like.
- the detection system 1 includes one camera 10 and an image processing device 20.
- the detection system 1 further includes a display device 30 that is an object viewed by the subject A.
- the display device 30 is not an essential element in the detection system 1.
- the camera 10 is connected to the image processing apparatus 20 by wireless or wired, and various data or commands are transmitted and received between the camera 10 and the image processing apparatus 20. Camera calibration is performed on the camera 10 in advance.
- the camera 10 is used for photographing the eye of the subject A and its surroundings.
- the camera 10 is provided at a position lower than the face of the subject A for the purpose of preventing reflection of reflected light in the face image when the subject A is wearing glasses.
- the elevation angle of the camera 10 with respect to the horizontal direction is set to, for example, a range of 20 to 35 degrees in consideration of both reliable detection of eyes and avoidance of obstruction of the visual field range of the subject A.
- the frame rate of each camera 10 is 240 fps (frames per second).
- the camera 10 can acquire an image about every 4.2 ms.
- the camera 10 captures the subject A in response to a command from the image processing apparatus 20 and outputs image data to the image processing apparatus 20.
- the lens portion of the camera 10 is schematically shown in FIG. 2.
- the objective lens 11 is accommodated in a circular opening 12, and a light source 13 is provided outside the opening 12.
- the light source 13 is a device for irradiating illumination light toward the face of the subject A, and includes a plurality of light emitting elements 13a and a plurality of light emitting elements 13b.
- the light emitting elements 13 a are semiconductor light emitting elements (LEDs) having a center wavelength of output light of 850 nm, and are arranged in a ring shape at equal intervals along the edge of the opening 12.
- the light emitting element 13b is a semiconductor light emitting element having a center wavelength of output light of 940 nm, and is arranged in a ring shape at equal intervals outside the light emitting element 13a. Therefore, the distance from the optical axis of the camera 10 to the light emitting element 13b is larger than the distance from the optical axis to the light emitting element 13a.
- Each of the light emitting elements 13 a and 13 b is provided so as to emit illumination light along the optical axis of the camera 10. Note that the arrangement of the light source 13 is not limited to the configuration shown in FIG. 2, and other arrangements may be used as long as the camera can be regarded as a pinhole model.
- the image processing apparatus 20 is a computer that executes control of the camera 10 and detection of the pupil of the subject A.
- the image processing apparatus 20 may be constructed by a stationary or portable personal computer (PC), may be constructed by a workstation, or may be constructed by another type of computer.
- the image processing apparatus 20 may be constructed by combining a plurality of arbitrary types of computers. When a plurality of computers are used, these computers are connected via a communication network such as the Internet or an intranet.
- FIG. 3 shows a general hardware configuration of the image processing apparatus 20.
- the image processing apparatus 20 includes a CPU (processor) 101 that executes an operating system, application programs, and the like, a main storage unit 102 composed of a ROM and a RAM, an auxiliary storage unit 103 composed of a hard disk, a flash memory, and the like, a communication control unit 104 composed of a network card or a wireless communication module, an input device 105 such as a keyboard and a mouse, and an output device 106 such as a display and a printer.
- Each functional element of the image processing apparatus 20 described later is realized by reading predetermined software into the CPU 101 or the main storage unit 102, operating the communication control unit 104, the input device 105, the output device 106, and the like under the control of the CPU 101, and reading and writing data in the main storage unit 102 or the auxiliary storage unit 103. Data and databases necessary for processing are stored in the main storage unit 102 or the auxiliary storage unit 103.
- the image processing apparatus 20 includes an image acquisition unit 21 and a calculation unit 22 as functional components.
- the image acquisition unit 21 is a functional element that acquires at least image data in which the eye of the subject A is captured from the camera 10 by controlling the photographing timing of the camera 10 and the light emission timing of the light source 13 of the camera 10.
- the calculation unit 22 is a functional element that calculates the pupil position of the subject A based on the image data.
- the output destination of the processing result by the image processing apparatus 20 is not limited at all.
- the image processing apparatus 20 may display the processing result as an image, graphic, or text on a monitor, store it in a storage device such as a memory or database, or send it to another computer system via a communication network. You may send it.
- the image processing apparatus 20 may execute further arbitrary processing based on the result.
- a light emitting element 13a that emits light with a high transmittance (center wavelength 850 nm) is provided at a position adjacent to the opening 12, and a light emitting element 13b that emits light with a low transmittance (center wavelength 940 nm) is provided at a position away from the opening 12.
- the image acquisition unit 21 illuminates the light emitting element 13a to shoot a bright pupil image, illuminates the light emitting element 13b to shoot a dark pupil image, and shoots an unilluminated image without illuminating the light emitting elements 13a and 13b.
- the bright pupil image and the dark pupil image are pupil images obtained by photographing using light from the light source 13, and the unilluminated image is an image obtained by photographing without using light from the light source 13.
- the image acquisition unit 21 controls the camera 10 to acquire a bright pupil image, a dark pupil image, and a non-illuminated image, and outputs these images to the calculation unit 22.
- the image acquisition unit 21 repeats the process of acquiring pupil images in the order of "bright pupil image → non-illuminated image → dark pupil image → non-illuminated image".
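- As a rough illustration of this acquisition cycle (not part of the patent text), the sketch below shows one way such a controller loop could be organized; the camera and LED objects are hypothetical placeholders assuming a simple on/off and frame-grab interface.

```python
# Minimal sketch of the four-image acquisition cycle described above:
# bright pupil image -> unilluminated image -> dark pupil image -> unilluminated image.
# The camera and LED objects are hypothetical placeholders.

def acquire_one_cycle(camera, led_850, led_940):
    """Return (bright, unlit1, dark, unlit2) images for one cycle."""
    led_850.on()                 # inner ring (850 nm) -> bright pupil image
    bright = camera.grab()
    led_850.off()

    unlit1 = camera.grab()       # no illumination

    led_940.on()                 # outer ring (940 nm) -> dark pupil image
    dark = camera.grab()
    led_940.off()

    unlit2 = camera.grab()       # no illumination
    return bright, unlit1, dark, unlit2
```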
- the calculation unit 22 determines the pupil center of one eye or both eyes of the subject A based on the pupil image input from the image acquisition unit 21.
- If the subject does not move while these two pupil images are obtained, a difference image in which the pupil portion stands out can be generated simply by taking the difference between the bright pupil image and the dark pupil image.
- the calculation unit 22 performs position correction based on corneal reflection (hereinafter simply referred to as “position correction”) on the bright pupil image and the dark pupil image before obtaining the difference image.
- As shown in FIG. 5, the calculation unit 22 shifts the i-th image so that the corneal reflection points R detected from the i-th image and the (i+1)-th image coincide with each other, and then takes the difference between the two images. The pupil P can be detected from the resulting difference image shown in the lower part of FIG. 5. More specifically, the calculation unit 22 performs position correction on regions each including the corneal reflection and the pupil, and generates the difference image from them. Such a region is called a window.
- the window setting method is not limited; as an example, a small region centered on the corneal reflection position and including the pupil may be set as the window.
- the position correction is a process of shifting one of the images in the two windows so that the positions of the corneal reflections coincide with each other.
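- A minimal sketch of this window-based position correction (an illustration, not the patent's implementation), assuming the corneal reflection coordinates in both windows are already known and that an integer-pixel shift is sufficient:

```python
import numpy as np

def position_corrected_difference(win_a, win_b, glint_a, glint_b):
    """Shift window B so that its corneal reflection lands on that of window A,
    then take the difference (integer-pixel shift only, no interpolation)."""
    dx = int(round(glint_a[0] - glint_b[0]))
    dy = int(round(glint_a[1] - glint_b[1]))
    shifted_b = np.roll(win_b, shift=(dy, dx), axis=(0, 1))
    # After alignment the corneal reflections cancel out in the difference,
    # while the pupil region remains emphasized.
    return win_a.astype(np.int16) - shifted_b.astype(np.int16)
```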
- the calculation unit 22 binarizes and labels the difference image using, as a threshold, half of the average luminance of the pupil detected in the previous image. The calculation unit 22 then selects pupil candidates from the connected components of the labeled pixels based on shape parameters likely to represent a pupil, such as area, size, area ratio, squareness, and pupil feature amount. The calculation unit 22 determines the candidate having the largest area to be the pupil, and obtains the center coordinates of that pupil.
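- The following is a simplified sketch of this binarization and labeling step (assuming OpenCV and NumPy are available); selecting the largest connected component only approximates the shape-parameter screening described above:

```python
import cv2
import numpy as np

def detect_pupil_center(diff_img, prev_pupil_mean_luminance):
    """Binarize the difference image with half of the previous pupil's mean
    luminance, label connected components, and return the centroid of the
    largest one (a simplification of the shape-parameter screening above)."""
    threshold = prev_pupil_mean_luminance / 2.0
    binary = (diff_img > threshold).astype(np.uint8)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    if n <= 1:                        # label 0 is the background
        return None
    areas = stats[1:, cv2.CC_STAT_AREA]
    best = 1 + int(np.argmax(areas))  # largest non-background component
    return tuple(centroids[best])     # (x, y) pupil center estimate
```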
- the calculation unit 22 can also obtain an accurate corneal reflection position based on the pupil center coordinates.
- the method of detecting the pupil from the difference image directly obtained from the bright pupil image and the dark pupil image is easily affected by disturbance light, and false corneal reflection due to the disturbance light appears on the difference image. In many cases, it is difficult to distinguish between true corneal reflection and false corneal reflection only from image features.
- the detection system 1 acquires a non-illuminated image in addition to a bright pupil image and a dark pupil image, and generates a differential bright pupil image and a differential dark pupil image.
- By taking the difference between these two images, that is, the differential bright pupil image and the differential dark pupil image, a good difference image for detecting the pupil can be obtained.
- a difference image is obtained by the procedure of FIG. 6. That is, the calculation unit 22 obtains a differential bright pupil image by taking the difference between the continuously acquired bright pupil image and non-illuminated image, and obtains a differential dark pupil image by taking the difference between the continuously acquired dark pupil image and non-illuminated image. Subsequently, the calculation unit 22 obtains a difference image by taking the difference between the successively obtained differential bright pupil image and differential dark pupil image. Then, the calculation unit 22 obtains the pupil center coordinates based on the difference image (detects the pupil).
- the calculating unit 22 may use subtraction or division when obtaining a difference image from the difference bright pupil image and the difference dark pupil image.
- the “difference image” in this specification is a concept including not only an image obtained by subtraction but also a divided image obtained by division.
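- A compact sketch of this pipeline (subtraction variant; position correction is omitted here, and division could be used instead, as noted above):

```python
import numpy as np

def make_difference_image(bright, unlit, dark):
    """Sketch of the procedure of FIG. 6 (subtraction variant): build a
    differential bright and a differential dark pupil image against the
    unilluminated image, then take their difference."""
    b = bright.astype(np.int16)
    d = dark.astype(np.int16)
    u = unlit.astype(np.int16)
    diff_bright = np.clip(b - u, 0, None)   # ambient-light reflections cancel
    diff_dark   = np.clip(d - u, 0, None)
    # The pupil is bright in diff_bright and dark in diff_dark, so their
    # difference emphasizes the pupil region.
    return np.clip(diff_bright - diff_dark, 0, None)
```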
- the time difference from the acquisition of the i-th image to the acquisition of the (i+1)-th image is about 4.2 ms.
- the calculation unit 22 estimates the true corneal reflection position in the unilluminated image. Then, the calculation unit 22 performs position correction on two adjacent windows w (that is, shifts one window so that the corneal reflection in the preceding window w coincides with the corneal reflection in the following window w), and obtains the differential bright pupil image or the differential dark pupil image from these two windows w.
- the process of acquiring four images (bright pupil image → non-illuminated image → dark pupil image → non-illuminated image) shown in FIG. 6 constitutes one cycle. If the cycle number n is an integer greater than or equal to 0, then in the n-th cycle the 4n-th image is a bright pupil image, the (4n+1)-th image is an unilluminated image, the (4n+2)-th image is a dark pupil image, and the (4n+3)-th image is an unilluminated image.
- the calculation unit 22 estimates the corneal reflection position by the method described below.
- the calculation unit 22 detects the corneal reflection from the differential bright pupil image obtained from the 4n-th image (bright pupil image) and the (4n+1)-th image (non-illuminated image), thereby obtaining the corneal reflection position X_{4n} in the bright pupil image. Similarly, the calculation unit 22 detects the corneal reflection from the differential dark pupil image obtained from the (4n+2)-th image (dark pupil image) and the (4n+1)-th image (non-illuminated image), thereby obtaining the corneal reflection position X_{4n+2} in the dark pupil image.
- the corneal reflection position is indicated by two-dimensional coordinates (x, y) on the image.
- the calculation unit 22 then estimates the corneal reflection position X_{4n+3} in the (4n+3)-th image (non-illuminated image) by a constant velocity model expressed by the following equation (1). The corneal reflection position X_{4n+3} corresponds to the true corneal reflection position that would appear if the light source of the camera 10 were lit.
- X_{4n+3} = X_{4n+2} + (1/2)(X_{4n+2} − X_{4n})   … (1)
- Equation (1) assumes that the movement of the corneal reflection position between the bright pupil image and the dark pupil image obtained immediately before the (4n+3)-th image (non-illuminated image), that is, the movement vector (X_{4n+2} − X_{4n}), continues unchanged up to the (4n+3)-th image, and estimates the corneal reflection position in the processing-target image on that basis. The factor 1/2 reflects the fact that the 4n-th and (4n+2)-th images are two frame intervals apart, whereas the (4n+2)-th and (4n+3)-th images are only one frame interval apart.
- the “movement vector” in the present embodiment is a vector indicating the direction and amount of movement of the corneal reflection position.
- the reference position used for calculating the movement vector is the corneal reflection position.
- the procedure for estimating the corneal reflection position in the unilluminated image can be represented by the flowchart of FIG. 8. That is, when the image acquisition unit 21 acquires the non-illuminated image (step S11, image acquisition step), the calculation unit 22 calculates the movement vector of the corneal reflection based on the corneal reflection positions in the bright pupil image and the dark pupil image obtained immediately before the non-illuminated image (step S12, estimation step). Then, the calculation unit 22 estimates the corneal reflection position in the unilluminated image based on the movement vector (step S13, estimation step). Thus, the calculation unit 22 functions as an estimation unit that estimates the corneal reflection position in the non-illuminated image.
- Similarly, the calculation unit 22 estimates the corneal reflection position X_{4(n+1)} in the {4(n+1)}-th image, which is the next bright pupil image, by a constant velocity model expressed by the following equation (2).
- X_{4(n+1)} = X_{4n+2} + (X_{4n+2} − X_{4n})   … (2)
- Equation (2) assumes that the motion of the corneal reflection (movement vector (X_{4n+2} − X_{4n})) continues unchanged up to the {4(n+1)}-th image, and estimates the corneal reflection position in the processing-target image on that basis.
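- Equations (1) and (2) can be worked through directly; the short sketch below assumes the positions X_{4n} and X_{4n+2} have already been determined from the differential images:

```python
import numpy as np

def predict_constant_velocity(x_4n, x_4n2):
    """Constant velocity prediction of the corneal reflection position.
    x_4n  : position in the 4n-th image (bright pupil image)
    x_4n2 : position in the (4n+2)-th image (dark pupil image)
    Returns the estimates for the (4n+3)-th image (equation (1)) and the
    4(n+1)-th image (equation (2))."""
    x_4n, x_4n2 = np.asarray(x_4n, float), np.asarray(x_4n2, float)
    v = x_4n2 - x_4n                 # movement vector over two frame intervals
    x_4n3  = x_4n2 + 0.5 * v         # one frame interval later   (eq. 1)
    x_4np1 = x_4n2 + v               # two frame intervals later  (eq. 2)
    return x_4n3, x_4np1

# Example: the glint moved from (120, 80) to (124, 82) between the two images.
print(predict_constant_velocity((120, 80), (124, 82)))
# -> (array([126., 83.]), array([128., 84.]))
```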
- Using the estimated corneal reflection position X_{4n+3}, the calculation unit 22 performs position correction on the (4n+2)-th image (dark pupil image) and the (4n+3)-th image (non-illuminated image) to obtain a differential dark pupil image, and using the estimated position X_{4(n+1)}, performs position correction on the (4n+3)-th image (non-illuminated image) and the {4(n+1)}-th image (bright pupil image) to obtain a differential bright pupil image.
- Further, the calculation unit 22 performs position correction on the {4(n+1)}-th image (bright pupil image) and the {4(n+1)+1}-th image (non-illuminated image) to generate a differential bright pupil image, and detects the corneal reflection from the differential bright pupil image. Similarly, the calculation unit 22 performs position correction on the {4(n+1)+1}-th image (non-illuminated image) and the {4(n+1)+2}-th image (dark pupil image) to generate a differential dark pupil image, and detects the corneal reflection from the differential dark pupil image.
- the calculation unit 22 performs position correction on the difference bright pupil image and the difference dark pupil image to generate a difference image, and obtains the coordinates of the pupil center from the difference image.
- the calculation unit 22 also obtains an accurate corneal reflection position X_{4(n+1)+2} based on the pupil center coordinates, thereby finalizing that corneal reflection position.
- the calculation unit 22 can track the position of the pupil by repeatedly executing the above processing.
- the image acquisition interval may be irregular.
- the calculation unit 22 may measure the acquisition time of each image and estimate the corneal reflection position using a constant velocity model that takes into account the image acquisition interval obtained from the acquisition time.
- the time interval from obtaining the 4n-th image (bright pupil image) until obtaining the (4n+1)-th image (non-illuminated image) is denoted by Δt_1^n, the time interval from obtaining that non-illuminated image until obtaining the (4n+2)-th image (dark pupil image) by Δt_2^n, the time interval from obtaining the dark pupil image until obtaining the (4n+3)-th image (non-illuminated image) by Δt_3^n, and the time interval from obtaining that non-illuminated image until obtaining the 4(n+1)-th image (bright pupil image) by Δt_4^n.
- This time interval may be the difference between the exposure start time when acquiring one image and the exposure start time when acquiring the next image, the difference between the exposure end times, or the difference between the start times or end times of image transfer to the image processing apparatus 20.
- the calculation unit 22 estimates the corneal reflection positions X_{4n+3}, X_{4(n+1)}, X_{4(n+1)+1}, and X_{4(n+1)+2} using a constant velocity model represented by the following equations (5) to (8).
- These equations (5) to (8) correspond to the above equations (1) to (4). Therefore, even when the image acquisition interval varies, the calculation unit 22 can generate a differential bright pupil image and a differential dark pupil image from which false corneal reflections have been removed, obtain a difference image from these two images, and detect the pupil.
- Since a dropped image changes the interval between two frames in units of one frame period, the time intervals Δt_1^n, Δt_2^n, Δt_3^n, and Δt_4^n can also be expressed in terms of the number of frames.
- Like equations (1) to (4), these equations estimate the corneal reflection position based on the movement vector, while additionally taking into account the individual time intervals at which the individual images were acquired.
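- Equations (5) to (8) themselves are not reproduced in this text; the sketch below assumes the natural interval-weighted form of the constant velocity model, in which the movement vector is scaled by the ratio of the elapsed times, and is only a plausible reconstruction:

```python
import numpy as np

def predict_with_intervals(x_4n, x_4n2, dt1, dt2, dt3, dt4):
    """Assumed interval-weighted constant velocity prediction.
    dt1..dt4 are the acquisition intervals defined above (bright -> unlit,
    unlit -> dark, dark -> unlit, unlit -> next bright)."""
    x_4n, x_4n2 = np.asarray(x_4n, float), np.asarray(x_4n2, float)
    velocity = (x_4n2 - x_4n) / (dt1 + dt2)        # displacement per unit time
    x_4n3  = x_4n2 + velocity * dt3                # (4n+3)-th image
    x_4np1 = x_4n2 + velocity * (dt3 + dt4)        # 4(n+1)-th image
    return x_4n3, x_4np1
```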
- the detection program P1 includes a main module P10, an image acquisition module P11, and a calculation module P12.
- the main module P10 is a part that comprehensively controls the pupil detection function including the corneal reflection position estimation function.
- the functions realized by executing the image acquisition module P11 and the calculation module P12 are the same as the functions of the image acquisition unit 21 and the calculation unit 22, respectively.
- the detection program P1 may be provided after being fixedly recorded on a tangible recording medium such as a CD-ROM, DVD-ROM, or semiconductor memory.
- the detection program P1 may be provided via a communication network as a data signal superimposed on a carrier wave.
- the image processing apparatus 20 acquires four images (bright pupil image → non-illuminated image → dark pupil image → non-illuminated image) in one cycle, but the image acquisition method is not limited to this.
- the image processing apparatus 20 may instead acquire a non-illuminated image between the bright pupil image and the dark pupil image so that a bright pupil image (or dark pupil image) is acquired every three images, as illustrated in the corresponding drawing.
- the calculation unit 22 obtains a differential bright pupil image by taking the difference between the 3n-th acquired bright pupil image and the (3n+1)-th acquired non-illuminated image, and obtains a differential dark pupil image by taking the difference between that non-illuminated image and the (3n+2)-th acquired dark pupil image.
- the calculation unit 22 then acquires a difference image by taking the difference between the differential bright pupil image and the differential dark pupil image.
- the calculation unit 22 may further acquire a difference image by taking the difference between a dark pupil image and a bright pupil image acquired successively (for example, the (3n+2)-th acquired dark pupil image and the 3(n+1)-th acquired bright pupil image).
- the calculation unit 22 estimates the corneal reflection position in the non-illuminated image as follows.
- the calculation unit 22 detects the corneal reflection from the differential bright pupil image obtained from the 3n-th bright pupil image and the (3n+1)-th non-illuminated image, thereby obtaining the corneal reflection position X_{3n} in the bright pupil image. Further, the calculation unit 22 detects the corneal reflection from the differential dark pupil image obtained from the (3n+2)-th dark pupil image and the (3n+1)-th non-illuminated image, thereby obtaining the corneal reflection position X_{3n+2} in the dark pupil image. It is assumed that the corneal reflection positions X_{3n} and X_{3n+2} have already been finalized by the preceding processing. Here, the corneal reflection position is indicated by two-dimensional coordinates (x, y) on the image.
- Based on these two corneal reflection positions, the calculation unit 22 estimates the corneal reflection position X_{3(n+1)} in the 3(n+1)-th bright pupil image and the corneal reflection position X_{3(n+1)+1} in the {3(n+1)+1}-th non-illuminated image by a constant velocity model expressed by the following equations (9) and (10).
- X_{3(n+1)} = X_{3n+2} + (1/2)(X_{3n+2} − X_{3n})   … (9)
- X_{3(n+1)+1} = X_{3n+2} + (X_{3n+2} − X_{3n})   … (10)
- Using these estimates, the calculation unit 22 performs position correction on the 3(n+1)-th bright pupil image and the {3(n+1)+1}-th unilluminated image to generate a differential bright pupil image.
- After the corneal reflection position X_{3(n+1)} is obtained, the calculation unit 22 can predict the corneal reflection position X_{3(n+1)+2} in the {3(n+1)+2}-th dark pupil image by the following equation (11).
- X_{3(n+1)+2} = X_{3(n+1)} + 2(X_{3(n+1)} − X_{3n+2})   … (11)
- the calculation unit 22 can track the pupil by repeatedly executing the above processing.
- the calculation unit 22 estimates the corneal reflection position using a constant velocity model in consideration of the image acquisition interval, as in the first embodiment. Note that the interval may be expressed by the number of frames, as in the first embodiment.
- the time interval from obtaining the 3n-th bright pupil image until obtaining the (3n+1)-th unilluminated image is denoted by Δt_1^n, the time interval from obtaining the unilluminated image until obtaining the (3n+2)-th dark pupil image by Δt_2^n, and the time interval from obtaining the dark pupil image until obtaining the 3(n+1)-th bright pupil image by Δt_3^n.
- this time interval may be a difference in exposure start times, a difference in exposure end times, a difference in image transfer start or end times, or it may be expressed based on the number of dropped images.
- the calculation unit 22 estimates the corneal reflection positions X_{3(n+1)}, X_{3(n+1)+1}, and X_{3(n+1)+2} using the following equations (12) to (14), which correspond to the above equations (9) to (11). Therefore, even when the image acquisition interval varies, the calculation unit 22 can generate a differential bright pupil image and a differential dark pupil image from which false corneal reflections have been removed, obtain a difference image from these two images, and detect the pupil.
- the image acquisition unit 21 may acquire pupil images in the order of “dark pupil image ⁇ non-illuminated image ⁇ bright pupil image” in one cycle. Also in this case, the calculation unit 22 can estimate the corneal reflection position in each of the non-illuminated image, the bright pupil image, and the dark pupil image by the same method as described above.
- the constant velocity model is applied to the coordinates on the image, that is, the two-dimensional coordinates.
- the calculation unit 22 may apply the constant velocity model to the three-dimensional coordinates.
- the detection system 1 needs to include two cameras 10 that function as stereo cameras.
- the pair of cameras 10 includes a left camera 10 L on the left side of the subject A and a right camera 10 R on the right side of the subject A.
- the pair of cameras 10 are arranged at a predetermined interval along the horizontal direction.
- the image acquisition unit 21 slightly shifts the operation timing between the two cameras 10, and the exposure time of each camera 10 is set to be equal to or less than the shift time.
- the image acquisition unit 21 causes the corresponding light emitting element 13a and light emitting element 13b to emit light alternately during the exposure time of each camera 10 so that the light from the light source 13 of one camera 10 becomes an image of the other camera 10. Do not influence (do not cause crosstalk). Then, the image acquisition unit 21 acquires the pupil image from each of the left camera 10 L and the right camera 10 R.
- the calculation unit 22 determines the corneal reflection position on the pupil image from each of the cameras 10 L and 10 R in a certain phase by the method in the first or second embodiment.
- the corneal reflection position obtained at this time is a two-dimensional coordinate, so the calculation unit 22 obtains three-dimensional coordinates from the two-dimensional coordinates using a stereo method (stereo matching).
- the stereo method is a technique in which internal parameters, such as the focal length of the camera lens, the image center, and the pixel size, and external parameters, such as the camera position and orientation, are measured in advance, and when an object is photographed by a plurality of stereo cameras, the position of a point in space is determined from the coordinates of that point in the images using these internal and external parameters.
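- As a generic illustration of such stereo triangulation (not the specific relational expressions used here), the sketch below performs linear (DLT) triangulation, assuming the calibrated 3x4 projection matrices of both cameras are available:

```python
import numpy as np

def triangulate(P_left, P_right, pt_left, pt_right):
    """Linear (DLT) triangulation: recover a 3-D point from its 2-D image
    coordinates observed by two calibrated cameras with 3x4 projection
    matrices P_left and P_right."""
    x1, y1 = pt_left
    x2, y2 = pt_right
    A = np.vstack([
        x1 * P_left[2]  - P_left[0],
        y1 * P_left[2]  - P_left[1],
        x2 * P_right[2] - P_right[0],
        y2 * P_right[2] - P_right[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]              # homogeneous -> Euclidean coordinates
```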
- the calculation unit 22 acquires, with reference to calibration data, relational expressions between the corneal reflection positions in the image coordinate systems detected from the output data of the two cameras 10 and the corneal reflection position in the world coordinate system, and from these relational expressions obtains the corneal reflection position in the world coordinate system. In this way, the calculation unit 22 can obtain the three-dimensional coordinates of the corneal reflection position for each of the left and right pupils of the subject A.
- For each eye, the calculation unit 22 sets, as the three-dimensional coordinates of the corneal sphere center, the three-dimensional coordinates of the intersection of a straight line passing through the lens center of the left camera 10L (a point other than the lens center may also be used) and the corneal reflection position corresponding to the left camera 10L, and a straight line passing through the lens center of the right camera 10R (a point other than the lens center may also be used) and the corneal reflection position corresponding to the right camera 10R.
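- In practice two such rays rarely intersect exactly; a common approximation (an assumption here, not stated in the text) is to take the midpoint of the shortest segment between the two lines:

```python
import numpy as np

def corneal_sphere_center(o_left, g_left, o_right, g_right):
    """Approximate intersection of two 3-D lines: one through the left lens
    center o_left and the left corneal reflection point g_left, the other
    through o_right and g_right.  Returns the midpoint of the shortest
    segment connecting the two lines."""
    o1, o2 = np.asarray(o_left, float), np.asarray(o_right, float)
    d1 = np.asarray(g_left, float) - o1
    d2 = np.asarray(g_right, float) - o2
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b            # zero only if the rays are parallel
    s = (b * e - c * d) / denom      # parameter along the left ray
    t = (a * e - b * d) / denom      # parameter along the right ray
    return (o1 + s * d1 + o2 + t * d2) / 2.0
```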
- the calculation unit 22 performs the process of detecting the corneal reflection from the differential bright pupil image obtained from the 4n-th image (bright pupil image) and the (4n+1)-th image (non-illuminated image) for each of the image from the left camera 10L and the image from the right camera 10R.
- the calculation unit 22 then obtains the corneal sphere center coordinates X′_{4n} corresponding to the bright pupil image using the stereo method as described above.
- Similarly, the calculation unit 22 performs the process of detecting the corneal reflection from the differential dark pupil image obtained from the (4n+2)-th image (dark pupil image) and the (4n+1)-th image (non-illuminated image) for each of the image from the left camera 10L and the image from the right camera 10R, and obtains the corneal sphere center coordinates X′_{4n+2} corresponding to the dark pupil image using the stereo method.
- the corneal sphere center coordinates are three-dimensional coordinates.
- the calculation unit 22 estimates the corneal sphere center coordinates X′_{4n+3} in the (4n+3)-th image (non-illuminated image) using a constant velocity model represented by the following equation (15).
- X′_{4n+3} = X′_{4n+2} + (1/2)(X′_{4n+2} − X′_{4n})   … (15)
- (X′_{4n+2} − X′_{4n}) is a movement vector indicating the motion of the corneal sphere center in the three-dimensional space.
- the “movement vector” in the present embodiment is a vector indicating the direction and amount of movement of the position of the corneal sphere center.
- the reference position used for calculating the movement vector is the position of the corneal sphere center.
- the calculation unit 22 converts the estimated corneal sphere center coordinates X′_{4n+3} into two-dimensional coordinates on the imaging plane (pupil image).
- This conversion can be said to be a process of projecting three-dimensional coordinates onto the imaging plane using a pinhole model.
- a two-dimensional position X_{4n+3} on the imaging plane corresponding to the three-dimensional coordinates X′_{4n+3} = (x, y, z) is obtained by the following equation (16) using the focal length f of the camera.
- the calculation unit 22 performs this conversion into two-dimensional coordinates for each of the image obtained from the left camera 10L and the image obtained from the right camera 10R.
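- Equation (16) is not reproduced in this text; assuming the standard pinhole projection implied by the pinhole model mentioned above, the conversion would look roughly like this:

```python
def project_pinhole(x, y, z, f):
    """Assumed standard pinhole projection of a 3-D point (x, y, z), expressed
    in the camera coordinate system, onto the imaging plane with focal length f
    (a sketch; equation (16) itself is not reproduced here)."""
    return (f * x / z, f * y / z)
```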
- the calculation unit 22 estimates the corneal sphere center coordinates X′_{4(n+1)} in the {4(n+1)}-th image, that is, the next bright pupil image, using a constant velocity model expressed by the following equation (17).
- X′_{4(n+1)} = X′_{4n+2} + (X′_{4n+2} − X′_{4n})   … (17)
- the calculation unit 22 converts the estimated corneal sphere center coordinates X′_{4(n+1)} into the two-dimensional coordinates X_{4(n+1)} by the above equation (16).
- Using the estimated position X_{4n+3}, the calculation unit 22 performs position correction on the (4n+2)-th image (dark pupil image) and the (4n+3)-th image (non-illuminated image) to generate a differential dark pupil image, and detects the corneal reflection from the differential dark pupil image. Similarly, using the estimated position X_{4(n+1)}, the calculation unit 22 performs position correction on the (4n+3)-th image (non-illuminated image) and the {4(n+1)}-th image (bright pupil image) to generate a differential bright pupil image.
- the calculation unit 22 performs position correction on the differential dark pupil image and the differential bright pupil image to generate a difference image, detects the pupil from the difference image, and determines an accurate corneal reflection position X_{4(n+1)}.
- the calculation unit 22 can then estimate the corneal sphere center coordinates X′_{4(n+1)+1} in the {4(n+1)+1}-th image (non-illuminated image) and the corneal sphere center coordinates X′_{4(n+1)+2} in the {4(n+1)+2}-th image (dark pupil image) by a constant velocity model expressed by the following equations (18) and (19).
- X′_{4(n+1)+1} = X′_{4(n+1)} + (1/2)(X′_{4(n+1)} − X′_{4n+2})   … (18)
- X′_{4(n+1)+2} = X′_{4(n+1)} + (X′_{4(n+1)} − X′_{4n+2})   … (19)
- the calculation unit 22 converts the estimated corneal sphere center coordinates X′_{4(n+1)+1} and X′_{4(n+1)+2} into the two-dimensional coordinates X_{4(n+1)+1} and X_{4(n+1)+2} on the imaging plane (pupil image) according to the above equation (16).
- When the corneal reflection position X_{4(n+1)+1} is estimated, the calculation unit 22 performs position correction on the {4(n+1)}-th image (bright pupil image) and the {4(n+1)+1}-th image (non-illuminated image) to generate a differential bright pupil image, and detects the corneal reflection from the differential bright pupil image. Further, when the corneal reflection position X_{4(n+1)+2} is estimated, the calculation unit 22 performs position correction on the {4(n+1)+1}-th image (non-illuminated image) and the {4(n+1)+2}-th image (dark pupil image) to generate a differential dark pupil image, and detects the corneal reflection from the differential dark pupil image. Then, the calculation unit 22 performs position correction on the differential bright pupil image and the differential dark pupil image to generate a difference image, detects the pupil from the difference image, and obtains an accurate corneal reflection position X_{4(n+1)+2}.
- the calculation unit 22 can track the pupil by repeatedly executing the above processing.
- the method of applying the constant velocity model to the corneal sphere center coordinates can also be applied to the second embodiment in which three images are acquired in one cycle.
- the calculation unit 22 estimates the corneal reflection position using the constant velocity model, but the calculation unit 22 may execute the estimation using the constant acceleration model.
- the calculation unit 22 uses the corneal reflection positions in the pupil images (bright pupil images or dark pupil images) obtained one, two, and three images before the non-illuminated image for which the corneal reflection position is to be estimated.
- Let x_0, x_{−1}, x_{−2}, and x_{−3} denote the two-dimensional vectors of the corneal reflection positions in the current image, the image one before, the image two before, and the image three before, respectively.
- the times at which these vectors were obtained are denoted by t_0, t_{−1}, t_{−2}, and t_{−3}, respectively.
- the current velocity v_0 is obtained from the coordinates in the current image and the coordinates in the image one before (v_0 = (x_0 − x_{−1})/(t_0 − t_{−1})), the previous velocity v_{−1} from the coordinates in the image one before and the coordinates in the image two before, and the velocity v_{−2} from the coordinates in the image two before and the coordinates in the image three before.
- t_0 to t_{−3} are the acquisition times of the respective images.
- In this way, the corneal reflection position can be estimated using a constant acceleration model by considering up to four images before the current one.
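- A sketch of such a constant acceleration prediction, assuming three past positions and their acquisition times are available (the acceleration is computed from two successive velocities; the exact formulation in the original is not reproduced here):

```python
import numpy as np

def predict_constant_acceleration(x0, x1, x2, t0, t1, t2, t_target):
    """Constant acceleration prediction (sketch).
    x0, x1, x2 : corneal reflection positions in the current image and the
                 images one and two before it, acquired at times t0 > t1 > t2.
    t_target   : acquisition time of the image whose corneal reflection
                 position is to be estimated."""
    x0, x1, x2 = (np.asarray(v, float) for v in (x0, x1, x2))
    v0 = (x0 - x1) / (t0 - t1)            # current velocity
    v1 = (x1 - x2) / (t1 - t2)            # previous velocity
    a  = (v0 - v1) / ((t0 - t2) / 2.0)    # assumed-constant acceleration
    dt = t_target - t0
    return x0 + v0 * dt + 0.5 * a * dt**2
```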
- a higher-order model, such as a constant acceleration model or a constant jerk model, is effective when the number of frames per unit time of the camera 10 is large.
- the method using the constant acceleration model or the constant jerk model can be applied to the second embodiment in which three images are acquired in one cycle and the third embodiment using corneal sphere center coordinates.
- As described above, a corneal reflection position estimation system according to one aspect of the present invention includes an image acquisition unit that controls a camera including a light source to continuously acquire images of a subject's eye, acquiring a plurality of pupil images captured using light from the light source and then one unilluminated image captured without using light from the light source, and an estimation unit that calculates, as reference positions, the positions of the corneal reflection or corneal sphere center corresponding to each of the plurality of pupil images, calculates a movement vector of the corneal reflection or corneal sphere center based on the plurality of reference positions, and estimates the corneal reflection position in the unilluminated image based at least on the movement vector.
- A corneal reflection position estimation method according to one aspect is a corneal reflection position estimation method executed by a corneal reflection position estimation system including a processor, and includes an image acquisition step of controlling a camera including a light source to continuously acquire images of a subject's eye, in which a plurality of pupil images captured using light from the light source are acquired and then one unilluminated image captured without using light from the light source is acquired, and an estimation step of calculating, as reference positions, the positions of the corneal reflection or corneal sphere center corresponding to each of the plurality of pupil images, calculating a movement vector of the corneal reflection or corneal sphere center based on the plurality of reference positions, and estimating the corneal reflection position in the unilluminated image based at least on the movement vector.
- A corneal reflection position estimation program according to one aspect causes a computer to function as an image acquisition unit that controls a camera including a light source to continuously acquire images of a subject's eye, acquiring a plurality of pupil images captured using light from the light source and then one unilluminated image captured without using light from the light source, and as an estimation unit that calculates, as reference positions, the positions of the corneal reflection or corneal sphere center corresponding to each of the plurality of pupil images, calculates a movement vector of the corneal reflection or corneal sphere center based on the plurality of reference positions, and estimates the corneal reflection position in the unilluminated image based at least on the movement vector.
- In such aspects, a plurality of reference positions (corneal reflection or corneal sphere center positions) corresponding to the plurality of pupil images obtained before the unilluminated image are calculated. Since each pupil image is photographed using light from the light source, the position of the corneal reflection or the corneal sphere center can be obtained from it. By obtaining a movement vector of the corneal reflection or corneal sphere center from the plurality of reference positions, the motion of the corneal reflection or corneal sphere center immediately before the unilluminated image is obtained can be grasped, and the corneal reflection position in the unilluminated image can be estimated from that motion.
- If a sufficiently high-speed camera is used, the time difference in image acquisition can be as small as 0.5 ms, and a bright pupil image, an unilluminated image, and a dark pupil image can be obtained within 1.5 ms. If the photographing time interval is about this level, pupil movement between images can be ignored even if the head moves during that time.
- high-speed cameras are currently expensive and difficult to adopt easily.
- the camera with a frame rate of 240 fps used in this embodiment is cheaper than a high-speed camera and is easy to introduce into the system.
- a time interval of 4.2 ms, however, is not short enough that position correction can be omitted.
- the pupil can be detected while tracking only the window that includes the corneal reflection position in the image. Therefore, the image processing load can be reduced as a whole.
- When the corneal reflection position is used as the reference position, the corneal reflection position in the non-illuminated image can be estimated easily, without considering three-dimensional coordinates or using the stereo method. When the position of the corneal sphere center is used as the reference position, the corneal reflection position in the unilluminated image can be estimated more accurately.
- the estimation unit may estimate the corneal reflection position in the unilluminated image based on the movement vector and the time intervals at which the image acquisition unit acquired the individual images. By considering not only the movement vector but also the time intervals of image acquisition, the motion of the corneal reflection can be estimated accurately.
- In the corneal reflection position estimation system according to another aspect, the plurality of pupil images may be two pupil images, and the estimation unit may calculate a reference position in each of the two pupil images, calculate a movement vector based on the two reference positions, and estimate the corneal reflection position in the unilluminated image by a constant velocity model based at least on the movement vector.
- In the corneal reflection position estimation system according to another aspect, the plurality of pupil images may be three pupil images, and the estimation unit may calculate a reference position in each of the three pupil images, calculate two movement vectors based on the three reference positions, and estimate the corneal reflection position in the non-illuminated image by a constant acceleration model based at least on the two movement vectors. With the constant acceleration model, the change in the movement vector can be estimated in detail, so the corneal reflection position in the non-illuminated image can be estimated accurately.
- In the corneal reflection position estimation system, the image acquisition unit may acquire one further pupil image captured using light from the light source after acquiring the unilluminated image, and the estimation unit may further estimate the corneal reflection position in that pupil image based at least on the movement vector. In this case, not only the corneal reflection position in the non-illuminated image but also the corneal reflection position in the subsequent pupil image can be estimated.
- A pupil detection system according to one aspect includes a pupil calculation unit that calculates a pupil position based on the corneal reflection position in the non-illuminated image estimated by the above corneal reflection position estimation system. The pupil image obtained immediately before the non-illuminated image among the plurality of pupil images is one of a bright pupil image and a dark pupil image, and the further pupil image is the other of the bright pupil image and the dark pupil image. The pupil calculation unit generates a differential bright pupil image from the bright pupil image and the non-illuminated image and a differential dark pupil image from the dark pupil image and the non-illuminated image based on the corneal reflection position in the non-illuminated image, obtains a difference image from the differential bright pupil image and the differential dark pupil image, and calculates the pupil position based on the difference image.
- A pupil detection method according to one aspect is a pupil detection method executed by a computer system including a processor, and includes a pupil calculation step of calculating a pupil position based on the corneal reflection position in the unilluminated image estimated by the above corneal reflection position estimation method. In the image acquisition step, after the unilluminated image is acquired, one further pupil image captured using light from the light source is acquired; in the estimation step, the corneal reflection position in that further pupil image is also estimated based at least on the movement vector. The pupil image obtained immediately before the unilluminated image among the plurality of pupil images is one of a bright pupil image and a dark pupil image, and the further pupil image is the other of the bright pupil image and the dark pupil image. In the pupil calculation step, generation of a differential bright pupil image from the bright pupil image and the non-illuminated image and generation of a differential dark pupil image from the dark pupil image and the non-illuminated image are executed based on the corneal reflection position in the non-illuminated image, a difference image is obtained from the differential bright pupil image and the differential dark pupil image, and the pupil position is calculated based on the difference image.
- the pupil detection program causes a computer to function as the pupil detection system.
- According to these aspects, a highly accurate differential bright pupil image and differential dark pupil image can be generated from the bright pupil image, the unilluminated image, and the dark pupil image that are captured continuously. Therefore, the difference image obtained from the differential bright pupil image and the differential dark pupil image is also highly accurate, and an accurate pupil position can be obtained from that difference image.
- the calculation unit 22 may detect an approximate line of sight from the relative positional relationship between the pupil center coordinates and the corneal reflection position obtained by the above method.
- the detection system 1 also functions as a line-of-sight detection system.
- a method for strictly performing this line-of-sight detection is described, for example, in the above-mentioned References 1 and 2 or Reference 3 below. (Reference 3) Japanese Patent Application Laid-Open No. 2005-185431
- a gaze detection program can be created by adding a module for realizing such a gaze detection function to the detection program P1.
- the line-of-sight detection system detects the line of sight based on the pupil position calculated by the pupil detection system.
- the line-of-sight detection method detects the line of sight based on the pupil position calculated by the pupil detection method.
- the line-of-sight detection program causes a computer to function as the line-of-sight detection system.
- the calculation unit 22 may detect the face posture using the pupil center coordinates obtained by the above method.
- the detection system 1 also functions as a face posture detection system.
- the method of detecting the face posture itself is described in Reference Document 4 below, for example. (Reference 4) Japanese Patent Application Laid-Open No. 2007-271554
- the calculation unit 22 estimates the corneal reflection position for each of the left and right pupils based on any of the above embodiments, detects the pupils based on the estimation results, and obtains the center coordinates of the pupils. The calculation unit 22 further refers to the bright pupil image or the dark pupil image to obtain the two-dimensional coordinates of the left and right nostril centers. It then calculates the three-dimensional coordinates of the left and right pupil centers and of the left and right nostrils from those two-dimensional coordinates, and obtains the face posture of the subject A (the center of gravity and the face direction vector). In this way, the methods of the above embodiments can be applied to the calculation of the pupil center coordinates described in Reference 4, as illustrated by the sketch below.
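As an illustration, the following is a minimal Python sketch of obtaining a face centroid and a face direction vector from the 3D coordinates of the two pupil centers and the two nostril centers. The plane-normal construction, the function name, and the numeric values are assumptions for illustration only, not the method of Reference 4.

```python
import numpy as np

def face_posture(pupil_l, pupil_r, nostril_l, nostril_r):
    """Compute a face centroid and a unit face direction vector from the 3D
    coordinates (camera coordinates) of the left/right pupil centers and the
    left/right nostril centers."""
    pts = np.array([pupil_l, pupil_r, nostril_l, nostril_r], dtype=float)
    centroid = pts.mean(axis=0)
    # Take the normal of the plane spanned by the inter-pupil axis and the
    # axis from the eye midpoint down to the nostril midpoint.
    eye_axis = pts[1] - pts[0]
    down_axis = (pts[2] + pts[3]) / 2.0 - (pts[0] + pts[1]) / 2.0
    normal = np.cross(eye_axis, down_axis)
    normal /= np.linalg.norm(normal)
    # Orient the normal toward the camera (negative z in camera coordinates).
    if normal[2] > 0:
        normal = -normal
    return centroid, normal

# Hypothetical coordinates in millimeters.
centroid, direction = face_posture(
    (-30.0, 0.0, 600.0), (30.0, 0.0, 610.0),    # pupil centers
    (-10.0, 40.0, 590.0), (10.0, 40.0, 595.0),  # nostril centers
)
print(centroid, direction)
```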
- a face posture detection program can be created by adding a module for realizing such a face posture detection function to the detection program P1.
- the face posture detection system detects the face posture based on the pupil position calculated by the pupil detection system and the nostril position calculated with reference to the bright pupil image or the dark pupil image.
- the face posture detection method detects the face posture based on the pupil position calculated by the pupil detection method and the nostril position calculated with reference to the bright pupil image or the dark pupil image.
- a face posture detection program causes a computer to function as the face posture detection system.
- when the subject closes the eyes during pupil tracking, the pupil and the corneal reflection do not appear in the image, so the calculation unit 22 cannot detect the pupil and fails to track it.
- in this case, the calculation unit 22 generates a difference image from a pair of a bright pupil image and a dark pupil image without using a non-illuminated image and without performing position correction on the pair, and attempts pupil detection from that difference image.
- the calculation unit 22 repeats this attempt until pupil detection succeeds. When pupil detection succeeds, it resumes the processing of any one of the above embodiments, that is, pupil detection including estimation of the corneal reflection position using the non-illuminated image.
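A minimal sketch of this fallback control flow is shown below (Python). The three callables passed in are hypothetical placeholders standing in for the processing described above; they are not APIs defined by the patent.

```python
def track_pupil(next_frame_with_unlit, next_bright_dark_pair, detect_from_difference):
    """Fallback loop: when pupil detection that uses the non-illuminated image
    fails (for example because the subject blinks), repeatedly try plain
    bright/dark differencing without position correction, and resume the
    normal processing once a pupil is found again."""
    while True:
        pupil = next_frame_with_unlit()          # embodiment processing; may return None
        if pupil is not None:
            yield pupil
            continue
        # Failure: fall back to uncorrected bright/dark differencing.
        while True:
            bright, dark = next_bright_dark_pair()
            pupil = detect_from_difference(bright, dark)
            if pupil is not None:
                yield pupil
                break                            # success: resume normal processing
```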
- when the pupil calculation unit fails to calculate the pupil position based on the corneal reflection position in the non-illuminated image, it generates a new difference image from a newly acquired bright pupil image and dark pupil image without using a non-illuminated image, and calculates the pupil position based on that new difference image.
- when the pupil position is obtained based on the new difference image, the processing by the image acquisition unit and the estimation unit is executed, and the pupil calculation unit calculates the pupil position based on the corneal reflection position in the new non-illuminated image estimated based on that processing.
- the subject's pupil can therefore be continuously tracked even after pupil detection using the non-illuminated image fails because the subject closes his or her eyes for a moment.
- when a Kalman filter is used in place of the constant velocity model, a linear Kalman filter may be used; when a Kalman filter is used in place of the constant acceleration model, an extended Kalman filter may be used.
- the plurality of pupil images are two pupil images, and the estimation unit calculates a reference position in each of the two pupil images, calculates the movement vector based on the two reference positions, and estimates the corneal reflection position in the non-illuminated image based at least on the movement vector using a constant velocity model or a linear Kalman filter.
- the plurality of pupil images are three pupil images, and the estimation unit calculates a reference position in each of the three pupil images, calculates two movement vectors based on the three reference positions, and estimates the corneal reflection position in the non-illuminated image based at least on the two movement vectors using a constant acceleration model or an extended Kalman filter.
- the image processing apparatus 20 may acquire two images in one cycle as shown in FIG. 12. FIG. 12A shows a mode in which a bright pupil image and a non-illuminated image are acquired in one cycle, and FIG. 12B shows a mode in which a dark pupil image and a non-illuminated image are acquired in one cycle.
- the image processing apparatus 20 may alternately acquire a bright pupil image (or dark pupil image) and a non-illuminated image, and estimate the corneal reflection position in the non-illuminated image by applying the method of the above embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ophthalmology & Optometry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Geometry (AREA)
- Eye Examination Apparatus (AREA)
- Image Analysis (AREA)
Abstract
Description
<Configuration of the detection system>
The configuration of the detection system 1 according to the embodiment will be described with reference to Figs. 1 to 4. The detection system 1 is a computer system that detects the pupil of a subject, and the corneal reflection position estimation method and the pupil detection method according to the present embodiment are carried out by this system. The subject is a person whose pupil is to be detected, and can also be called an examinee. The purpose of use of the detection system 1, the corneal reflection position estimation method, and the pupil detection method is not limited in any way; for example, the detection system 1 can be used for detecting inattentive driving, detecting driver drowsiness, investigating the degree of interest in a product, and entering data into a computer.
Next, the operation of the detection system 1 will be described with reference to Figs. 5 to 8, together with the corneal reflection position estimation method and the pupil detection method according to the present embodiment.
Light that enters the eye is diffusely reflected by the retina, and the portion of the reflected light that passes back through the pupil returns toward the light source with strong directivity. When the camera is exposed while a light source located near the camera aperture emits light, part of the light reflected by the retina enters the aperture, so an image in which the pupil appears brighter than its surroundings can be obtained. This is the bright pupil image. In contrast, when the camera is exposed while a light source located away from the camera aperture emits light, hardly any of the light returning from the eye enters the aperture, so an image in which the pupil appears dark can be obtained. This is the dark pupil image. Furthermore, when the eye is irradiated with light of a wavelength with high transmittance, more light is reflected by the retina and the pupil appears bright, whereas when the eye is irradiated with light of a wavelength with low transmittance, less light is reflected by the retina and the pupil appears dark. An image obtained by exposing the camera without turning on any of the camera's light sources is the non-illuminated image, and this non-illuminated image is also a kind of pupil image.
The calculation unit 22 obtains the pupil center of one or both eyes of the subject A based on the pupil images input from the image acquisition unit 21.
The process of acquiring four images (bright pupil image → non-illuminated image → dark pupil image → non-illuminated image) shown in Fig. 6 is treated as one cycle. If the cycle number is represented by an integer of 0 or more, then in the n-th cycle the 4n-th image is a bright pupil image, the (4n+1)-th image is a non-illuminated image, the (4n+2)-th image is a dark pupil image, and the (4n+3)-th image is a non-illuminated image.
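As a small illustration of this indexing, the following Python sketch maps a frame index to its image type in the four-image cycle; the helper name is hypothetical and not part of the patent.

```python
def image_type(k):
    """Image type of the k-th frame (k = 0, 1, 2, ...) in the four-image cycle
    bright pupil -> non-illuminated -> dark pupil -> non-illuminated."""
    return ("bright pupil", "non-illuminated", "dark pupil", "non-illuminated")[k % 4]

print([image_type(k) for k in range(8)])
```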
X_{4n+3} = X_{4n+2} + (1/2)(X_{4n+2} - X_{4n}) …(1)
Equation (1) means that the corneal reflection position in the image to be processed is estimated on the assumption that the movement of the corneal reflection position between the bright pupil image and the dark pupil image obtained immediately before the (4n+3)-th image (non-illuminated image), that is, the movement vector (X_{4n+2} - X_{4n}), continues unchanged up to that (4n+3)-th image. In the present embodiment, the "movement vector" is a vector indicating the direction and amount of movement of the corneal reflection position. Also, in the present embodiment, the reference position used for calculating the movement vector is the corneal reflection position.
X_{4(n+1)} = X_{4n+2} + (X_{4n+2} - X_{4n}) …(2)
Equation (2) means that the corneal reflection position in the image to be processed is estimated on the assumption that the movement of the corneal reflection, that is, the movement vector (X_{4n+2} - X_{4n}), continues unchanged up to the 4(n+1)-th image.
X_{4(n+1)+1} = X_{4(n+1)} + (1/2)(X_{4(n+1)} - X_{4n+2}) …(3)
X_{4(n+1)+2} = X_{4(n+1)} + (X_{4(n+1)} - X_{4n+2}) …(4)
The explanation of equations (1) to (4) above assumed that the image acquisition interval is constant. In practice, however, situations can arise in which the acquisition interval varies. For example, because image processing takes a long time, or because the multitasking OS of the image processing apparatus 20 performs other processing and delays the timing of image processing, an image may not be acquired at the prescribed timing (for example, at intervals of about 4.2 ms), that is, an image is dropped, and the desired image may be obtained only after several retries. In that case the image acquisition interval can change (for example, the interval may become about 8.4 (= 4.2 × 2) ms or about 12.6 (= 4.2 × 3) ms). On the other hand, if external synchronization is adopted in which the image processing apparatus 20 sends the next image acquisition command to the camera 10 when it has finished processing one image, dropped images do not occur, but in this case the image acquisition interval can become irregular depending on the state of the hardware in the image processing apparatus 20 (for example, the CPU load).
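The following is a minimal sketch of constant velocity extrapolation that accounts for such irregular intervals by dividing the movement vector by the actual elapsed time; the function name, coordinates, and times are hypothetical values used only for illustration.

```python
import numpy as np

def extrapolate_constant_velocity(x_prev, t_prev, x_curr, t_curr, t_target):
    """Estimate the corneal reflection position at t_target by assuming that the
    velocity observed between the two most recent reference positions
    (x_prev at t_prev and x_curr at t_curr) continues unchanged."""
    velocity = (np.asarray(x_curr, float) - np.asarray(x_prev, float)) / (t_curr - t_prev)
    return np.asarray(x_curr, float) + velocity * (t_target - t_curr)

# Example corresponding to equation (1): with equal spacing, the non-illuminated
# frame follows the dark pupil image by half the bright-to-dark interval, so the
# extrapolation factor becomes 1/2.
x_bright = (120.0, 85.0)   # X_{4n}   (hypothetical image coordinates)
x_dark   = (124.0, 86.0)   # X_{4n+2}
print(extrapolate_constant_velocity(x_bright, 0.0, x_dark, 8.4, 12.6))  # -> [126.  86.5]
```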
Next, the detection program (corneal reflection position estimation program and pupil detection program) P1 for realizing the image processing apparatus 20 will be described with reference to Fig. 9.
In the first embodiment, the image processing apparatus 20 acquires four images (bright pupil image → non-illuminated image → dark pupil image → non-illuminated image) in one cycle, but the image acquisition method is not limited to this. In the present embodiment, as shown in Fig. 10, the image processing apparatus 20 may acquire a non-illuminated image between a bright pupil image and a dark pupil image so that a bright pupil image (or a dark pupil image) is acquired every three images. In this case, the calculation unit 22 obtains a difference bright pupil image by taking the difference between the 3n-th acquired bright pupil image and the (3n+1)-th acquired non-illuminated image, and obtains a difference dark pupil image by taking the difference between that non-illuminated image and the (3n+2)-th acquired dark pupil image. The calculation unit 22 then obtains a difference image by taking the difference between the difference bright pupil image and the difference dark pupil image. The calculation unit 22 may also obtain a further difference image by taking the difference between a dark pupil image and a bright pupil image acquired in succession (for example, the (3n+2)-th acquired dark pupil image and the 3(n+1)-th acquired bright pupil image).
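A minimal sketch of this difference-image generation is given below (Python with NumPy, grayscale images). The shift-by-estimated-offset step is only one plausible form of the position correction based on the estimated corneal reflection position, and all function and variable names are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def shift(img, offset):
    """Shift a 2D grayscale image by integer (dx, dy), filling exposed borders with 0."""
    dx, dy = int(round(offset[0])), int(round(offset[1]))
    out = np.zeros_like(img)
    h, w = img.shape
    out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        img[max(-dy, 0):h - max(dy, 0), max(-dx, 0):w - max(dx, 0)]
    return out

def difference_image(bright, dark, unlit, offset_bright, offset_dark):
    """Generate a pupil-enhanced difference image from a bright pupil image, a dark
    pupil image, and the non-illuminated image acquired between them.

    offset_bright / offset_dark: estimated displacement (dx, dy) of the corneal
    reflection between each pupil image and the non-illuminated image; each pupil
    image is shifted by its offset before subtraction so the eye region is aligned.
    """
    diff_bright = np.clip(shift(bright, offset_bright).astype(int) - unlit, 0, 255)
    diff_dark = np.clip(shift(dark, offset_dark).astype(int) - unlit, 0, 255)
    # The pupil is bright only in diff_bright, so this subtraction leaves mainly
    # the pupil region.
    return np.clip(diff_bright - diff_dark, 0, 255).astype(np.uint8)
```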
X_{3(n+1)} = X_{3n+2} + (1/2)(X_{3n+2} - X_{3n}) …(9)
X_{3(n+1)+1} = X_{3n+2} + (X_{3n+2} - X_{3n}) …(10)
X_{3(n+1)+2} = X_{3(n+1)} + 2(X_{3(n+1)} - X_{3n+2}) …(11)
In the first and second embodiments the constant velocity model was applied to coordinates on the image, that is, to two-dimensional coordinates, but the calculation unit 22 may apply the constant velocity model to three-dimensional coordinates. By estimating the corneal reflection position with the constant velocity model applied to three-dimensional coordinates, the corneal reflection position can be estimated accurately even when the pupil moves in a direction crossing the imaging plane.
(Reference 1) International Publication No. WO 2012/077713 pamphlet
(Reference 2) International Publication No. WO 2012/020760 pamphlet
X′_{4n+3} = X′_{4n+2} + (1/2)(X′_{4n+2} - X′_{4n}) …(15)
X_{4n+3} = (x·(f/z), y·(f/z)) …(16)
The calculation unit 22 performs this conversion to two-dimensional coordinates for each of the image obtained from the left camera 10L and the image obtained from the right camera 10R.
X′_{4(n+1)} = X′_{4n+2} + (X′_{4n+2} - X′_{4n}) …(17)
X′_{4(n+1)+1} = X′_{4(n+1)} + (1/2)(X′_{4(n+1)} - X′_{4n+2}) …(18)
X′_{4(n+1)+2} = X′_{4(n+1)} + (X′_{4(n+1)} - X′_{4n+2}) …(19)
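A minimal sketch of this 3D constant velocity extrapolation followed by projection back onto the image plane, in the spirit of equations (15) and (16), is shown below; the positions, units, focal length, and names are illustrative assumptions.

```python
import numpy as np

def project_to_image(p, focal_length):
    """Pinhole projection of a 3D camera-coordinate point (x, y, z) onto the image
    plane, as in equation (16): (x*(f/z), y*(f/z))."""
    x, y, z = p
    return np.array([x * focal_length / z, y * focal_length / z])

def estimate_reflection_2d(p_bright, p_dark, factor, focal_length):
    """Extrapolate the 3D reference position with the constant velocity model
    (factor = 1/2 for the non-illuminated frame immediately after the dark pupil
    image, cf. equation (15)) and project the result back to 2D."""
    p_est = np.asarray(p_dark, float) + factor * (np.asarray(p_dark, float) - np.asarray(p_bright, float))
    return project_to_image(p_est, focal_length)

# Hypothetical values: positions in millimeters, focal length in pixels.
print(estimate_reflection_2d((10.0, -5.0, 600.0), (12.0, -5.5, 595.0),
                             0.5, focal_length=1200.0))
```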
In the first and second embodiments the calculation unit 22 estimates the corneal reflection position using the constant velocity model, but the calculation unit 22 may instead perform this estimation using a constant acceleration model. The calculation unit 22 uses the corneal reflection positions in the pupil images (bright pupil images or dark pupil images) obtained one, two, and three frames before the non-illuminated image whose corneal reflection position is to be estimated. Let the two-dimensional vectors of the corneal reflection position in the current image, the previous image, the image two frames before, and the image three frames before be denoted by x_0, x_{-1}, x_{-2}, and x_{-3}, respectively, and let the times at which these vectors were obtained be denoted by t_0, t_{-1}, t_{-2}, and t_{-3}. Then the current velocity v_0 is obtained from the coordinates in the current image and the previous image, the previous velocity v_{-1} from the coordinates in the previous image and the image two frames before, and the velocity v_{-2} from the coordinates in the images two and three frames before. These velocities are given by equations (20) to (22) below.
v_0 = (x_0 - x_{-1}) / (t_0 - t_{-1}) …(20)
v_{-1} = (x_{-1} - x_{-2}) / (t_{-1} - t_{-2}) …(21)
v_{-2} = (x_{-2} - x_{-3}) / (t_{-2} - t_{-3}) …(22)
Here, t_0 to t_{-3} are the acquisition times of the respective images.
a_0 = (v_0 - v_{-1}) / (t_0 - t_{-1}) …(23)
a_{-1} = (v_{-1} - v_{-2}) / (t_{-1} - t_{-2}) …(24)
where v_{4(n+1)} = X_{4(n+1)} - X_{4n+2} and v_{4n+2} = X_{4n+2} - X_{4n}.
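The following is a minimal sketch of a constant acceleration estimate built from the velocities and acceleration of equations (20) to (23). The patent's exact combination of these quantities into the final estimate is not reproduced here; a standard second-order kinematic update x_0 + v_0·Δt + (1/2)a_0·Δt² is used for illustration, and the coordinates and times are hypothetical.

```python
import numpy as np

def estimate_constant_acceleration(xs, ts, t_target):
    """Estimate the corneal reflection position at t_target with a constant
    acceleration model, using the three most recent reference positions.

    xs: [x_-2, x_-1, x_0] 2D reference positions (oldest first)
    ts: [t_-2, t_-1, t_0] their acquisition times
    """
    x_m2, x_m1, x_0 = (np.asarray(x, dtype=float) for x in xs)
    t_m2, t_m1, t_0 = ts
    v_0 = (x_0 - x_m1) / (t_0 - t_m1)      # cf. equation (20)
    v_m1 = (x_m1 - x_m2) / (t_m1 - t_m2)   # cf. equation (21)
    a_0 = (v_0 - v_m1) / (t_0 - t_m1)      # cf. equation (23)
    dt = t_target - t_0
    return x_0 + v_0 * dt + 0.5 * a_0 * dt ** 2

# Hypothetical coordinates and acquisition times in milliseconds.
print(estimate_constant_acceleration(
    [(118.0, 84.0), (121.0, 84.8), (125.0, 86.0)],
    [0.0, 4.2, 8.4],
    12.6,
))
```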
The calculation unit 22 may detect an approximate line of sight from the relative positional relationship between the pupil center coordinates and the corneal reflection position obtained by the above method. In this case, the detection system 1 also functions as a line-of-sight detection system. A method for performing this line-of-sight detection rigorously is described, for example, in References 1 and 2 above or in Reference 3 below.
(Reference 3) Japanese Patent Application Laid-Open No. 2005-185431
The calculation unit 22 may detect the face posture using the pupil center coordinates obtained by the above method. In this case, the detection system 1 also functions as a face posture detection system. The face posture detection method itself is described, for example, in Reference 4 below.
(Reference 4) Japanese Patent Application Laid-Open No. 2007-271554
If the subject A closes the eyes while the pupil is being tracked, the pupil and the corneal reflection do not appear in the image, so the calculation unit 22 cannot detect the pupil and fails to track it. In this case, the calculation unit 22 generates a difference image from a pair of a bright pupil image and a dark pupil image without using a non-illuminated image and without performing position correction on the pair, and attempts pupil detection from that difference image. The calculation unit 22 repeats this attempt until pupil detection succeeds. When pupil detection succeeds, it resumes the processing of any one of the above embodiments, that is, pupil detection including estimation of the corneal reflection position using a non-illuminated image.
Note that when a Kalman filter is used in place of the constant velocity model, a linear Kalman filter may be used, and when a Kalman filter is used in place of the constant acceleration model, an extended Kalman filter may be used.
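As an illustration of the first of these, the following is a minimal sketch of a linear Kalman filter with a constant velocity state, as one way such a filter could be structured. The state layout, noise parameters, update order, and all numeric values are assumptions for illustration, not a specification from the patent.

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal linear Kalman filter with a constant velocity state model, usable
    as a drop-in replacement for the constant velocity extrapolation.
    State: [x, y, vx, vy]; measurement: observed corneal reflection (x, y)."""

    def __init__(self, x0, process_var=1.0, meas_var=2.0):
        self.state = np.array([x0[0], x0[1], 0.0, 0.0])
        self.P = np.eye(4) * 100.0        # initial state covariance
        self.q = process_var              # process noise scale
        self.R = np.eye(2) * meas_var     # measurement noise covariance
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)

    def predict(self, dt):
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        self.state = F @ self.state
        self.P = F @ self.P @ F.T + np.eye(4) * self.q * dt
        return self.state[:2]             # predicted reflection position

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.state   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.state = self.state + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# Usage sketch: update with positions measured in pupil images, then predict
# the position in the non-illuminated frame (hypothetical coordinates, times in ms).
kf = ConstantVelocityKalman((120.0, 85.0))
kf.predict(4.2); kf.update((122.0, 85.5))   # bright pupil image
kf.predict(4.2); kf.update((124.0, 86.0))   # dark pupil image
print(kf.predict(4.2))                      # estimate for the non-illuminated frame
```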
Claims (17)
- A corneal reflection position estimation system comprising: an image acquisition unit that controls a camera provided with a light source to continuously acquire images of a subject's eye, the image acquisition unit acquiring a plurality of pupil images photographed using light from the light source and then acquiring one non-illuminated image photographed without using light from the light source; and an estimation unit that calculates, as a reference position, the position of the corneal reflection or the corneal sphere center corresponding to each of the plurality of pupil images, calculates a movement vector of the corneal reflection or the corneal sphere center based on the plurality of reference positions, and estimates the corneal reflection position in the non-illuminated image based at least on the movement vector.
- The corneal reflection position estimation system according to claim 1, wherein the estimation unit estimates the corneal reflection position in the non-illuminated image based on the movement vector and the individual time intervals at which the image acquisition unit acquires the individual images.
- The corneal reflection position estimation system according to claim 1 or 2, wherein the plurality of pupil images are two pupil images, and the estimation unit calculates the reference position in each of the two pupil images, calculates the movement vector based on the two reference positions, and estimates the corneal reflection position in the non-illuminated image based at least on the movement vector using a constant velocity model or a linear Kalman filter.
- The corneal reflection position estimation system according to claim 1 or 2, wherein the plurality of pupil images are three pupil images, and the estimation unit calculates the reference position in each of the three pupil images, calculates two movement vectors based on the three reference positions, and estimates the corneal reflection position in the non-illuminated image based at least on the two movement vectors using a constant acceleration model or an extended Kalman filter.
- The corneal reflection position estimation system according to any one of claims 1 to 4, wherein the image acquisition unit acquires, after acquiring the non-illuminated image, one further pupil image photographed using light from the light source, and the estimation unit estimates the corneal reflection position in the further pupil image based at least on the movement vector.
- A pupil detection system comprising a pupil calculation unit that calculates a pupil position based on the corneal reflection position in the non-illuminated image estimated by the corneal reflection position estimation system according to claim 5, wherein the plurality of pupil images include one of a bright pupil image and a dark pupil image, the further pupil image is the other of the bright pupil image and the dark pupil image, and the pupil calculation unit generates a difference bright pupil image from the bright pupil image and the non-illuminated image and a difference dark pupil image from the dark pupil image and the non-illuminated image based on the corneal reflection position in the non-illuminated image, generates a difference image from the difference bright pupil image and the difference dark pupil image, and calculates the pupil position based on the difference image.
- The pupil detection system according to claim 6, wherein, when the pupil calculation unit fails to calculate the pupil position based on the corneal reflection position in the non-illuminated image, the pupil calculation unit generates a new difference image from a newly acquired bright pupil image and dark pupil image without using a non-illuminated image and calculates the pupil position based on the new difference image, and when the pupil position is obtained based on the new difference image, the processing by the image acquisition unit and the estimation unit is executed and the pupil calculation unit calculates the pupil position based on the corneal reflection position in a new non-illuminated image estimated based on that processing.
- A line-of-sight detection system that detects a line of sight based on the pupil position calculated by the pupil detection system according to claim 6 or 7.
- A face posture detection system that detects a face posture based on the pupil position calculated by the pupil detection system according to claim 6 or 7 and a nostril position calculated with reference to the bright pupil image or the dark pupil image.
- A corneal reflection position estimation method executed by a corneal reflection position estimation system comprising a processor, the method comprising: an image acquisition step of controlling a camera provided with a light source to continuously acquire images of a subject's eye, in which a plurality of pupil images photographed using light from the light source are acquired and then one non-illuminated image photographed without using light from the light source is acquired; and an estimation step of calculating, as a reference position, the position of the corneal reflection or the corneal sphere center corresponding to each of the plurality of pupil images, calculating a movement vector of the corneal reflection or the corneal sphere center based on the plurality of reference positions, and estimating the corneal reflection position in the non-illuminated image based on the movement vector.
- A corneal reflection position estimation program for causing a computer to function as: an image acquisition unit that controls a camera provided with a light source to continuously acquire images of a subject's eye, the image acquisition unit acquiring a plurality of pupil images photographed using light from the light source and then acquiring one non-illuminated image photographed without using light from the light source; and an estimation unit that calculates, as a reference position, the position of the corneal reflection or the corneal sphere center corresponding to each of the plurality of pupil images, calculates a movement vector of the corneal reflection or the corneal sphere center based on the plurality of reference positions, and estimates the corneal reflection position in the non-illuminated image based at least on the movement vector.
- A pupil detection method executed by a computer system comprising a processor, the method comprising a pupil calculation step of calculating a pupil position based on the corneal reflection position in the non-illuminated image estimated by the corneal reflection position estimation method according to claim 10, wherein in the image acquisition step, after the non-illuminated image is acquired, one further pupil image photographed using light from the light source is acquired; in the estimation step, the corneal reflection position in the further pupil image is estimated based at least on the movement vector; the pupil image obtained immediately before the non-illuminated image among the plurality of pupil images includes one of a bright pupil image and a dark pupil image, and the further pupil image is the other of the bright pupil image and the dark pupil image; and in the pupil calculation step, generation of a difference bright pupil image from the bright pupil image and the non-illuminated image and generation of a difference dark pupil image from the dark pupil image and the non-illuminated image are executed based on the corneal reflection position in the non-illuminated image, a difference image is generated from the difference bright pupil image and the difference dark pupil image, and the pupil position is calculated based on the difference image.
- A pupil detection program for causing a computer to function as the pupil detection system according to claim 6 or 7.
- A line-of-sight detection method for detecting a line of sight based on the pupil position calculated by the pupil detection method according to claim 12.
- A face posture detection method for detecting a face posture based on the pupil position calculated by the pupil detection method according to claim 12 and a nostril position calculated with reference to the bright pupil image or the dark pupil image.
- A line-of-sight detection program for causing a computer to function as the line-of-sight detection system according to claim 8.
- A face posture detection program for causing a computer to function as the face posture detection system according to claim 9.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016543883A JP6583734B2 (ja) | 2014-08-22 | 2015-07-27 | 角膜反射位置推定システム、角膜反射位置推定方法、角膜反射位置推定プログラム、瞳孔検出システム、瞳孔検出方法、瞳孔検出プログラム、視線検出システム、視線検出方法、視線検出プログラム、顔姿勢検出システム、顔姿勢検出方法、および顔姿勢検出プログラム |
EP15833722.0A EP3185211B1 (en) | 2014-08-22 | 2015-07-27 | Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program |
US15/505,781 US10417782B2 (en) | 2014-08-22 | 2015-07-27 | Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014169508 | 2014-08-22 | ||
JP2014-169508 | 2014-08-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016027627A1 true WO2016027627A1 (ja) | 2016-02-25 |
Family
ID=55350573
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/071270 WO2016027627A1 (ja) | 2014-08-22 | 2015-07-27 | 角膜反射位置推定システム、角膜反射位置推定方法、角膜反射位置推定プログラム、瞳孔検出システム、瞳孔検出方法、瞳孔検出プログラム、視線検出システム、視線検出方法、視線検出プログラム、顔姿勢検出システム、顔姿勢検出方法、および顔姿勢検出プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10417782B2 (ja) |
EP (1) | EP3185211B1 (ja) |
JP (1) | JP6583734B2 (ja) |
WO (1) | WO2016027627A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017154370A1 (ja) * | 2016-03-09 | 2017-09-14 | アルプス電気株式会社 | 視線検出装置および視線検出方法 |
EP3459436A1 (en) * | 2017-09-22 | 2019-03-27 | Smart Eye AB | Image acquisition with reflex reduction |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102466996B1 (ko) * | 2016-01-06 | 2022-11-14 | 삼성전자주식회사 | 눈 위치 예측 방법 및 장치 |
CN107357429B (zh) * | 2017-07-10 | 2020-04-07 | 京东方科技集团股份有限公司 | 用于确定视线的方法、设备和计算机可读存储介质 |
JP2019046141A (ja) * | 2017-09-01 | 2019-03-22 | アルパイン株式会社 | 運転者監視装置、運転者監視方法、及びプログラム |
CN108537111A (zh) * | 2018-02-26 | 2018-09-14 | 阿里巴巴集团控股有限公司 | 一种活体检测的方法、装置及设备 |
SE543240C2 (en) | 2018-12-21 | 2020-10-27 | Tobii Ab | Classification of glints using an eye tracking system |
CN112099615B (zh) * | 2019-06-17 | 2024-02-09 | 北京七鑫易维科技有限公司 | 注视信息确定方法、装置、眼球追踪设备及存储介质 |
SE543455C2 (en) * | 2019-06-28 | 2021-02-23 | Tobii Ab | Method and system for 3d cornea position estimation |
CN112464012B (zh) * | 2020-10-31 | 2022-06-17 | 浙江工业大学 | 可自动筛选照片的景区自动拍照系统及景区自动拍照方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008029702A (ja) * | 2006-07-31 | 2008-02-14 | National Univ Corp Shizuoka Univ | 瞳孔を検出する方法及び装置 |
WO2008120321A1 (ja) * | 2007-03-28 | 2008-10-09 | Fujitsu Limited | 画像処理装置、画像処理方法、画像処理プログラム |
JP2008246004A (ja) * | 2007-03-30 | 2008-10-16 | National Univ Corp Shizuoka Univ | 瞳孔検出方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4517049B2 (ja) | 2003-12-25 | 2010-08-04 | 国立大学法人静岡大学 | 視線検出方法および視線検出装置 |
JP4431749B2 (ja) | 2006-03-31 | 2010-03-17 | 国立大学法人静岡大学 | 顔姿勢検出方法 |
US8371693B2 (en) * | 2010-03-30 | 2013-02-12 | National University Corporation Shizuoka University | Autism diagnosis support apparatus |
JP5915981B2 (ja) * | 2010-08-09 | 2016-05-11 | 国立大学法人静岡大学 | 注視点検出方法及び注視点検出装置 |
JP5644342B2 (ja) * | 2010-10-05 | 2014-12-24 | トヨタ自動車株式会社 | 多気筒内燃機関の制御装置 |
EP2649932A4 (en) | 2010-12-08 | 2017-06-14 | National University Corporation Shizuoka University | Method for detecting point of gaze and device for detecting point of gaze |
CN104780834B (zh) * | 2012-11-12 | 2016-12-28 | 阿尔卑斯电气株式会社 | 生物体信息计测装置及使用该装置的输入装置 |
-
2015
- 2015-07-27 WO PCT/JP2015/071270 patent/WO2016027627A1/ja active Application Filing
- 2015-07-27 US US15/505,781 patent/US10417782B2/en not_active Expired - Fee Related
- 2015-07-27 EP EP15833722.0A patent/EP3185211B1/en active Active
- 2015-07-27 JP JP2016543883A patent/JP6583734B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008029702A (ja) * | 2006-07-31 | 2008-02-14 | National Univ Corp Shizuoka Univ | 瞳孔を検出する方法及び装置 |
WO2008120321A1 (ja) * | 2007-03-28 | 2008-10-09 | Fujitsu Limited | 画像処理装置、画像処理方法、画像処理プログラム |
JP2008246004A (ja) * | 2007-03-30 | 2008-10-16 | National Univ Corp Shizuoka Univ | 瞳孔検出方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3185211A4 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017154370A1 (ja) * | 2016-03-09 | 2017-09-14 | アルプス電気株式会社 | 視線検出装置および視線検出方法 |
EP3459436A1 (en) * | 2017-09-22 | 2019-03-27 | Smart Eye AB | Image acquisition with reflex reduction |
WO2019057766A1 (en) * | 2017-09-22 | 2019-03-28 | Smart Eye Ab | IMAGE ACQUISITION WITH REDUCTION OF REFLECTIONS |
US11653832B2 (en) | 2017-09-22 | 2023-05-23 | Smart Eye Ab | Image acquisition with reflex reduction |
Also Published As
Publication number | Publication date |
---|---|
EP3185211A4 (en) | 2018-01-24 |
US20170278269A1 (en) | 2017-09-28 |
JPWO2016027627A1 (ja) | 2017-06-08 |
EP3185211A1 (en) | 2017-06-28 |
EP3185211B1 (en) | 2021-06-02 |
JP6583734B2 (ja) | 2019-10-02 |
US10417782B2 (en) | 2019-09-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6583734B2 (ja) | 角膜反射位置推定システム、角膜反射位置推定方法、角膜反射位置推定プログラム、瞳孔検出システム、瞳孔検出方法、瞳孔検出プログラム、視線検出システム、視線検出方法、視線検出プログラム、顔姿勢検出システム、顔姿勢検出方法、および顔姿勢検出プログラム | |
JP7076368B2 (ja) | レンジゲート式デプスカメラ部品 | |
EP3589978B1 (en) | Multi-spectrum illumination-and-sensor module for head tracking, gesture recognition and spatial mapping | |
JP6695503B2 (ja) | 車両の運転者の状態を監視するための方法及びシステム | |
JP6377863B2 (ja) | 反射マップ表現による奥行きマップ表現の増補 | |
JP5467303B1 (ja) | 注視点検出装置、注視点検出方法、個人パラメータ算出装置、個人パラメータ算出方法、プログラム、及びコンピュータ読み取り可能な記録媒体 | |
JP6548171B2 (ja) | 瞳孔検出システム、視線検出システム、瞳孔検出方法、および瞳孔検出プログラム | |
JP2018522348A (ja) | センサーの3次元姿勢を推定する方法及びシステム | |
JP2018511098A (ja) | 複合現実システム | |
US10552675B2 (en) | Method and apparatus for eye detection from glints | |
JP6452235B2 (ja) | 顔検出方法、顔検出装置、及び顔検出プログラム | |
JP5001930B2 (ja) | 動作認識装置及び方法 | |
US11624907B2 (en) | Method and device for eye metric acquisition | |
US10475415B1 (en) | Strobe tracking of head-mounted displays (HMDs) in virtual, augmented, and mixed reality (xR) applications | |
US20200372673A1 (en) | Resolving region-of-interest (roi) overlaps for distributed simultaneous localization and mapping (slam) in edge cloud architectures | |
EP3042341A1 (en) | Method and apparatus for eye detection from glints | |
JP6288770B2 (ja) | 顔検出方法、顔検出システム、および顔検出プログラム | |
JP2017138645A (ja) | 視線検出装置 | |
JP6468755B2 (ja) | 特徴点検出システム、特徴点検出方法、および特徴点検出プログラム | |
JP6169446B2 (ja) | 視線検出装置 | |
JP7269617B2 (ja) | 顔画像処理装置、画像観察システム、及び瞳孔検出システム | |
Fujiyoshi et al. | Inside-out camera for acquiring 3D gaze points | |
US20230122185A1 (en) | Determining relative position and orientation of cameras using hardware | |
JP2022131345A (ja) | 瞳孔検出装置及び瞳孔検出方法 | |
JP2006209342A (ja) | 画像処理装置及び画像処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15833722 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016543883 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15505781 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2015833722 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015833722 Country of ref document: EP |